Sample records for term model evaluations

  1. Source term model evaluations for the low-level waste facility performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Yim, M.S.; Su, S.I. [North Carolina State Univ., Raleigh, NC (United States)]


    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  2. Using Reactive Transport Modeling to Evaluate the Source Term at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Y. Chen


    The conventional approach of source-term evaluation for performance assessment of nuclear waste repositories uses speciation-solubility modeling tools and assumes that pure phases of radioelements control their solubility. This assumption may not reflect reality, as most radioelements (except for U) may not form their own pure phases. As a result, solubility limits predicted using the conventional approach are several orders of magnitude higher than the concentrations of radioelements measured in spent fuel dissolution experiments. This paper presents the author's attempt to use a non-conventional approach to evaluate the source term of radionuclide release for Yucca Mountain. Based on the general reactive-transport code AREST-CT, a model for spent fuel dissolution and secondary phase precipitation has been constructed. The model accounts for both equilibrium and kinetic reactions. Its predictions have been compared against laboratory experiments and natural analogues. It is found that, without calibration, the simulated results match laboratory and field observations very well in many aspects. More important is the fact that no contradictions between them have been found. This provides confidence in the predictive power of the model. Based on the concept of Np incorporation into uranyl minerals, the model not only predicts a lower Np source term than that given by conventional Np solubility models, but also produces results which are consistent with laboratory measurements and observations. Moreover, two hypotheses, concerning whether or not Np enters tertiary uranyl minerals, have been tested by comparing model predictions against laboratory observations; the results favor the former. It is concluded that this non-conventional approach to source-term evaluation not only eliminates the over-conservatism of the conventional solubility approach to some extent, but also gives a realistic representation of the system of interest, which is a prerequisite for truly understanding the long-term

  3. Standardizing the performance evaluation of short-term wind prediction models

    DEFF Research Database (Denmark)

    Madsen, Henrik; Pinson, Pierre; Kariniotakis, G.


    evaluation of model performance. This paper proposes a standardized protocol for the evaluation of short-term wind-power prediction systems. A number of reference prediction models are also described, and their use for performance comparison is analysed. The use of the protocol is demonstrated using results...... from both on-shore and off-shore wind farms. The work was developed in the frame of the Anemos project (EU R&D project), where the protocol has been used to evaluate more than 10 prediction systems....
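A core element of such an evaluation protocol is comparing each candidate forecaster against a trivial reference model. A minimal sketch in Python, assuming persistence as the reference and a capacity-normalized mean absolute error criterion (the farm capacity and power series here are synthetic, not Anemos project data):

```python
import numpy as np

def persistence_forecast(power, horizon):
    """Naive reference model: the forecast at t+h equals the last observed value."""
    return power[:-horizon]

def nmae(pred, obs, capacity):
    """Normalized mean absolute error, expressed relative to installed capacity."""
    return np.mean(np.abs(pred - obs)) / capacity

# Synthetic hourly power series for a hypothetical 10 MW farm.
rng = np.random.default_rng(0)
power = np.clip(5 + np.cumsum(rng.normal(0, 0.2, 500)), 0, 10)

horizon = 6  # evaluate 6-hour-ahead forecasts
obs = power[horizon:]
ref_error = nmae(persistence_forecast(power, horizon), obs, capacity=10.0)
```

Any candidate prediction system that cannot beat `ref_error` at a given horizon adds no value over the naive baseline, which is the point of standardizing reference models in the protocol.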

  4. Population Pharmacokinetics of Intravenous Paracetamol (Acetaminophen) in Preterm and Term Neonates: Model Development and External Evaluation. (United States)

    Cook, Sarah F; Roberts, Jessica K; Samiee-Zafarghandy, Samira; Stockmann, Chris; King, Amber D; Deutsch, Nina; Williams, Elaine F; Allegaert, Karel; Wilkins, Diana G; Sherwin, Catherine M T; van den Anker, John N


    The aims of this study were to develop a population pharmacokinetic model for intravenous paracetamol in preterm and term neonates and to assess the generalizability of the model by testing its predictive performance in an external dataset. Nonlinear mixed-effects models were constructed from paracetamol concentration-time data in NONMEM 7.2. Potential covariates included body weight, gestational age, postnatal age, postmenstrual age, sex, race, total bilirubin, and estimated glomerular filtration rate. An external dataset was used to test the predictive performance of the model through calculation of bias, precision, and normalized prediction distribution errors. The model-building dataset included 260 observations from 35 neonates with a mean gestational age of 33.6 weeks [standard deviation (SD) 6.6]. Data were well-described by a one-compartment model with first-order elimination. Weight predicted paracetamol clearance and volume of distribution, which were estimated as 0.348 L/h (5.5 % relative standard error; 30.8 % coefficient of variation) and 2.46 L (3.5 % relative standard error; 14.3 % coefficient of variation), respectively, at the mean subject weight of 2.30 kg. An external evaluation was performed on an independent dataset that included 436 observations from 60 neonates with a mean gestational age of 35.6 weeks (SD 4.3). The median prediction error was 10.1 % [95 % confidence interval (CI) 6.1-14.3] and the median absolute prediction error was 25.3 % (95 % CI 23.1-28.1). Weight predicted intravenous paracetamol pharmacokinetics in neonates ranging from extreme preterm to full-term gestational status. External evaluation suggested that these findings should be generalizable to other similar patient populations.
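The structural model can be sketched as a weight-scaled one-compartment system with first-order elimination. The reference clearance and volume below are the abstract's estimates at 2.30 kg, but the allometric exponents and the dose are illustrative assumptions, not the paper's fitted values:

```python
import math

def concentration(dose_mg, t_h, weight_kg,
                  cl_ref=0.348, v_ref=2.46, wt_ref=2.30,
                  cl_exp=0.75, v_exp=1.0):
    """One-compartment IV bolus model with first-order elimination.

    CL and V scale with body weight; cl_ref (L/h) and v_ref (L) at 2.30 kg
    come from the abstract, while the exponents 0.75/1.0 are standard
    allometric assumptions for illustration only.
    """
    cl = cl_ref * (weight_kg / wt_ref) ** cl_exp
    v = v_ref * (weight_kg / wt_ref) ** v_exp
    k = cl / v  # first-order elimination rate constant (1/h)
    return (dose_mg / v) * math.exp(-k * t_h)

c0 = concentration(dose_mg=15, t_h=0, weight_kg=2.30)  # initial concentration
c6 = concentration(dose_mg=15, t_h=6, weight_kg=2.30)  # after 6 h
```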

  5. Evaluating Clouds in Long-Term Cloud-Resolving Model Simulations with Observational Data (United States)

    Zeng, Xiping; Tao, Wei-Kuo; Zhang, Minghua; Peters-Lidard, Christa; Lang, Stephen; Simpson, Joanne; Kumar, Sujay; Xie, Shaocheng; Eastman, Joseph L.; Shie, Chung-Lin; hide


    Two 20-day, continental midlatitude cases are simulated with a three-dimensional (3D) cloud-resolving model (CRM) and compared to Atmospheric Radiation Measurement (ARM) data. This evaluation of long-term cloud-resolving model simulations focuses on the evaluation of clouds and surface fluxes. All numerical experiments, as compared to observations, simulate surface precipitation well but over-predict clouds, especially in the upper troposphere. The sensitivity of cloud properties to dimensionality and other factors is studied to isolate the origins of the over-prediction of clouds. Due to the difference in buoyancy damping between 2D and 3D models, surface precipitation fluctuates rapidly with time, and spurious dehumidification occurs near the tropopause in the 2D CRM. Surface fluxes from a land data assimilation system are compared with ARM observations. They are used in place of the ARM surface fluxes to test the sensitivity of simulated clouds to surface fluxes. Summertime simulations show that surface fluxes from the assimilation system bring about a better simulation of diurnal cloud variation in the lower troposphere.

  6. Empirical evaluation of the conceptual model underpinning a regional aquatic long-term monitoring program using causal modelling (United States)

    Irvine, Kathryn M.; Miller, Scott; Al-Chokhachy, Robert K.; Archer, Erik; Roper, Brett B.; Kershner, Jeffrey L.


    Conceptual models are an integral facet of long-term monitoring programs. Proposed linkages between drivers, stressors, and ecological indicators are identified within the conceptual model of most mandated programs. We empirically evaluate a conceptual model developed for a regional aquatic and riparian monitoring program using causal models (i.e., Bayesian path analysis). We assess whether data gathered for regional status and trend estimation can also provide insights on why a stream may deviate from reference conditions. We target the hypothesized causal pathways for how the anthropogenic drivers of road density, percent grazing, and percent forest within a catchment affect instream biological condition. We found that instream temperature and fine sediments in arid sites, and only fine sediments in mesic sites, accounted for a significant portion of the maximum possible variation explainable in biological condition among managed sites. However, the biological significance of the direct effects of anthropogenic drivers on instream temperature and fine sediments was minimal or not detected. Consequently, there was weak to no biological support for causal pathways relating anthropogenic drivers to biological condition. With weak biological and statistical effect sizes, ignoring environmental contextual variables and covariates that explain natural heterogeneity would have resulted in no evidence of human impacts on biological integrity in some instances. For programs targeting the effects of anthropogenic activities, it is imperative to identify both the land use practices and the mechanisms that have led to degraded conditions (i.e., moving beyond simple status and trend estimation). Our empirical evaluation of the conceptual model underpinning the long-term monitoring program provided an opportunity for learning and, consequently, we discuss survey design elements that require modification to achieve question driven monitoring, a necessary step in the practice of

  7. Evaluation of long-term ozone simulations from seven regional air quality models and their ensemble

    NARCIS (Netherlands)

    Loon, van M.; Vautard, R.; Schaap, M.; Bergström, R.; Bessagnet, B.; Brandt, J.; Krol, M.C.


    Long-term ozone simulations from seven regional air quality models, the Unified EMEP model, LOTOS-EUROS, CHIMERE, RCG, MATCH, DEHM and TM5, are intercompared and compared to ozone measurements within the framework of the EuroDelta experiment, designed to assess air quality improvement at the

  8. Modeling the Interest Rate Term Structure: Derivatives Contracts Dynamics and Evaluation

    Directory of Open Access Journals (Sweden)

    Pedro L. Valls Pereira


    Full Text Available This article deals with a model for the term structure of interest rates and the valuation of derivative contracts directly dependent on it. The work is of a theoretical nature and deals exclusively with continuous-time models, making ample use of stochastic calculus results, and presents original contributions that we consider relevant to the development of fixed-income market modeling. We develop a new multifactorial model of the term structure of interest rates. The model is based on the decomposition of the yield curve into the factors level, slope, and curvature, and on the treatment of their collective dynamics. We show that this model may be applied to serve various objectives: analysis of bond price dynamics, valuation of derivative contracts, and also market risk management and the formulation of operational strategies, which is presented in another article.
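A static illustration of the level/slope/curvature decomposition, using Nelson-Siegel-style factor loadings fitted by least squares (the decay parameter and the yields below are invented for the sketch; the article's own model is continuous-time and richer than this):

```python
import numpy as np

def ns_loadings(maturities, lam=0.7):
    """Nelson-Siegel-style loadings for level, slope and curvature factors.

    lam (the decay parameter) is an illustrative choice, not taken from
    the article.
    """
    m = np.asarray(maturities, dtype=float)
    x = lam * m
    slope = (1 - np.exp(-x)) / x
    curv = slope - np.exp(-x)
    return np.column_stack([np.ones_like(m), slope, curv])

maturities = np.array([0.25, 1, 2, 5, 10])          # years
yields = np.array([0.020, 0.023, 0.026, 0.030, 0.032])
X = ns_loadings(maturities)
factors, *_ = np.linalg.lstsq(X, yields, rcond=None)  # level, slope, curvature
fitted = X @ factors
```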

  9. Short Term Evaluation of an Anatomically Shaped Polycarbonate Urethane Total Meniscus Replacement in a Goat Model

    NARCIS (Netherlands)

    Vrancken, A.C.T.; Madej, W.; Hannink, G.; Verdonschot, N.J.; Tienen, T.G. van; Buma, P.


    PURPOSE: Since the treatment options for symptomatic total meniscectomy patients are still limited, an anatomically shaped, polycarbonate urethane (PCU), total meniscus replacement was developed. This study evaluates the in vivo performance of the implant in a goat model, with a specific focus on

  10. Short term evaluation of an anatomically shaped polycarbonate urethane total meniscus replacement in a goat model

    NARCIS (Netherlands)

    Vrancken, A.C.T.; Madej, W.; Hannink, G.; Verdonschot, Nicolaas Jacobus Joseph; van Tienen, T.G.; Buma, P.


    Purpose: Since the treatment options for symptomatic total meniscectomy patients are still limited, an anatomically shaped, polycarbonate urethane (PCU), total meniscus replacement was developed. This study evaluates the in vivo performance of the implant in a goat model, with a specific focus on

  11. Dynamic term structure models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller; Meldrum, Andrew

    This paper studies whether dynamic term structure models for US nominal bond yields should enforce the zero lower bound by a quadratic policy rate or a shadow rate specification. We address the question by estimating quadratic term structure models (QTSMs) and shadow rate models with at most four...

  12. Short Term Evaluation of an Anatomically Shaped Polycarbonate Urethane Total Meniscus Replacement in a Goat Model.

    Directory of Open Access Journals (Sweden)

    A C T Vrancken

    Full Text Available Since the treatment options for symptomatic total meniscectomy patients are still limited, an anatomically shaped, polycarbonate urethane (PCU) total meniscus replacement was developed. This study evaluates the in vivo performance of the implant in a goat model, with a specific focus on the implant location in the joint, the geometrical integrity of the implant and the effect of the implant on the histopathological condition of the synovial membrane and articular cartilage. The right medial meniscus of seven Saanen goats was replaced by the implant. Sham surgery (transection of the MCL, arthrotomy and MCL suturing) was performed in six animals. The contralateral knee joints of both groups served as control groups. After three months of follow-up, the following aspects of implant performance were evaluated: implant position, implant deformation and the histopathological condition of the synovium and cartilage. Implant geometry was well maintained during the three-month implantation period. No signs of PCU wear were found and the implant did not induce an inflammatory response in the knee joint. In all animals, implant fixation was compromised due to suture breakage, wear or elongation, likely causing the increase in extrusion observed in the implant group. Both the femoral cartilage and the tibial cartilage in direct contact with the implant showed increased damage compared to the sham and sham-control groups. This study demonstrates that the novel, anatomically shaped PCU total meniscal replacement is biocompatible and resistant to three months of physiological loading. Failure of the fixation sutures may have increased implant mobility, which probably induced implant extrusion and potentially stimulated cartilage degeneration. Evidently, redesigning the fixation method is necessary. Future animal studies should evaluate the improved fixation method and compare implant performance to current treatment standards, such as allografts.

  13. Identifying the Needs of Pre-Service Classroom Teachers about Science Teaching Methodology Courses in Terms of Parlett's Illuminative Program Evaluation Model (United States)

    Çaliskan, Ilke


    The aim of this study was to identify the needs of third grade classroom teaching students about science teaching course in terms of Parlett's Illuminative program evaluation model. Phenomographic research design was used in this study. Illuminative program evaluation model was chosen for this study in terms of its eclectic and process-based…

  14. Identifying the Needs of Pre-Service Classroom Teachers about Science Teaching Methodology Course in Terms of Parlett's Illuminative Program Evaluation Model (United States)

    Çaliskan, Ilke


    The aim of this study was to identify the needs of third grade classroom teaching students about science teaching course in terms of Parlett's Illuminative program evaluation model. Phenomographic research design was used in this study. Illuminative program evaluation model was chosen for this study in terms of its eclectic and process-based…

  15. Spatial variability of the effect of air pollution on term birth weight: evaluating influential factors using Bayesian hierarchical models. (United States)

    Li, Lianfa; Laurent, Olivier; Wu, Jun


    exposure or higher socioeconomic status with lower vulnerability. Our Bayesian models effectively combined a priori knowledge with training data to infer the posterior association of air pollution with term birth weight and to evaluate the influence of the tract-level factors on spatial variability of such association. This study contributes new findings about non-linear influences of socio-demographic factors, land-use patterns, and unaccounted exposures on spatial variability of the effects of air pollution.

  16. Development and evaluation of a stochastic daily rainfall model with long-term variability

    Directory of Open Access Journals (Sweden)

    A. F. M. K. Chowdhury


    Full Text Available The primary objective of this study is to develop a stochastic rainfall generation model that can match not only the short-resolution (daily) variability but also the longer-resolution (monthly to multiyear) variability of observed rainfall. This study has developed a Markov chain (MC) model, which uses a two-state MC process with two parameters (wet-to-wet and dry-to-dry transition probabilities) to simulate rainfall occurrence and a gamma distribution with two parameters (mean and standard deviation of wet-day rainfall) to simulate wet-day rainfall depths. Starting with the traditional MC-gamma model with deterministic parameters, this study has developed and assessed four other variants of the MC-gamma model with different parameterisations. The key finding is that if the parameters of the gamma distribution are randomly sampled each year from fitted distributions, rather than fixed with time, the variability of rainfall depths at both short and longer temporal resolutions can be preserved, while the variability of wet periods (i.e. the number of wet days and the mean length of wet spells) can be preserved by decadally varied MC parameters. This is a straightforward enhancement to the traditional simplest MC model and is both objective and parsimonious.
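The baseline MC-gamma generator described above can be sketched as follows (transition probabilities and wet-day moments are illustrative placeholders, not fitted values; the paper's enhanced variants would additionally redraw `mean`/`sd` each year and the MC parameters each decade):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_rainfall(n_days, p_ww=0.6, p_dd=0.8, mean=8.0, sd=6.0):
    """Two-state Markov chain for occurrence + gamma depths on wet days.

    The gamma shape/scale are derived from the wet-day mean and standard
    deviation via moment matching, as in the parameterisation the abstract
    describes. All numeric values here are made up for illustration.
    """
    shape = (mean / sd) ** 2
    scale = sd ** 2 / mean
    rain = np.zeros(n_days)
    wet = False
    for t in range(n_days):
        p_wet = p_ww if wet else 1 - p_dd  # transition from yesterday's state
        wet = rng.random() < p_wet
        if wet:
            rain[t] = rng.gamma(shape, scale)
    return rain

series = simulate_rainfall(10000)
wet_days = series > 0
```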

  17. LandSoil, a model for evaluating soil erosion on mid-term agricultural landscape evolution: Sensitivity analysis (United States)

    Ciampalini, R.; Cheviron, B.; Follain, S.; Le Bissonnais, Y.


    Soil-landscape evolution modelling is a widespread research topic; many models have been developed to analyse space-time dynamics in soil redistribution processes. Such modelling presumes both precise input data and affordable models able to represent different land-use scenarios and climatic variations. In this context, we performed a sensitivity analysis of the LandSoil model (Landscape design for Soil conservation under soil use and climate change). This model is designed for the analysis of agricultural landscape evolution at a fine spatial resolution [1-10 meters] and a mid-term temporal scale [10-100 years]. It is spatially distributed, event-based, and considers water and tillage erosion processes. A specific feature of the model is its dynamic representation of the agricultural landscape, with a monthly representation of soil surface properties, and its accounting for the climate component directly through rainfall events. Sensitivity analysis (SA) is a classical tool for evaluating a model's reaction to its different input variables. We investigated the local SA of the model to rainfall inputs, related hydrological fluxes and the specific erosion parameters responsible for diffuse and concentrated soil erosion. Tests analysed multiple combinations of rain amounts and intensities, as well as different runoff conditions within the soil parameter space, using the one-at-a-time and Latin-hypercube resampling methods. Sensitivity to the spatial distributions of erosion parameters was calculated as an index of the numerical spread of soil loss results obtained at the outlet of a virtual catchment endowed with a fixed flow network. The study furnished a ranking of the parameters' sensitivity and provides evidence that some discontinuities in the response are due to non-linearity in the parameterisations.

  18. Numerical modeling of the releases of (90)Sr from Fukushima to the ocean: an evaluation of the source term. (United States)

    Periáñez, R; Suh, Kyung-Suk; Byung-Il, Min; Casacuberta, N; Masqué, P


    A numerical model consisting of a 3D advection/diffusion equation, including uptake/release reactions between water and sediments described in a dynamic way, has been applied to simulate the marine releases of (90)Sr from the Fukushima power plant after the March 2011 tsunami. This is a relevant issue since (90)Sr releases are still occurring. The model used here had been successfully applied to simulate (137)Cs releases. Assuming that the temporal trend of (90)Sr releases was the same as for (137)Cs during the four months after the accident simulated here, the source term could be evaluated, resulting in a total release of 80 TBq of (90)Sr until the end of June, which is in the lower range of previous estimates. Computed vertical profiles of (90)Sr in the water column have been compared with measured ones. The (90)Sr inventories within the model domain have also been calculated for the water column and for bed sediments. Maximum dissolved inventory (obtained for April 10th, 2011) within the model domain results in about 58 TBq. Inventories in bed sediments are 3 orders of magnitude lower than in the water column due to the low reactivity of this radionuclide. (90)Sr/(137)Cs ratios in the ocean have also been calculated and compared with measured values, showing both spatial and temporal variations.
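The scaling argument in the abstract, i.e. impose the 137Cs temporal trend and then fix the amplitude from the evaluated cumulative release, can be sketched numerically. The release curve below is a made-up exponential decline, not the model's actual 137Cs output; only the 80 TBq total comes from the abstract:

```python
import numpy as np

# Hypothetical daily 137Cs release curve (TBq/day) over ~4 months. The real
# temporal trend comes from the authors' earlier 137Cs simulation and is not
# reproduced here; this only illustrates the rescaling step.
days = np.arange(110)
cs137_release = 50 * np.exp(-days / 15)  # assumed shape for illustration

# Assume 90Sr follows the same temporal trend (the paper's key assumption),
# then rescale so the cumulative 90Sr release matches the evaluated 80 TBq.
sr90_total = 80.0
sr90_release = cs137_release * (sr90_total / cs137_release.sum())
```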

  19. Toyotarity. Term, model, range

    Directory of Open Access Journals (Sweden)

    Stanisław Borkowski


    Full Text Available The terms Toyotarity and BOST were presented in the chapter. The BOST method allows one to define relations between material resources and human resources, and between human resources and human resources (TOYOTARITY). This term was also invented by the Author (and is legally protected). The idea of the methodology is an outcome of 12 years of work.

  20. Evaluation of Long-Term Cloud-Resolving Model Simulations Using Satellite Radiance Observations and Multi-Frequency Satellite Simulators (United States)

    Matsui, Toshihisa; Zeng, Xiping; Tao, Wei-Kuo; Masunaga, Hirohiko; Olson, William S.; Lang, Stephen


    This paper proposes a methodology known as the Tropical Rainfall Measuring Mission (TRMM) Triple-Sensor Three-step Evaluation Framework (T3EF) for the systematic evaluation of precipitating cloud types and microphysics in a cloud-resolving model (CRM). T3EF utilizes multi-frequency satellite simulators and novel statistics of multi-frequency radiance and backscattering signals observed from the TRMM satellite. Specifically, T3EF compares CRM and satellite observations in the form of combined probability distributions of precipitation radar (PR) reflectivity, polarization-corrected microwave brightness temperature (Tb), and infrared Tb to evaluate the candidate CRM. T3EF is used to evaluate the Goddard Cumulus Ensemble (GCE) model for cases involving the South China Sea Monsoon Experiment (SCSMEX) and Kwajalein Experiment (KWAJEX). This evaluation reveals that the GCE properly captures the satellite-measured frequencies of different precipitating cloud types in the SCSMEX case but underestimates the frequencies of deep convective and deep stratiform types in the KWAJEX case. Moreover, the GCE tends to simulate excessively large and abundant frozen condensates in deep convective clouds, as inferred from the overestimated GCE-simulated radar reflectivities and microwave Tb depressions. Unveiling the detailed errors in the GCE's performance provides the best direction for model improvements.

  1. Interaction Terms in Nonlinear Models (United States)

    Karaca-Mandic, Pinar; Norton, Edward C; Dowd, Bryan


    Objectives To explain the use of interaction terms in nonlinear models. Study Design We discuss the motivation for including interaction terms in multivariate analyses. We then explain how the straightforward interpretation of interaction terms in linear models changes in nonlinear models, using graphs and equations. We extend the basic results from logit and probit to difference-in-differences models, models with higher powers of explanatory variables, other nonlinear models (including log transformation and ordered models), and panel data models. Empirical Application We show how to calculate and interpret interaction effects using a publicly available Stata data set with a binary outcome. Stata 11 has added several features which make those calculations easier. LIMDEP code also is provided. Conclusions It is important to understand why interaction terms are included in nonlinear models in order to be clear about their substantive interpretation. PMID:22091735
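The central point, that in a logit model the interaction effect is not simply the coefficient on the product term, can be illustrated with the discrete double difference for two binary regressors. The coefficients below are invented for illustration; the Stata and LIMDEP routines the paper mentions compute the same quantity with standard errors:

```python
import math

def logit_prob(b0, b1, b2, b12, x1, x2):
    """P(y=1) in a logit model with an interaction term b12*x1*x2."""
    z = b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2
    return 1 / (1 + math.exp(-z))

def interaction_effect(b0, b1, b2, b12):
    """Double difference for two binary regressors.

    In a nonlinear model this is the quantity of substantive interest, and
    it generally differs in magnitude (and can differ in sign) from b12.
    """
    p = lambda x1, x2: logit_prob(b0, b1, b2, b12, x1, x2)
    return (p(1, 1) - p(1, 0)) - (p(0, 1) - p(0, 0))

# Illustrative coefficients, not from any fitted model:
effect = interaction_effect(b0=-1.0, b1=0.5, b2=0.5, b12=0.3)
```

Here `effect` is well below the raw coefficient 0.3, showing why reading `b12` directly misleads in nonlinear models.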

  2. Internal and external validation strategies for the evaluation of long-term effects in NIR calibration models. (United States)

    Sileoni, Valeria; van den Berg, Frans; Marconi, Ombretta; Perretti, Giuseppe; Fantozzi, Paolo


    Some practical aspects of long-term calibration-set building are presented in this study. A calibration model able to predict the Kolbach index for brewing malt is defined, and four different validation and resampling schemes were applied to determine its real predictive power. The results obtained demonstrate that a single performance criterion might not be sufficient and can lead to over- or underestimation of the model quality. Comparing a simple leave-one-sample-out cross-validation (CV) with two more challenging leave-N-samples-out CVs, in which the resampling was repeated 200 times, it is demonstrated that the error-of-prediction value has an associated uncertainty, and that these values change according to the type and the number of validation samples. Then, two kinds of test-set validation were applied, using data blocks based on the sample collection year, demonstrating that it is necessary to consider long-term effects in NIR calibrations and to be conservative in the number of factors selected. The conclusion is that one should be modest in reporting the prediction error, because it changes according to the type of validation used to estimate it, and it is necessary to consider long-term effects.
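The repeated leave-N-samples-out idea can be sketched on synthetic data. The regression here is ordinary least squares on random predictors, standing in for the study's NIR calibration; the point is the spread of the prediction-error estimate across resamplings:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic calibration data standing in for NIR spectra vs. Kolbach index.
n, p = 60, 10
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(0, 0.5, n)

def rmsep_leave_n_out(X, y, n_out, repeats=200):
    """Repeat a random leave-N-samples-out split and collect RMSEP values.

    Returns the distribution of RMSEP, whose spread is the study's point:
    a single cross-validation number hides this uncertainty.
    """
    errors = []
    for _ in range(repeats):
        test = rng.choice(len(y), size=n_out, replace=False)
        train = np.setdiff1d(np.arange(len(y)), test)
        beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        resid = y[test] - X[test] @ beta
        errors.append(np.sqrt(np.mean(resid ** 2)))
    return np.array(errors)

rmsep = rmsep_leave_n_out(X, y, n_out=10)
```

The standard deviation of `rmsep` quantifies how much a reported prediction error depends on the particular validation split.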

  3. Modelling dynamic transport and adsorption of arsenic in soil-bed filters for long-term performance evaluation (United States)

    Mondal, Sourav; Mondal, Raka; de, Sirshendu; Griffiths, Ian


    Purification of contaminated water following the safe water guidelines while generating sufficiently large throughput is a crucial requirement for the steady supply of safe water to large populations. Adsorption-based filtration using a multilayer soil bed has been posed as a viable method to achieve this goal. This work describes the theory of operation and the prediction of the long-term behaviour of such a system. The fixed-bed column has a single input of contaminated water from the top and an output from the bottom. As the contaminant passes through the column, it is adsorbed by the medium. Like any other adsorption medium, the filter has a certain lifespan, beyond which the filtrate does not meet the safe limit for drinking water; this point is defined as 'breakthrough'. A mathematical model is developed that couples the fluid flow through the porous medium to the convective, diffusive and adsorptive transport of the contaminant. The results are validated against experimental observations, and the model is then used to predict the breakthrough and lifetime of the filter. The key advantage of this model is that it can predict the long-term behaviour of any adsorption column system for any set of physical characteristics of the system. This work was supported by the EPSRC Global Challenge Research Fund Institutional Sponsorship 2016.
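A minimal numerical sketch of the breakthrough prediction, using an explicit upwind scheme for 1-D advection plus an adsorption sink that saturates as the bed fills. All parameter values are illustrative placeholders, not the soil-bed properties from the study:

```python
import numpy as np

def breakthrough_time(length=1.0, velocity=1e-3, k_ads=2e-3, q_max=0.5,
                      inlet=1.0, limit=0.1, dt=5.0, nz=100):
    """Time until the outlet concentration exceeds `limit * inlet`.

    1-D plug flow with a first-order adsorption sink that shuts off as the
    bed saturates (q -> q_max), marched with an explicit upwind scheme.
    Units are arbitrary; parameters are assumptions for illustration.
    """
    dz = length / nz
    assert velocity * dt / dz <= 1.0  # CFL condition for the upwind step
    c = np.zeros(nz)   # dissolved concentration along the column
    q = np.zeros(nz)   # adsorbed amount per unit bed volume
    t = 0.0
    while c[-1] < limit * inlet:
        rate = k_ads * c * (1.0 - q / q_max)  # uptake slows as sites fill
        adv = np.empty(nz)
        adv[0] = -velocity * (c[0] - inlet) / dz
        adv[1:] = -velocity * (c[1:] - c[:-1]) / dz
        c = c + dt * (adv - rate)
        q = q + dt * rate
        t += dt
    return t

t_break = breakthrough_time()
```

Sweeping the physical parameters of this kind of scheme is how a lifetime prediction for a given column design would be obtained.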

  4. Long term morphological modelling

    DEFF Research Database (Denmark)

    Kristensen, Sten Esbjørn; Deigaard, Rolf; Taaning, Martin


    in the surf zone. Two parameterization schemes are tested for two different morphological phenomena: 1) Shoreline changes due to the presence of coastal structures and 2) alongshore migration of a nearshore nourishment and a bar by-passing a harbour. In the case of the shoreline evolution calculations......, a concept often used in one-line modelling of cross-shore shifting of an otherwise constant shape cross-shore profile is applied for the case of a groyne and a detached breakwater. In the case of alongshore bar/nourishment migration an alternative parameterization is adopted. All examples are presented......, analysed and discussed with respect to the question of realistic representation, time scale and general applicability of the model concept....

  5. Financing institutional long-term care for the elderly in China: a policy evaluation of new models. (United States)

    Yang, Wei; Jingwei He, Alex; Fang, Lijie; Mossialos, Elias


    A rapidly ageing population coupled with changes in family structure has brought about profound implications for social policy in China. Although the past decade has seen a steady increase in public funding for long-term care (LTC), the narrow financing base and vast population have created significant unmet demand, calling for reforms in financing. This paper focuses on the financing of institutional LTC by examining new models that have emerged from local policy experiments against two policy goals: equity and efficiency. Three emerging models are explored: Social Health Insurance (SHI) in Shanghai, LTC Nursing Insurance (LTCNI) in Qingdao and a means-tested model in Nanjing. A focused systematic narrative review of academic and grey literature is conducted to identify and assess these models, supplemented with qualitative interviews with government officials from relevant departments, care home staff and service users. This paper argues that, although SHI appears to be a convenient solution to fund LTC, this model has led to systematic bias in affordable access among participants of different insurance schemes, and has created a powerful incentive for the over-provision of unnecessary services. The means-tested method has been remarkably constrained by narrow eligibility and insufficient funding resources. The LTCNI model is by far the most desirable policy option among the three studied here, but its narrow definition of eligibility has substantively excluded a large proportion of elders in need from access to care, which needs to be addressed in future reforms. This paper proposes three lines of LTC financing reform for policy-makers: (1) the establishment of a prepaid financing mechanism pooled specifically for LTC costs; (2) the incorporation of more stringent eligibility rules and needs assessment; and (3) reform of the dominant fee-for-service methods of paying LTC service providers.

  6. Evaluation and analysis of term scoring methods for term extraction

    NARCIS (Netherlands)

    Verberne, S.; Sappelli, M.; Hiemstra, D.; Kraaij, W.


    We evaluate five term scoring methods for automatic term extraction on four different types of text collections: personal document collections, news articles, scientific articles and medical discharge summaries. Each collection has its own use case: author profiling, boolean query term suggestion,
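As one concrete instance of a term-scoring method of the kind compared (TF-IDF is shown here for illustration; it is not necessarily among the five methods the paper actually evaluates):

```python
from collections import Counter
import math

def tfidf_scores(docs):
    """Score each term in each document by term frequency x inverse
    document frequency, a standard termhood/importance heuristic."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # document frequency: one count per document
    scores = []
    for doc in docs:
        tf = Counter(doc)
        scores.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return scores

# Toy token lists standing in for the four collection types in the paper.
docs = [["term", "extraction", "term"],
        ["query", "suggestion"],
        ["term", "query"]]
scores = tfidf_scores(docs)
```

Ranking each document's terms by these scores yields candidate terms for the use cases listed (e.g. query term suggestion).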

  7. Evaluation for Long Term PM10 Concentration Forecasting using Multi Linear Regression (MLR) and Principal Component Regression (PCR) Models

    Directory of Open Access Journals (Sweden)

    Samsuri Abdullah


    Full Text Available Air pollution in Peninsular Malaysia is dominated by particulate matter, which is demonstrated by it having the highest Air Pollution Index (API) value compared to the other pollutants in most parts of the country. The development of Particulate Matter (PM10) forecasting models is crucial because it allows the authorities and citizens of a community to take necessary actions to limit their exposure to harmful levels of particulate pollution and to implement protection measures that significantly improve air quality at designated locations. This study aims at improving the ability of MLR by using PC inputs for PM10 concentration forecasting. Daily observations of PM10 in Kuala Terengganu, Malaysia from January 2003 till December 2011 were utilized to forecast PM10 concentration levels. MLR and PCR (using PC inputs) models were developed and their performance was evaluated using RMSE, NAE and IA. Results revealed that PCR performed better than MLR due to the implementation of PCA, which reduces complexity and eliminates data multi-collinearity.
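A sketch of the PCR idea on synthetic collinear data. The predictors stand in for the study's meteorological inputs; nothing here uses the Kuala Terengganu data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic predictors with two nearly collinear columns, mimicking the
# multi-collinearity problem PCA is meant to remove.
n = 300
base = rng.normal(size=(n, 2))
X = np.column_stack([base[:, 0],
                     base[:, 0] * 0.95 + rng.normal(0, 0.1, n),
                     base[:, 1]])
y = X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.3, n)

def pcr_fit(X, y, n_components):
    """Principal component regression: regress y on the leading PCs of X,
    which sidesteps the unstable coefficients ordinary MLR gives when
    predictors are collinear."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)  # rows of vt = PCs
    scores = Xc @ vt[:n_components].T
    coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
    return y.mean() + scores @ coef

pred = pcr_fit(X, y, n_components=2)
rmse = np.sqrt(np.mean((y - pred) ** 2))
```

Dropping the smallest-variance component discards the direction responsible for the collinearity while keeping nearly all of the predictive signal.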

  8. Health economic modeling to assess short-term costs of maternal overweight, gestational diabetes, and related macrosomia - a pilot evaluation. (United States)

    Lenoir-Wijnkoop, Irene; van der Beek, Eline M; Garssen, Johan; Nuijten, Mark J C; Uauy, Ricardo D


    Despite the interest in the impact of overweight and obesity on public health, little is known about the social and economic impact of being born large for gestational age or macrosomic. Both conditions are related to maternal obesity and/or gestational diabetes mellitus (GDM) and associated with increased morbidity for mother and child in the perinatal period. Poorly controlled diabetes during pregnancy, pre-pregnancy maternal obesity and/or excessive maternal weight gain during pregnancy are associated with intermittent periods of fetal exposure to hyperglycemia and subsequent hyperinsulinemia, leading to increased birth weight (e.g., macrosomia), body adiposity, and glycogen storage in the liver. Macrosomia is associated with an increased risk of developing obesity and type 2 diabetes mellitus later in life. Objective: to provide insight into the short-term health-economic impact of maternal overweight, GDM, and related macrosomia. To this end, a health economic framework was designed. This pilot study also aims to encourage further health technology assessments, based on country- and population-specific data. The estimation of the direct health-economic burden of maternal overweight, GDM and related macrosomia indicates that associated healthcare expenditures are substantial. The calculation of a budget impact of GDM, based on a conservative approach of our model, using USA costing data, indicates an annual cost of more than $1.8 billion without taking into account long-term consequences. Although overweight and obesity are a recognized concern worldwide, less attention has been given to the health economic consequences of these conditions in women of child-bearing age and their offspring. The presented outcomes underline the need for preventive management strategies and public health interventions on life style, diet and physical activity. Also, the predisposition in people of Asian ethnicity to develop diabetes emphasizes the urgent need to collect more country

  9. Redintegration and the Benefits of Long-Term Knowledge in Verbal Short-Term Memory: An Evaluation of Schweickert's (1993) Multinomial Processing Tree Model (United States)

    Thorn, Annabel S. C.; Gathercole, Susan E.; Frankish, Clive R.


    The impact of four long-term knowledge variables on serial recall accuracy was investigated. Serial recall was tested for high and low frequency words and high and low phonotactic frequency nonwords in 2 groups: monolingual English speakers and French-English bilinguals. For both groups the recall advantage for words over nonwords reflected more…

  10. An evaluation of Bayesian techniques for controlling model complexity and selecting inputs in a neural network for short-term load forecasting. (United States)

    Hippert, Henrique S; Taylor, James W


    Artificial neural networks have frequently been proposed for electricity load forecasting because of their capabilities for the nonlinear modelling of large multivariate data sets. Modelling with neural networks is not an easy task though; two of the main challenges are defining the appropriate level of model complexity, and choosing the input variables. This paper evaluates techniques for automatic neural network modelling within a Bayesian framework, as applied to six samples containing daily load and weather data for four different countries. We analyse input selection as carried out by the Bayesian 'automatic relevance determination', and the usefulness of the Bayesian 'evidence' for the selection of the best structure (in terms of number of neurones), as compared to methods based on cross-validation. Copyright 2009 Elsevier Ltd. All rights reserved.
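The Bayesian "automatic relevance determination" mechanism the paper evaluates can be sketched in its simplest linear-regression form: each input receives its own prior precision, updated by evidence maximization, and inputs whose precision diverges are pruned as irrelevant. This is a toy analogue on synthetic data using MacKay's fixed-point updates, not the authors' neural-network implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 3 relevant inputs, 3 irrelevant ones. The paper applies ARD to
# the weather/load inputs of a neural network; a linear model shows the
# same pruning mechanism.
N, D = 200, 6
X = rng.normal(size=(N, D))
w_true = np.array([1.0, -0.7, 0.5, 0.0, 0.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=N)

alpha = np.ones(D)   # one prior precision per input (the ARD prior)
beta = 1.0           # noise precision
for _ in range(50):
    Sigma = np.linalg.inv(beta * X.T @ X + np.diag(alpha))  # posterior cov
    m = beta * Sigma @ X.T @ y                              # posterior mean
    gamma = 1.0 - alpha * np.diag(Sigma)   # "well-determined" parameter count
    alpha = np.minimum(gamma / np.maximum(m ** 2, 1e-12), 1e6)  # MacKay update
    beta = (N - gamma.sum()) / np.sum((y - X @ m) ** 2)

# A large precision drives the corresponding weight to zero: input pruned.
relevant = alpha < 100.0
print("judged relevant:", relevant)
```

In the paper's setting the same hyperparameter-per-input idea is applied to the input-to-hidden weights of the network, and the marginal likelihood ("evidence") plays the structure-selection role that this sketch only hints at.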

  11. New Rapid Evaluation for Long-Term Behavior in Deep Geological Repository by Geotechnical Centrifuge—Part 2: Numerical Simulation of Model Tests in Isothermal Condition (United States)

    Sawada, Masataka; Nishimoto, Soshi; Okada, Tetsuji


    In high-level radioactive waste disposal repositories, there are long-term complex thermal, hydraulic, and mechanical (T-H-M) phenomena that involve the generation of heat from the waste, the infiltration of ground water, and swelling of the bentonite buffer. The ability to model such coupled phenomena is of particular importance to the repository design and assessments of its safety. We have developed a T-H-M-coupled analysis program that evaluates the long-term behavior around the repository (called "near-field"). We have also conducted centrifugal model tests that model the long-term T-H-M-coupled behavior in the near-field. In this study, we conduct H-M-coupled numerical simulations of the centrifugal near-field model tests. We compare numerical results with each other and with results obtained from the centrifugal model tests. From the comparison, we deduce that: (1) in the numerical simulation, water infiltration in the rock mass was in agreement with the experimental observation. (2) The constant-stress boundary condition in the centrifugal model tests may cause a larger expansion of the rock mass than in the in situ condition, but the mechanical boundary condition did not affect the buffer behavior in the deposition hole. (3) The numerical simulation broadly reproduced the measured bentonite pressure and the overpack displacement, but did not reproduce the decreasing trend of the bentonite pressure after 100 equivalent years. This indicates the effect of the time-dependent characteristics of the surrounding rock mass. Further investigations are needed to determine the effect of initial heterogeneity in the deposition hole and the time-dependent behavior of the surrounding rock mass.

  12. Using targeted short-term field investigations to calibrate and evaluate the structure of a hydrological model

    CSIR Research Space (South Africa)

    Hughes, DA


    … catchments and are applied in a daily version of the model. The results demonstrate the importance of ensuring that field observations are measuring the same hydrological variables as the model simulations. At one study site, there was a mismatch in the soil...

  13. QoS Performance Analysis: Design and Development of Voice and Video Mobility over Long Term Evolution (LTE) Model


    Mahmud, Shahrear; Chowdhury, Md.Sadat Hossain


    The evolution of 3G systems has contributed a significant amount of progress towards the 4th generation wireless technology, Long Term Evolution (LTE). On the other hand, demand for more bandwidth has been evidenced by the ever-growing usage of real-time applications such as video conferencing. For instance, users expect a reliable and efficient connection when they are on the go, maintaining a minimum quality for the video conference. In order to meet these challenges, the QoS of LTE makes it an...

  14. Evaluation of the E mu-pim-1 transgenic mouse model for short-term carcinogenicity testing

    DEFF Research Database (Denmark)

    van Kreijl, C. F.; van Oordt, C. W. V.; Kroese, E. D.


    The value of the chronic rodent carcinogenicity assay in adequately predicting cancer risk in humans has become a matter of debate over the past few years. Therefore, more rapid and accurate alternative tests are urgently needed. Transgenic mouse models, those harboring genetic changes that are relevant to the multistage cancer process, may provide such alternative tests. Transgenic E mu-pim-1 mice, developed by Berns and coworkers in 1989, contain the pim-1 oncogene, which is expressed at elevated levels in their lymphoid compartments. As a result, these mice are predisposed to the development...

  15. Integrated Assessment Model Evaluation (United States)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.


    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior.
A key step is the recognition of model boundaries, that is, what is inside


    Directory of Open Access Journals (Sweden)

    N. Kovtun


    The paper studies the methodological principles and guidance for the statistical evaluation of terms of trade under the United Nations classification model, the Harmonized Commodity Description and Coding System (HS). The practical implementation of the proposed three-stage model of index analysis and estimation of terms of trade is realized for Ukraine's commodity groups for the period 2011-2012.

  17. Character process model for semen volume in AI rams: evaluation of correlation structures for long and short-term environmental effects

    Directory of Open Access Journals (Sweden)

    Robert-Granié Christèle


    The objective of this study was to build a character process model taking into account serial correlations for the analysis of repeated measurements of semen volume in AI rams. For each ram, measurements were repeated within and across years. Therefore, we considered a model including three environmental effects: the long-term environmental effect, which is a random year-by-subject effect; the short-term environmental effect, which is a random within-year subject-by-collection effect; and the classical measurement error. We used a four-step approach to build the model. The first step explored the serial correlations graphically. The second step compared four models with different correlation structures for the short-term environmental effect. We selected fixed effects in the third step. In the fourth step, we compared four correlation structures for the long-term environmental effect. The model that fitted the data best used a spatial power correlation structure for the short-term environmental effect and a first-order autoregressive process for the long-term environmental effect. The heritability estimate was 0.27 (0.04), the within-year repeatability decreased from 0.56 to 0.44 and the repeatability across years decreased from 0.43 to 0.37.
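The two correlation structures retained by the model can be written down directly: a first-order autoregressive structure for equally spaced years, and a spatial power structure that generalizes it to the unequally spaced collection times within a year. A minimal numeric sketch, with illustrative ρ values rather than the paper's estimates:

```python
import numpy as np

def ar1_corr(n, rho):
    """First-order autoregressive correlation, rho**|i-j|, for n equally
    spaced measurements (here: consecutive years)."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

def spatial_power_corr(times, rho):
    """Spatial power correlation, rho**|t_i - t_j|, which extends AR(1)
    to unequally spaced measurements (here: collections within a year)."""
    t = np.asarray(times, dtype=float)
    return rho ** np.abs(t[:, None] - t[None, :])

# Long-term effect across 4 consecutive years (illustrative rho).
R_years = ar1_corr(4, rho=0.8)

# Short-term effect for collections on days 0, 10, 15 and 40 of one year.
R_within = spatial_power_corr([0, 10, 15, 40], rho=0.98)

print(np.round(R_years, 3))
print(np.round(R_within, 3))
```

Correlation decays with lag in both structures; the spatial power form simply measures lag in elapsed time instead of measurement index, which is why it suits irregular within-year collection schedules.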

  18. An interfacial shear term evaluation study for adiabatic dispersed air–water two-phase flow with the two-fluid model using CFD

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, S.L., E-mail: [School of Nuclear Engineering, Purdue University, West Lafayette, IN (United States); Hibiki, T.; Ishii, M. [School of Nuclear Engineering, Purdue University, West Lafayette, IN (United States); Schlegel, J.P. [Department of Mining and Nuclear Engineering, Missouri University of Science and Technology, Rolla, MO (United States); Buchanan, J.R.; Hogan, K.J. [Bettis Laboratory, Naval Nuclear Laboratory, West Mifflin, PA (United States); Guilbert, P.W. [ANSYS UK Ltd, Oxfordshire (United Kingdom)


    Highlights: • Closure form of the interfacial shear term in three-dimensional form is investigated. • Assessment against adiabatic upward bubbly air–water flow data using CFD. • Effect of addition of the interfacial shear term on the phase distribution. - Abstract: In commercially available Computational Fluid Dynamics (CFD) codes such as ANSYS CFX and Fluent, the interfacial shear term is missing in the field momentum equations. The derivation of the two-fluid model (Ishii and Hibiki, 2011) indicates the presence of this term as a momentum source in the right hand side of the field momentum equation. The inclusion of this term is considered important for proper modeling of the interfacial momentum coupling between phases. For separated flows, such as annular flow, the importance of the shear term is understood in the one-dimensional (1-D) form as the major mechanism by which the wall shear is transferred to the gas phase (Ishii and Mishima, 1984). For gas dispersed two-phase flow CFD simulations, it is important to assess the significance of this term in the prediction of phase distributions. In the first part of this work, the closure of this term in three-dimensional (3-D) form in a CFD code is investigated. For dispersed gas–liquid flow, such as bubbly or churn-turbulent flow, bubbles are dispersed in the shear layer of the continuous phase. The continuous phase shear stress is mainly due to the presence of the wall and the modeling of turbulence through the Boussinesq hypothesis. In a 3-D simulation, the continuous phase shear stress can be calculated from the continuous fluid velocity gradient, so that the interfacial shear term can be closed using the local values of the volume fraction and the total stress of liquid phase. This form also assures that the term acts as an action-reaction force for multiple phases. In the second part of this work, the effect of this term on the volume fraction distribution is investigated. For testing the model two

  19. The IIR evaluation model

    DEFF Research Database (Denmark)

    Borlund, Pia


    An alternative approach to the evaluation of interactive information retrieval (IIR) systems, referred to as the IIR evaluation model, is proposed. The model provides a framework for the collection and analysis of IR interaction data. The aim of the model is two-fold: 1) to facilitate the evaluation of IIR systems as realistically as possible with reference to actual information searching and retrieval processes, though still in a relatively controlled evaluation environment; and 2) to calculate the IIR system performance taking into account the non-binary nature of the assigned relevance assessments. The IIR evaluation model is presented as an alternative to the system-driven Cranfield model (Cleverdon, Mills & Keen, 1966; Cleverdon & Keen, 1966), which is still the dominant approach to the evaluation of IR and IIR systems. Key elements of the IIR evaluation model are the use of realistic...

  20. BWR Source Term Generation and Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    J.C. Ryman


    This calculation is a revision of a previous calculation (Ref. 7.5) that bears the same title and has the document identifier BBAC00000-01717-0210-00006 REV 01. The purpose of this revision is to remove TBV (to-be-verified)-4110 associated with the output files of the previous version (Ref. 7.30). The purpose of this and the previous calculation is to generate source terms for a representative boiling water reactor (BWR) spent nuclear fuel (SNF) assembly for the first one million years after the SNF is discharged from the reactors. This calculation includes an examination of several ways to represent BWR assemblies and operating conditions in SAS2H in order to quantify the effects these representations may have on source terms. These source terms provide information characterizing the neutron and gamma spectra in particles per second, the decay heat in watts, and radionuclide inventories in curies. Source terms are generated for a range of burnups and enrichments (see Table 2) that are representative of the waste stream and stainless steel (SS) clad assemblies. During this revision, it was determined that the burnups used for the computer runs of the previous revision were actually about 1.7% less than the stated, or nominal, burnups. See Section 6.6 for a discussion of how to account for this effect before using any source terms from this calculation. The source term due to the activation of corrosion products deposited on the surfaces of the assembly from the coolant is also calculated. The results of this calculation support many areas of the Monitored Geologic Repository (MGR), which include thermal evaluation, radiation dose determination, radiological safety analyses, surface and subsurface facility designs, and total system performance assessment. This includes MGR items classified as Quality Level 1, for example, the Uncanistered Spent Nuclear Fuel Disposal Container (Ref. 7.27, page 7). Therefore, this calculation is subject to the requirements of the

  1. Decision-analytic modeling to evaluate the long-term effectiveness and cost-effectiveness of HPV-DNA testing in primary cervical cancer screening in Germany

    Directory of Open Access Journals (Sweden)

    Krämer, Alexander


    Background: Persistent infections with high-risk types of human papillomavirus (HPV) are associated with the development of cervical neoplasia. Compared to cytology, HPV testing is more sensitive in detecting high-grade cervical cancer precursors, but with lower specificity. HPV-based primary screening for cervical cancer is currently discussed in Germany. Decisions should be based on a systematic evaluation of the long-term effectiveness and cost-effectiveness of HPV-based primary screening. Research questions: What is the long-term clinical effectiveness (reduction in lifetime risk of cervical cancer and death due to cervical cancer, life years gained) of HPV testing, and what is the cost-effectiveness in Euro per life year gained (LYG) of including HPV testing in primary cervical cancer screening in the German health care context? How can the screening program be improved with respect to test combination, age at start and end of screening, and screening interval, and which recommendations should be made for the German health care context? Methods: A previously published and validated decision-analytic model for the German health care context was extended and adapted to the natural history of HPV infection and cervical cancer in order to evaluate different screening strategies that differ by screening interval and tests, including cytology alone, HPV testing alone or in combination with cytology, and HPV testing with cytology triage for HPV-positive women. German clinical, epidemiological and economic data were used. In the absence of individual data, screening adherence was modelled independently from screening history. Test accuracy data were retrieved from international meta-analyses. Predicted outcomes included reduction in lifetime risk for cervical cancer cases and deaths, life expectancy, lifetime costs, and discounted incremental cost-effectiveness ratios (ICER). The perspective of the third party payer and a 3% annual discount rate were
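The discounted incremental cost-effectiveness ratios such models report compare each non-dominated strategy with the next-cheapest one. A minimal sketch with invented cost and life-expectancy numbers, not the study's German data:

```python
# Illustrative (invented) discounted lifetime cost per woman and discounted
# life expectancy for three screening strategies; not the study's results.
strategies = {
    "cytology alone":           {"cost": 310.0, "ly": 28.950},
    "HPV with cytology triage": {"cost": 340.0, "ly": 28.960},
    "HPV plus cytology":        {"cost": 420.0, "ly": 28.962},
}

# Order by cost, keep only non-dominated strategies, then report each
# incremental cost-effectiveness ratio (ICER) against the previous one.
ordered = sorted(strategies.items(), key=lambda kv: kv[1]["cost"])
frontier = [ordered[0]]
for name, res in ordered[1:]:
    if res["ly"] > frontier[-1][1]["ly"]:   # strictly more effective
        frontier.append((name, res))

for (n0, r0), (n1, r1) in zip(frontier, frontier[1:]):
    icer = (r1["cost"] - r0["cost"]) / (r1["ly"] - r0["ly"])
    print(f"{n1} vs {n0}: {icer:,.0f} EUR per LYG")
```

A full analysis would also eliminate extended dominance and apply the stated 3% discounting inside the Markov model; this sketch covers only the final ratio step.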

  2. Models of Short-Term Synaptic Plasticity. (United States)

    Barroso-Flores, Janet; Herrera-Valdez, Marco A; Galarraga, Elvira; Bargas, José


    We focus on dynamical descriptions of short-term synaptic plasticity. Instead of focusing on the molecular machinery that has been reviewed recently by several authors, we concentrate on the dynamics and functional significance of synaptic plasticity, and review some mathematical models that reproduce different properties of the dynamics of short-term synaptic plasticity that have been observed experimentally. The complexity and shortcomings of these models point to the need for simple, yet physiologically meaningful models. We propose a simplified model to be tested in synapses displaying different types of short-term plasticity.
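A widely used dynamical model of the kind reviewed here is the Tsodyks-Markram description, in which a resource variable depresses and a utilization variable facilitates between spikes. The sketch below uses one common update convention and purely illustrative parameters; it is not the simplified model the authors propose:

```python
import math

def tsodyks_markram(spike_times, U=0.5, tau_rec=0.8, tau_fac=0.05):
    """Relative synaptic efficacy at each spike time under a common form of
    the Tsodyks-Markram model: x = available resources (depression),
    u = utilization (facilitation). Parameters are illustrative."""
    x, u = 1.0, U
    last_t = None
    amplitudes = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)  # resources recover
            u = U + (u - U) * math.exp(-dt / tau_fac)      # facilitation decays
        amplitudes.append(u * x)   # released fraction ~ postsynaptic amplitude
        x -= u * x                 # this spike consumes resources
        u += U * (1.0 - u)         # and increments utilization
        last_t = t
    return amplitudes

# A 20 Hz train: with these parameters depression dominates, so the
# response amplitude declines across the train.
train = [i * 0.05 for i in range(5)]
print([round(a, 3) for a in tsodyks_markram(train)])
```

Shrinking U and lengthening tau_fac shifts the same equations toward facilitating behaviour, which is how one parameterization covers the different plasticity types mentioned in the abstract.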

  3. Introducing Program Evaluation Models

    Directory of Open Access Journals (Sweden)

    Raluca GÂRBOAN


    Program and project evaluation models can be extremely useful in project planning and management. The aim is to ask the right questions as soon as possible in order to see in time and deal with unwanted program effects, as well as to encourage the positive elements of the project impact. In short, different evaluation models are used in order to minimize losses and maximize the benefits of interventions upon small or large social groups. This article introduces some of the most recently used evaluation models.

  4. Baby Skyrme models without a potential term (United States)

    Ashcroft, Jennifer; Haberichter, Mareike; Krusch, Steffen


    We develop a one-parameter family of static baby Skyrme models that do not require a potential term to admit topological solitons. This is a novel property as the standard baby Skyrme model must contain a potential term in order to have stable soliton solutions, though the Skyrme model does not require this. Our new models satisfy an energy bound that is linear in terms of the topological charge and can be saturated in an extreme limit. They also satisfy a virial theorem that is shared by the Skyrme model. We calculate the solitons of our new models numerically and observe that their form depends significantly on the choice of parameter. In one extreme, we find compactons while at the other there is a scale invariant model in which solitons can be obtained exactly as solutions to a Bogomolny equation. We provide an initial investigation into these solitons and compare them with the baby Skyrmions of other models.

  5. Brand Equity Evaluation Model

    National Research Council Canada - National Science Library

    Migle Eleonora Cernikovaite


    ...) benefits to the consumer. This article aims to assess how the international dimension is reflected in the various models of brand equity, analyze Lithuanian actualities in managing and evaluating brands, also to reveal...

  6. Virtual Models of Long-Term Care (United States)

    Phenice, Lillian A.; Griffore, Robert J.


    Nursing homes, assisted living facilities and home-care organizations, use web sites to describe their services to potential consumers. This virtual ethnographic study developed models representing how potential consumers may understand this information using data from web sites of 69 long-term-care providers. The content of long-term-care web…

  7. Using a full annual cycle model to evaluate long-term population viability of the conservation-reliant Kirtland's warbler after successful recovery (United States)

    Brown, Donald J.; Ribic, Christine; Donner, Deahn M.; Nelson, Mark D.; Bocetti, Carol I.; Deloria-Sheffield, Christie M.


    Long-term management planning for conservation-reliant migratory songbirds is particularly challenging because habitat quality in different stages and geographic locations of the annual cycle can have direct and carry-over effects that influence the population dynamics. The Neotropical migratory songbird Kirtland's warbler Setophaga kirtlandii (Baird 1852) is listed as endangered under the U.S. Endangered Species Act and Near Threatened under the IUCN Red List. This conservation-reliant species is being considered for U.S. federal delisting because the species has surpassed the designated 1000 breeding pairs recovery threshold since 2001. To help inform the delisting decision and long-term management efforts, we developed a population simulation model for the Kirtland's warbler that incorporated both breeding and wintering grounds habitat dynamics, and projected population viability based on current environmental conditions and potential future management scenarios. Future management scenarios included the continuation of current management conditions, reduced productivity and carrying capacity due to the changes in habitat suitability from the creation of experimental jack pine Pinus banksiana (Lamb.) plantations, and reduced productivity from alteration of the brown-headed cowbird Molothrus ater (Boddaert 1783) removal programme. Linking wintering grounds precipitation to productivity improved the accuracy of the model for replicating past observed population dynamics. Our future simulations indicate that the Kirtland's warbler population is stable under two potential future management scenarios: (i) continuation of current management practices and (ii) spatially restricting cowbird removal to the core breeding area, assuming that cowbirds reduce productivity in the remaining patches by ≤41%. The additional future management scenarios we assessed resulted in population declines. Synthesis and applications.
Our study indicates that the Kirtland's warbler population

  8. A new model to evaluate the long-term cost effectiveness of orphan and highly specialised drugs following listing on the Australian Pharmaceutical Benefits Scheme: the Bosentan Patient Registry. (United States)

    Owen, Aj; Spinks, J; Meehan, A; Robb, T; Hardy, M; Kwasha, D; Wlodarczyk, J; Reid, Cm


    Pharmaceutical subsidy schemes are under increasing pressure to evaluate the cost effectiveness of new highly specialised and orphan drugs for universal subsidy. In the absence of longer-term outcome data, drug sponsors often present modelled data, which can carry a significant level of uncertainty over longer-term projections. Risk-sharing schemes between drug sponsor and government may provide an acceptable method of balancing the uncertainty of longer-term cost effectiveness with the public demand for equitable and timely access to new drugs. The Bosentan Patient Registry (BPR) is an example of a unique risk-sharing model utilised in Australia aiming to provide clinical evidence to support the modelled predictions, with the registry survival outcomes linked to future price. Concomitant medication, health and vital status data was collected from clinicians, government health departments and death registries. The BPR has identified a number of issues surrounding registry governance, ethics and patient privacy, and the collection of timely and accurate data, which need to be addressed for the development of a generic registry model for systematic evaluation. The success of a generic drug registry model based on the BPR will be enhanced by addressing a number of operational issues identified during the implementation of this project.

  9. A Biofilm Pocket Model to Evaluate Different Non-Surgical Periodontal Treatment Modalities in Terms of Biofilm Removal and Reformation, Surface Alterations and Attachment of Periodontal Ligament Fibroblasts (United States)

    Hägi, Tobias T.; Klemensberger, Sabrina; Bereiter, Riccarda; Nietzsche, Sandor; Cosgarea, Raluca; Flury, Simon; Lussi, Adrian; Sculean, Anton; Eick, Sigrun


    Background and Aim There is a lack of suitable in vitro models to evaluate various treatment modalities intending to remove subgingival bacterial biofilm. Consequently, the aims of this in vitro study were: a) to establish a pocket model enabling mechanical removal of biofilm and b) to evaluate repeated non-surgical periodontal treatment with respect to biofilm removal and reformation, surface alterations, tooth hard-substance loss, and attachment of periodontal ligament (PDL) fibroblasts. Material and Methods Standardized human dentin specimens were colonized by multi-species biofilms for 3.5 days and subsequently placed into artificially created pockets. Non-surgical periodontal treatment was performed as follows: a) hand-instrumentation with curettes (CUR), b) ultrasonication (US), c) subgingival air-polishing using erythritol (EAP) and d) subgingival air-polishing using erythritol combined with chlorhexidine digluconate (EAP-CHX). The reduction and recolonization of bacterial counts, surface roughness (Ra and Rz), the caused tooth substance loss (thickness) as well as the attachment of PDL fibroblasts were evaluated and statistically analyzed by means of ANOVA with Post-Hoc LSD. Results After 5 treatments, bacterial reduction in biofilms was highest when applying EAP-CHX (4 log10). The lowest reduction was found after CUR (2 log10). Additionally, substance loss was the highest when using CUR (128±40 µm) in comparison with US (14±12 µm), EAP (6±7 µm) and EAP-CHX (11±10 µm). Surfaces were roughened when using CUR and US. Surfaces exposed to US and to EAP attracted the highest numbers of PDL fibroblasts. Conclusion The established biofilm model simulating a periodontal pocket combined with interchangeable placements of test specimens with multi-species biofilms enables the evaluation of different non-surgical treatment modalities on biofilm removal and surface alterations. Compared to hand instrumentation the application of ultrasonication and of air

  10. Evaluation of melphalan, oxaliplatin, and paclitaxel in colon, liver, and gastric cancer cell lines in a short-term exposure model of chemosaturation therapy by percutaneous hepatic perfusion. (United States)

    Uzgare, Rajneesh P; Sheets, Timothy P; Johnston, Daniel S


    The goal of this study was to determine whether liver, gastric, or colonic cancer may be suitable targets for chemosaturation therapy with percutaneous hepatic perfusion (CS-PHP) and to assess the feasibility of utilizing other cytotoxic agents besides melphalan in the CS-PHP system. Forty human cell lines were screened against three cytotoxic chemotherapeutic agents. Specifically, the dose-dependent effect of melphalan, oxaliplatin, and paclitaxel on proliferation and apoptosis in each cell line was evaluated. These agents were also evaluated for their ability to induce apoptosis in normal primary human hepatocytes. A high-dose short-term drug exposure protocol was employed to simulate conditions encountered during CS-PHP. The average concentration of melphalan required for inducing significant apoptosis was 61 μM, or about 3-fold less than the theoretical concentration of 192 μM achieved in the hepatic artery during CS-PHP dosing with melphalan. Additionally, we found that gastric cancer cell lines were 2-5-fold more sensitive than liver cancer cell lines to apoptosis induced by all three compounds, suggesting that in addition to colonic and gastric cancer metastases to the liver, primary gastric cancer may also be amenable to management by CS-PHP using an appropriate therapeutic agent. Significantly, at concentrations that are predicted using the CS-PHP system, these agents caused apoptosis of colonic, gastric, and liver cancer cells but were not toxic to primary human hepatocytes. The compounds tested are potential candidates for use in the CS-PHP system to treat patients with gastric and colonic metastases, and primary cancer of the liver.

  11. Health economic modelling to assess short-term costs of maternal overweight, gestational diabetes and related macrosomia – a pilot evaluation

    Directory of Open Access Journals (Sweden)

    Irene Lenoir-Wijnkoop


    Background: Despite the interest in the impact of overweight and obesity on public health, little is known about the social and economic impact of being born large for gestational age or macrosomic. Both conditions are related to maternal obesity and/or gestational diabetes (GDM) and associated with increased morbidity for mother and child in the perinatal period. Poorly controlled diabetes during pregnancy, pre-pregnancy maternal obesity and/or excessive maternal weight gain during pregnancy are associated with intermittent periods of fetal exposure to hyperglycemia and subsequent hyperinsulinemia, leading to increased birth weight (e.g. macrosomia), body adiposity and glycogen storage in the liver. Macrosomia is associated with an increased risk of developing obesity and type 2 diabetes mellitus later in life. Objective: Provide insight into the short-term health-economic impact of maternal overweight, GDM and related macrosomia. To this end, a health economic framework was designed. This pilot study also aims to encourage further health technology assessments, based on country- and population-specific data. Results: The estimation of the direct health-economic burden of maternal overweight, GDM and related macrosomia indicates that associated healthcare expenditures are substantial. The calculation of a budget impact of GDM, based on a conservative approach of our model, using USA costing data, indicates an annual cost of more than $1.8 billion without taking into account long-term consequences. Conclusion: Although overweight and obesity are a recognized concern worldwide, less attention has been given to the health economic consequences of these conditions in women of child-bearing age and their offspring. The presented outcomes underline the need for preventive management strategies and public health interventions on life style, diet and physical activity. Also, the predisposition in people of Asian ethnicity to develop

  12. Health economic modeling to assess short-term costs of maternal overweight, gestational diabetes, and related macrosomia – a pilot evaluation (United States)

    Lenoir-Wijnkoop, Irene; van der Beek, Eline M.; Garssen, Johan; Nuijten, Mark J. C.; Uauy, Ricardo D.


    Background: Despite the interest in the impact of overweight and obesity on public health, little is known about the social and economic impact of being born large for gestational age or macrosomic. Both conditions are related to maternal obesity and/or gestational diabetes mellitus (GDM) and associated with increased morbidity for mother and child in the perinatal period. Poorly controlled diabetes during pregnancy, pre-pregnancy maternal obesity and/or excessive maternal weight gain during pregnancy are associated with intermittent periods of fetal exposure to hyperglycemia and subsequent hyperinsulinemia, leading to increased birth weight (e.g., macrosomia), body adiposity, and glycogen storage in the liver. Macrosomia is associated with an increased risk of developing obesity and type 2 diabetes mellitus later in life. Objective: Provide insight into the short-term health-economic impact of maternal overweight, GDM, and related macrosomia. To this end, a health economic framework was designed. This pilot study also aims to encourage further health technology assessments, based on country- and population-specific data. Results: The estimation of the direct health-economic burden of maternal overweight, GDM and related macrosomia indicates that associated healthcare expenditures are substantial. The calculation of a budget impact of GDM, based on a conservative approach of our model, using USA costing data, indicates an annual cost of more than $1.8 billion without taking into account long-term consequences. Conclusion: Although overweight and obesity are a recognized concern worldwide, less attention has been given to the health economic consequences of these conditions in women of child-bearing age and their offspring. The presented outcomes underline the need for preventive management strategies and public health interventions on life style, diet and physical activity. Also, the predisposition in people of Asian ethnicity to develop diabetes emphasizes the

  13. CMAQ Model Evaluation Framework (United States)

    CMAQ is tested to establish the modeling system’s credibility in predicting pollutants such as ozone and particulate matter. Evaluation of CMAQ has been designed to assess the model’s performance for specific time periods and for specific uses.

  14. Composite Load Model Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Ning; Qiao, Hong (Amy)


    The WECC load modeling task force has dedicated its efforts over the past few years to developing a composite load model that can represent the behaviors of different end-user components. The modeling structure of the composite load model is recommended by the WECC load modeling task force. GE Energy has implemented this composite load model with a new function, CMPLDW, in its power system simulation software package, PSLF. For the last several years, Bonneville Power Administration (BPA) has taken the lead and collaborated with GE Energy to develop the new composite load model. Pacific Northwest National Laboratory (PNNL) and BPA joined forces to evaluate the CMPLDW and test its parameter settings to make sure that: • the model initializes properly, • all the parameter settings are functioning, and • the simulation results are as expected. The PNNL effort focused on testing the CMPLDW in a 4-bus system. Exhaustive testing of each parameter setting was performed to verify that each setting works. This report is a summary of the PNNL testing results and conclusions.

  15. Immunohistochemical evaluation of iron accumulation in term ...

    African Journals Online (AJOL)

    Classical immunohistochemical studies on the placenta have shown that there is a linear increase in iron storage in the placenta in the first half of a normal pregnancy, whereas these stores are decreased in the normal third-trimester placenta. Iron accumulation in term placentas of preeclamptic and normal pregnancies was ...

  16. Term structure modeling and asymptotic long rate

    NARCIS (Netherlands)

    Yao, Y.


    This paper examines the dynamics of the asymptotic long rate in three classes of term structure models. It shows that, in a frictionless and arbitrage-free market, the asymptotic long rate is a non-decreasing process. This gives an alternative proof of the same result of Dybvig et al. (Dybvig, P.H.,

  17. Discrete choice models with multiplicative error terms

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Bierlaire, Michel


    The conditional indirect utility of many random utility maximization (RUM) discrete choice models is specified as a sum of an index V depending on observables and an independent random term ε. In general, the universe of RUM consistent models is much larger, even fixing some specification of V due...... differences. We develop some properties of this type of model and show that in several cases the change from an additive to a multiplicative formulation, maintaining a specification of V, may lead to a large improvement in fit, sometimes larger than that gained from introducing random coefficients in V....

  18. Using a full annual cycle model to evaluate long-term population viability of the conservation-reliant Kirtland's warbler after successful recovery (United States)

    Donald J. Brown; Christine A. Ribic; Deahn M. Donner; Mark D. Nelson; Carol I. Bocetti; Christie M. Deloria-Sheffield; Des Thompson


    Long-term management planning for conservation-reliant migratory songbirds is particularly challenging because habitat quality in different stages and geographic locations of the annual cycle can have direct and carry-over effects that influence the population dynamics. The Neotropical migratory songbird Kirtland's warbler Setophaga kirtlandii...

  19. Mid-term evaluation of ten National Research schools

    DEFF Research Database (Denmark)

    Gustafsson, Göran; Dahl, Hanne Marlene; Gustafsson, Christina

    grant applications, monitoring the progress of the FORSKERSKOLER scheme and serving as the evaluation panel for the mid-term evaluation in 2013 and in 2016/2017. The task of the evaluation panel has been to: 1) evaluate the quality of and progress achieved by the ten research schools which were awarded...

  20. Sparse model selection via integral terms (United States)

    Schaeffer, Hayden; McCalla, Scott G.


    Model selection and parameter estimation are important for the effective integration of experimental data, scientific theory, and precise simulations. In this work, we develop a learning approach for the selection and identification of a dynamical system directly from noisy data. The learning is performed by extracting a small subset of important features from an overdetermined set of possible features using a nonconvex sparse regression model. The sparse regression model is constructed to fit the noisy data to the trajectory of the dynamical system while using the smallest number of active terms. Computational experiments detail the model's stability, robustness to noise, and recovery accuracy. Examples include nonlinear equations, population dynamics, chaotic systems, and fast-slow systems.
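    The thresholded sparse-regression loop described above can be sketched in a few lines. This is an illustrative sequential thresholded least-squares variant (one common hard-thresholding approach to sparse model selection, not necessarily the authors' exact algorithm); the candidate feature library, noise level, and threshold below are invented for the demonstration:

```python
import random

def lstsq(A, b):
    """Solve the normal equations A^T A x = A^T b by Gaussian elimination."""
    n = len(A[0])
    M = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(n)]
         for i in range(n)]
    v = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(n)]
    for i in range(n):  # forward elimination with partial pivoting
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
            v[r] -= f * v[i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (v[i] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def stls(A, b, threshold=0.1, iters=10):
    """Sequentially thresholded least squares: fit, zero small coefficients, refit."""
    n = len(A[0])
    active = list(range(n))
    coef = [0.0] * n
    for _ in range(iters):
        if not active:
            break
        sub = [[row[j] for j in active] for row in A]
        sol = lstsq(sub, b)
        coef = [0.0] * n
        for j, c in zip(active, sol):
            coef[j] = c
        active = [j for j in range(n) if abs(coef[j]) > threshold]
    return coef

# Hypothetical data: dx/dt = -2x sampled with small noise,
# candidate features [1, x, x^2]; only the x term should survive.
random.seed(0)
xs = [(-1 + 0.1 * i) for i in range(21)]
A = [[1.0, x, x * x] for x in xs]
b = [-2.0 * x + random.gauss(0, 0.01) for x in xs]
coef = stls(A, b)
print([round(c, 2) for c in coef])
```

    Zeroing small coefficients and refitting on the surviving features keeps the active term count small, which is the essence of the sparse-selection idea; the loop usually stabilizes after a few passes.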

  1. Toward Standardizing a Lexicon of Infectious Disease Modeling Terms (United States)

    Milwid, Rachael; Steriu, Andreea; Arino, Julien; Heffernan, Jane; Hyder, Ayaz; Schanzer, Dena; Gardner, Emma; Haworth-Brockman, Margaret; Isfeld-Kiely, Harpa; Langley, Joanne M.; Moghadas, Seyed M.


    Disease modeling is increasingly being used to evaluate the effect of health intervention strategies, particularly for infectious diseases. However, the utility and application of such models are hampered by the inconsistent use of infectious disease modeling terms between and within disciplines. We sought to standardize the lexicon of infectious disease modeling terms and develop a glossary of terms commonly used in describing models’ assumptions, parameters, variables, and outcomes. We combined a comprehensive literature review of relevant terms with an online forum discussion in a virtual community of practice, mod4PH (Modeling for Public Health). Using a convergent discussion process and consensus amongst the members of mod4PH, a glossary of terms was developed as an online resource. We anticipate that the glossary will improve inter- and intradisciplinary communication and will result in a greater uptake and understanding of disease modeling outcomes in health policy decision-making. We highlight the role of the mod4PH community of practice and the methodologies used in this endeavor to link theory, policy, and practice in the public health domain. PMID:27734014

  2. Model of long-term seismogenesis

    Directory of Open Access Journals (Sweden)

    D. Rhoades


    A three-stage faulting model explains the observed quantitative relations between long-term precursory seismicity, mainshocks and aftershocks. Seismogenesis starts with the formation of a major crack, culminates in the corresponding major fracture and earthquake, and ends with healing. Crack formation is a self-organised critical phenomenon, and shear fracture is a delayed sequel to crack formation. It is postulated that the major crack generates a set of minor cracks, just as, later, the major fracture generates a set of minor fractures. Fracturing of the minor cracks raises the average seismicity level. By Mogi’s uniformity criterion, the major earthquake is delayed until the minor fractures have healed and the stress-field has regained relative uniformity. In accord with the scaling principle, the model applies at all magnitude levels. The size of any given initial crack determines the scale of the ensuing seismogenic process. A graphical technique of cumulative magnitude analysis gives a quantitative representation of the seismicity aspects of the model. Examples are given for large earthquakes in a region of continental collision and a subduction region. The principle of hierarchy is exemplified by the seismogenesis of a M 5.9 mainshock occurring entirely within the precursory stage of a M 7.0 mainshock. The model is capable of accommodating a variety of proposed shorter-term precursory phenomena.

  3. Evaluation of Accounting Education Offered in Formal Education in Turkey in Terms of Infrastructure and Human Standards -A Model Practice in Erzurum-


    KARCIOGLU, Resat


    With the new Turkish Trade Act, which was introduced in 2011 in Turkey, the organisation of accounting records and financial statements must be based on the International Accounting and Financial Reporting Standards. These standards are constituted and accepted on a global scale. In order to put the standards into practice and make them the foundation for sound evaluations, it is very important to organise accounting records and financial statements according to these standards. Ensuring compliance with standa...

  4. Short term load forecasting: two stage modelling

    Directory of Open Access Journals (Sweden)

    SOARES, L. J.


    This paper studies the hourly electricity load demand in the area covered by a utility situated in Seattle, USA, called Puget Sound Power and Light Company. Our proposal is tested on the well-known dataset from this company. We propose a stochastic model which employs ANN (Artificial Neural Networks) to model short-run dynamics and the dependence among adjacent hours. The proposed model treats each hour's load separately as an individual single series. This approach avoids modeling the intricate intra-day pattern (load profile) displayed by the load, which varies throughout days of the week and seasons. The forecasting performance of the model is compared with that of the TLSAR (Two-Level Seasonal Autoregressive) model proposed by Soares (2003), using the years of 1995 and 1996 as the holdout sample. Moreover, we conclude that nonlinearity is present in some series of these data. The model results are analyzed. The experiment shows that our tool can be used to produce load forecasts in places with tropical climates.
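    The core modelling idea, fitting each hour of the day as its own series so the intra-day profile never has to be modelled explicitly, can be shown with a toy sketch. Plain lag-1 linear models stand in for the paper's neural networks, and the load numbers are invented:

```python
def fit_ar1(series):
    """Least-squares fit of x_d ~ a + b * x_{d-1} (closed form)."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical hourly loads over 10 days: hour-specific level plus a
# day-to-day trend; one independent model per hour of the day.
days, hours = 10, 4   # 4 "hours" keeps the illustration small
level = [50, 80, 100, 70]
loads = [[level[h] + 2 * d for d in range(days)] for h in range(hours)]
models = [fit_ar1(loads[h]) for h in range(hours)]
# one-step forecast for the next day, hour by hour
fcst = [a + b * loads[h][-1] for h, (a, b) in enumerate(models)]
print([round(f, 1) for f in fcst])
```

    Because each hour is modelled on its own, the shape of the daily load curve never appears as a parameter; it is implicit in the per-hour levels.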

  5. A suggested method for dispersion model evaluation. (United States)

    Irwin, John S


    Too often operational atmospheric dispersion models are evaluated in their ability to replicate short-term concentration maxima, when in fact a valid model evaluation procedure would evaluate a model's ability to replicate ensemble-average patterns in hourly concentration values. A valid model evaluation includes two basic tasks: In Step 1 we should analyze the observations to provide average patterns for comparison with modeled patterns, and in Step 2 we should account for the uncertainties inherent in Step 1 so we can tell whether differences seen in a comparison of performance of several models are statistically significant. Using comparisons of model simulation results from AERMOD and ISCST3 with tracer concentration values collected during the EPRI Kincaid experiment, a candidate model evaluation procedure is demonstrated that assesses whether a model has the correct total mass at the receptor level (crosswind integrated concentration values) and whether a model is correctly spreading the mass laterally (lateral dispersion), and assesses the uncertainty in characterizing the transport. The use of the BOOT software (preferably using the ASTM D6589 resampling procedure) is suggested to provide an objective assessment of whether differences in model performance between models are significant. Regulatory agencies can choose to treat modeling results as "pseudo-monitors," but air quality models actually only predict what they are constructed to predict, which certainly does not include the stochastic variations that result in observed short-term maxima (e.g., arc-maxima). Models predict the average concentration pattern of a collection of hours having very similar dispersive conditions. An easy-to-implement evaluation procedure is presented that challenges a model to properly estimate ensemble average concentration values, reveals where to look in a model to remove bias, and provides statistical tests to assess the significance of skill differences seen between
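    The resampling step that BOOT-style software performs can be sketched as a simple percentile bootstrap on paired model errors. This is a generic illustration of the idea, not the ASTM D6589 procedure itself, and the residuals below are invented:

```python
import random

def bootstrap_diff(err_a, err_b, n_boot=2000, seed=1):
    """Percentile-bootstrap confidence interval for the difference in mean
    absolute error between two models evaluated on the same set of cases."""
    rng = random.Random(seed)
    n = len(err_a)
    diffs = []
    for _ in range(n_boot):
        sample = [rng.randrange(n) for _ in range(n)]  # resample cases with replacement
        da = sum(abs(err_a[i]) for i in sample) / n
        db = sum(abs(err_b[i]) for i in sample) / n
        diffs.append(da - db)
    diffs.sort()
    return diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot)]

# Hypothetical paired residuals (model minus observation) for two models
# on the same 12 cases; model B is clearly worse than model A.
err_a = [0.2, -0.1, 0.3, -0.2, 0.1, 0.0, -0.3, 0.2, -0.1, 0.1, 0.2, -0.2]
err_b = [1.1, -0.9, 1.2, -1.0, 0.8, 1.3, -1.1, 0.9, -1.2, 1.0, 1.1, -0.8]
lo, hi = bootstrap_diff(err_a, err_b)
print(lo, hi)  # an interval excluding zero indicates a significant skill difference
```

    Resampling the paired cases (rather than each model independently) preserves the correlation between the two models' errors, which is what makes the significance statement about their *difference* valid.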

  6. Evaluating topic models with stability

    CSIR Research Space (South Africa)

    De Waal, A


    Full Text Available Topic models are unsupervised techniques that extract likely topics from text corpora, by creating probabilistic word-topic and topic-document associations. Evaluation of topic models is a challenge because (a) topic models are often employed...

  7. Block term decomposition for modelling epileptic seizures (United States)

    Hunyadi, Borbála; Camps, Daan; Sorber, Laurent; Paesschen, Wim Van; Vos, Maarten De; Huffel, Sabine Van; Lathauwer, Lieven De


    Recordings of neural activity, such as EEG, are an inherent mixture of different ongoing brain processes as well as artefacts and are typically characterised by low signal-to-noise ratio. Moreover, EEG datasets are often inherently multidimensional, comprising information in time, along different channels, subjects, trials, etc. Additional information may be conveyed by expanding the signal into even more dimensions, e.g. incorporating spectral features by applying the wavelet transform. The underlying sources might show differences in each of these modes. Therefore, tensor-based blind source separation techniques which can extract the sources of interest from such multiway arrays, simultaneously exploiting the signal characteristics in all dimensions, have gained increasing interest. Canonical polyadic decomposition (CPD) has been successfully used to extract epileptic seizure activity from wavelet-transformed EEG data (Bioinformatics 23(13):i10-i18, 2007; NeuroImage 37:844-854, 2007), where each source is described by a rank-1 tensor, i.e. by the combination of one particular temporal, spectral and spatial signature. However, in certain scenarios, where the seizure pattern is nonstationary, such a trilinear signal model is insufficient. Here, we present the application of a recently introduced technique, called block term decomposition (BTD), to separate EEG tensors into rank-(Lr, Lr, 1) terms, allowing more variability in the data to be modelled than is possible with CPD. In a simulation study, we investigate the robustness of BTD against noise and different choices of model parameters. Furthermore, we show various real EEG recordings where BTD outperforms CPD in capturing complex seizure characteristics.

  8. Bayesian statistical approaches to evaluating cognitive models. (United States)

    Annis, Jeffrey; Palmeri, Thomas J


    Cognitive models aim to explain complex human behavior in terms of hypothesized mechanisms of the mind. These mechanisms can be formalized in terms of mathematical structures containing parameters that are theoretically meaningful. For example, in the case of perceptual decision making, model parameters might correspond to theoretical constructs like response bias, evidence quality, response caution, and the like. Formal cognitive models go beyond verbal models in that cognitive mechanisms are instantiated in terms of mathematics and they go beyond statistical models in that cognitive model parameters are psychologically interpretable. We explore three key elements used to formally evaluate cognitive models: parameter estimation, model prediction, and model selection. We compare and contrast traditional approaches with Bayesian statistical approaches to performing each of these three elements. Traditional approaches rely on an array of seemingly ad hoc techniques, whereas Bayesian statistical approaches rely on a single, principled, internally consistent system. We illustrate the Bayesian statistical approach to evaluating cognitive models using a running example of the Linear Ballistic Accumulator model of decision making (Brown SD, Heathcote A. The simplest complete model of choice response time: linear ballistic accumulation. Cogn Psychol 2008, 57:153-178). This article is categorized under: Neuroscience > Computation; Psychology > Reasoning and Decision Making; Psychology > Theory and Methods. © 2017 Wiley Periodicals, Inc.
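    The Bayesian parameter-estimation element discussed above can be illustrated with the simplest possible machinery: a grid approximation to the posterior over a single parameter. The parameter here is a hypothetical response-accuracy probability with a uniform prior, not a Linear Ballistic Accumulator fit:

```python
def grid_posterior(successes, trials, grid_size=101):
    """Grid approximation to the posterior of a success probability
    under a uniform prior and a binomial likelihood."""
    grid = [i / (grid_size - 1) for i in range(grid_size)]
    like = [p ** successes * (1 - p) ** (trials - successes) for p in grid]
    total = sum(like)
    post = [l / total for l in like]  # normalize so the posterior sums to 1
    return grid, post

# Hypothetical choice data: 70 "correct" responses out of 100 trials.
grid, post = grid_posterior(70, 100)
mean = sum(p * w for p, w in zip(grid, post))
print(round(mean, 2))
```

    The same normalize-a-likelihood-times-prior recipe underlies the more elaborate samplers used for multi-parameter cognitive models; the grid version just makes the mechanics visible.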

  9. Tools for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.


    Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.

  10. Evaluation of Short Term Memory Span Function In Children

    Directory of Open Access Journals (Sweden)

    Barış ERGÜL


    Information is encoded in short-term memory, where it is stored temporarily, before being recorded in working memory at the next stage. Repeating information mentally makes it remain in memory for a longer time. Studies investigating the relationship between short-term memory and reading skills have examined the relationship between short-term memory processes and reading comprehension. In this study, the factors affecting short-term memory span are investigated with a regression model. The aim of the research is to examine the factors (age, IQ and reading skills) that are expected to have an effect on short-term memory in children through regression analysis. One of the assumptions of regression analysis is that the error term has constant variance and a normal distribution. In this study, because the error term was not normally distributed, robust regression techniques were applied, and the coefficient of determination was obtained for each technique. According to the findings, increases in age, IQ and reading skills were associated with increases in short-term memory span in children. Among the robust regression techniques applied, the Winsorized Least Squares (WLS) technique gives the highest coefficient of determination.
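    As a rough illustration of the winsorizing idea behind the WLS technique, the sketch below clips extreme response values to the cut-off quantiles before an ordinary least-squares fit. This is a simplified stand-in for the study's procedure, and the age and memory-span numbers are invented:

```python
def winsorize(values, p=0.10):
    """Clamp the lowest and highest p-fraction of values to the cut-off quantiles."""
    s = sorted(values)
    k = int(p * len(s))
    lo, hi = s[k], s[-k - 1]
    return [min(max(v, lo), hi) for v in values]

def ols(x, y):
    """Simple least-squares line y = a + b*x (closed form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical scores: memory span grows with age, with two gross outliers.
age = [6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
span = [3.1, 3.4, 3.8, 4.1, 9.5, 4.6, 5.0, 0.2, 5.6, 5.9]
a_raw, b_raw = ols(age, span)
a_w, b_w = ols(age, winsorize(span))
print(round(b_raw, 2), round(b_w, 2))  # winsorizing moves the slope toward the trend
```

    Clipping rather than deleting the outliers keeps the sample size and pairing intact, which is why winsorized estimators fit naturally into a regression framework.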

  11. Evaluation of released source terms from burning mock combustible waste

    Energy Technology Data Exchange (ETDEWEB)

    Abe, Hitoshi; Watanabe, Koji; Tashiro, Shinsuke; Takada, Junichi; Uchiyama, Gunzo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment


    To evaluate quantitatively the confinement capability for radioactive materials in a nuclear fuel facility under a fire accident, analysis of the accident sequence, including the clogging characteristics of the ventilation filters, needs to be performed. For this purpose, accumulation of source term data such as the release rates of smoke and energy, and the particle size distribution of the smoke during the fire accident, is necessary. Therefore, experiments for evaluating the burning characteristics of combustible solid wastes and recovered solvents, which are disposed from the facilities, have been performed by using mock combustible wastes, and a method for estimating the source terms has been investigated. When mixtures of rubber and cloth gloves were burnt as mock combustible solid wastes, smoke particles above 1 µm in diameter were confined in the carbonized residue of the cloth gloves, and the release ratio of the smoke was decreased compared with the burning of rubber gloves alone. The source terms were evaluated with the cell ventilation system safety analysis code CELVA-1D by using the experimental results as input, such as the temperature of the gas phase, total burnt weight and total collected weight of the smoke during the burning of rubber gloves as mock wastes. The source terms calculated by CELVA-1D reasonably agreed with the values estimated from the recommended calculation parameters in the Nuclear Fuel Cycle Facility Accident Analysis Handbook (NUREG-1320). Therefore, the present CELVA-1D method for evaluating the source terms during burning is considered to be valid. This means that the source terms can be estimated by using this method if information such as the temperature of the gas phase, total burnt weight and total collected weight of the smoke is given. (author)

  12. The evaluation of pastures and grazing management in terms of ...

    African Journals Online (AJOL)

    Grazing research in South Africa has been largely pasture oriented and consequently there is still a need to fully evaluate many of our more important pasture types and grazing management practices in terms of livestock production so that efficient pasture-based feeding systems can be constructed. In order to do this it is ...

  13. Evaluating Extensions to Coherent Mortality Forecasting Models

    Directory of Open Access Journals (Sweden)

    Syazreen Shair


    Coherent models were developed recently to forecast the mortality of two or more sub-populations simultaneously and to ensure long-term non-divergent mortality forecasts of sub-populations. This paper evaluates the forecast accuracy of two recently-published coherent mortality models, the Poisson common factor and the product-ratio functional models. These models are compared to each other and the corresponding independent models, as well as the original Lee–Carter model. All models are applied to age-gender-specific mortality data for Australia and Malaysia and age-gender-ethnicity-specific data for Malaysia. The out-of-sample forecast error of log death rates, male-to-female death rate ratios and life expectancy at birth from each model are compared and examined across groups. The results show that, in terms of overall accuracy, the forecasts of both coherent models are consistently more accurate than those of the independent models for Australia and for Malaysia, but the relative performance differs by forecast horizon. Although the product-ratio functional model outperforms the Poisson common factor model for Australia, the Poisson common factor is more accurate for Malaysia. For the ethnic groups application, ethnic-coherence gives better results than gender-coherence. The results provide evidence that coherent models are preferable to independent models for forecasting sub-populations’ mortality.
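    For readers unfamiliar with the Lee–Carter structure that both coherent models extend, here is a drastically simplified fit of log m(x,t) ≈ a_x + b_x k_t, using the column-sum identity (valid when the b_x sum to one) in place of the usual SVD estimation. The mortality surface is synthetic, built from known parameters so the fit can be checked:

```python
def lee_carter(logm):
    """Simplified Lee-Carter fit: log m(x,t) ~ a_x + b_x * k_t.
    a_x: age pattern (row means); k_t: period index (column sums of the
    centred rates, which equals the true k_t when sum(b_x) = 1);
    b_x: age sensitivities (regression of centred rates on k_t)."""
    nx, nt = len(logm), len(logm[0])
    ax = [sum(row) / nt for row in logm]
    z = [[logm[x][t] - ax[x] for t in range(nt)] for x in range(nx)]
    kt = [sum(z[x][t] for x in range(nx)) for t in range(nt)]
    ss = sum(k * k for k in kt)
    bx = [sum(z[x][t] * kt[t] for t in range(nt)) / ss for x in range(nx)]
    return ax, bx, kt

# Synthetic log-rates built from a known structure (illustration only):
# a_x = [-3, -4, -5], b_x = [0.5, 0.3, 0.2] (sums to 1), k_t declining.
true_k = [4.0, 2.0, 0.0, -2.0, -4.0]
logm = [[a + b * k for k in true_k]
        for a, b in zip([-3.0, -4.0, -5.0], [0.5, 0.3, 0.2])]
ax, bx, kt = lee_carter(logm)
print([round(k, 1) for k in kt])  # recovers the declining period index k_t
```

    Coherent variants constrain the k_t (or ratio components) of several sub-populations so their forecasts cannot drift apart; the single-population skeleton above is the common starting point.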

  14. Source Term Model for Fine Particle Resuspension from Indoor Surfaces

    National Research Council Canada - National Science Library

    Kim, Yoojeong; Gidwani, Ashok; Sippola, Mark; Sohn, Chang W


    This Phase I effort developed a source term model for particle resuspension from indoor surfaces to be used as a source term boundary condition for CFD simulation of particle transport and dispersion in a building...

  15. Animal models of bronchopulmonary dysplasia. The term mouse models (United States)

    Berger, Jessica


    The etiology of bronchopulmonary dysplasia (BPD) is multifactorial, with genetics, ante- and postnatal sepsis, invasive mechanical ventilation, and exposure to hyperoxia being well described as contributing factors. Much of what is known about the pathogenesis of BPD is derived from animal models being exposed to the environmental factors noted above. This review will briefly cover the various mouse models of BPD, focusing mainly on the hyperoxia-induced lung injury models. We will also include hypoxia, hypoxia/hyperoxia, inflammation-induced, and transgenic models in room air. Attention to the stage of lung development at the timing of the initiation of the environmental insult and the duration of lung injury is critical to attempt to mimic the human disease pulmonary phenotype, both in the short term and in outcomes extending into childhood, adolescence, and adulthood. The various indexes of alveolar and vascular development as well as pulmonary function including pulmonary hypertension will be highlighted. The advantages (and limitations) of using such approaches will be discussed in the context of understanding the pathogenesis of and targeting therapeutic interventions to ameliorate human BPD. PMID:25305249

  16. Topological Terms and Phases of Sigma Models


    Thorngren, Ryan


    We study boundary conditions of topological sigma models with the goal of generalizing the concepts of anomalous symmetry and symmetry protected topological order. We find a version of 't Hooft's anomaly matching conditions on the renormalization group flow of boundaries of invertible topological sigma models and discuss several examples of anomalous boundary theories. We also comment on bulk topological transitions in dynamical sigma models and argue that one can, with care, use topological ...

  17. A retrospective evaluation of term infants treated with surfactant therapy

    Directory of Open Access Journals (Sweden)

    Özge Sürmeli-Onay


    Aim: To investigate the clinical and therapeutic characteristics and outcomes of term infants who received surfactant therapy (ST) for severe respiratory failure in our neonatal intensive care unit (NICU). Methods: The medical records of term infants (gestational age ≥ 37 0/7 weeks) who received ST between 2003-2012 in the NICU of Hacettepe University Ihsan Dogramaci Children’s Hospital were evaluated retrospectively. Results: During the ten-year period, 32 term infants received ST; the mean gestational age was 38.1 ± 0.88 wk and the mean birth weight was 2,936 ± 665 g. The underlying lung diseases were severe congenital pneumonia (CP) in 13 (40.6%), acute respiratory distress syndrome (ARDS) in 5 (15.6%), meconium aspiration syndrome (MAS) in 5 (15.6%), congenital diaphragmatic hernia (CDH) in 4 (12.5%), respiratory distress syndrome in 3 (9.4%) and pulmonary hemorrhage in 2 (6.3%) infants. The median time of the first dose of ST was 7.75 (0.5-216) hours. Pulmonary hypertension accompanied the primary lung disease in 9 (28.1%) infants. The mortality rate was 25%. Conclusion: In term infants, CP, ARDS and MAS were the main causes of respiratory failure requiring ST. However, further prospective studies are needed for defining optimal strategies of ST in term infants with respiratory failure.

  18. Investigation of Teachers' Perceptions of Organizational Citizenship Behavior and Their Evaluation in Terms of Educational Administration (United States)

    Avci, Ahmet


    The aim of this study is to investigate teachers' perceptions of organizational citizenship behaviors and to evaluate them in terms of educational administration. Descriptive survey model was used in the research. The data of the research were obtained from 1,613 teachers working in public and private schools subjected to Ministry of National…

  19. A long-term/short-term model for daily electricity prices with dynamic volatility

    Energy Technology Data Exchange (ETDEWEB)

    Schlueter, Stephan


    In this paper we introduce a new stochastic long-term/short-term model for short-term electricity prices, and apply it to four major European indices, namely the German, Dutch, UK and Nordic ones. We give evidence that all time series contain certain periodic (mostly annual) patterns, and show how to use the wavelet transform, a tool of multiresolution analysis, for filtering purposes. The wavelet transform is also applied to separate the long-term trend from the short-term oscillation in the seasonally adjusted log-prices. In all time series we find evidence for dynamic volatility, which we incorporate by using a bivariate GARCH model with constant correlation. Eventually we fit various models from the existing literature to the data, and come to the conclusion that our approach performs best. For the error distribution, the Normal Inverse Gaussian distribution shows the best fit. (author)
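    The long-term/short-term separation idea can be demonstrated with the simplest wavelet, the Haar transform: repeated pairwise averaging yields the coarse trend, and whatever it discards is the fast oscillation. The abstract does not say which wavelet the author uses, and the price series below is synthetic:

```python
def haar_step(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation / long-term part) and differences (detail / short-term part)."""
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    return approx, detail

def haar_split(signal, levels):
    """Separate a series into a coarse long-term trend and the remaining
    short-term oscillation, by repeated Haar averaging and reconstruction."""
    approx = list(signal)
    for _ in range(levels):
        approx, _ = haar_step(approx)
    # upsample the coarse approximation back to the original length
    trend = [a for a in approx for _ in range(2 ** levels)]
    osc = [s - t for s, t in zip(signal, trend)]
    return trend, osc

# Hypothetical log-price series: slow ramp plus fast alternation.
series = [i * 0.1 + (0.5 if i % 2 else -0.5) for i in range(16)]
trend, osc = haar_split(series, levels=2)
print([round(t, 2) for t in trend[:4]], [round(o, 2) for o in osc[:4]])
```

    In the paper's setting, the trend and oscillation components would then be modelled separately, with the GARCH machinery applied to the volatile short-term part.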

  20. Sequentially Executed Model Evaluation Framework

    Energy Technology Data Exchange (ETDEWEB)


    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.
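A minimal sketch of this kind of sequential input/model/output stepping, with invented class and method names (not the actual SeMe or CANARY-EDS API):

```python
# Hypothetical sketch of a sequential model-evaluation loop in the spirit
# of the framework described above; all names here are invented.
class InputDriver:
    def __init__(self, series):
        self.series = iter(series)
    def read(self):                       # deliver the next observation
        return next(self.series, None)

class Model:
    def __init__(self):
        self.state = 0.0                  # prior results carried forward
    def step(self, x):
        self.state = 0.9 * self.state + 0.1 * x   # e.g. an EWMA of the data
        return self.state

class OutputDriver:
    def __init__(self):
        self.log = []
    def write(self, y):
        self.log.append(y)

def run_batch(inp, model, out):
    """Batch controller: step input -> model -> output over the time domain."""
    while (x := inp.read()) is not None:
        out.write(model.step(x))
    return out.log

log = run_batch(InputDriver([1.0] * 50), Model(), OutputDriver())
print(len(log))  # 50
```

The point of the driver abstraction is that the controller only sees the read/step/write interface, so a real-time source can replace the batch input without touching the model.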

  1. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
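The per-layer queueing analysis can be illustrated with a standard Erlang C calculation, a sketch of the general technique rather than the paper's exact model (rates and server counts below are invented):

```python
import math

def erlang_c_metrics(lam, mu, c):
    """Mean queue delay and wait probability for an M/M/c queue (Erlang C).

    lam: arrival rate, mu: per-server service rate, c: number of servers.
    """
    a = lam / mu                       # offered load in Erlangs
    rho = a / c                        # server utilization, must be < 1
    assert rho < 1, "queue is unstable"
    s = sum(a ** k / math.factorial(k) for k in range(c))
    last = a ** c / (math.factorial(c) * (1 - rho))
    p_wait = last / (s + last)         # Erlang C probability of queueing
    wq = p_wait / (c * mu - lam)       # mean waiting time in queue
    return p_wait, wq

# Example allocation: 4 service desks at an application-layer stage.
p4, w4 = erlang_c_metrics(lam=3.0, mu=1.0, c=4)
print(round(p4, 3), round(w4, 3))
```

Evaluating such metrics for each candidate split of servers across the network, transport, and application stages is one way to search for the allocation that minimizes overall delay.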

  2. Performance Evaluation Model for Application Layer Firewalls. (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan


    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  3. Soft supersymmetry-breaking terms from supergravity and superstring models

    CERN Document Server

    Brignole, A; Muñoz, C


    We review the origin of soft supersymmetry-breaking terms in N=1 supergravity models of particle physics. We first consider general formulae for those terms in general models with a hidden sector breaking supersymmetry at an intermediate scale. The results for some simple models are given. We then consider the results obtained in some simple superstring models in which particular assumptions about the origin of supersymmetry breaking are made. These are models in which the seed of supersymmetry breaking is assumed to originate in the dilaton/moduli sector of the theory.

  4. Short-term carcinogenesis evaluation of Casearia sylvestris

    Directory of Open Access Journals (Sweden)

    Cleide A.S. Tirloni

    Casearia sylvestris Sw., Salicaceae, is an important medicinal plant widely used in Brazil for the treatment of various cardiovascular disorders. This species was included as of interest by the Brazilian Unified Health System. Although preclinical studies described cardiovascular protective effects and an apparent absence of toxicity, no studies have evaluated its carcinogenic potential. In this study, we proposed a short-term carcinogenesis evaluation of C. sylvestris in Wistar rats, aiming to check the safety of this species for use as proposed by the Brazilian Unified Health System. C. sylvestris leaves were obtained and the crude extract was prepared by maceration from methanol/water. Wistar rats were orally treated for 12 weeks with 50, 250 or 500 mg kg−1 of crude extract or vehicle. Body weight, daily morbidity and mortality were monitored. Blood and bone marrow samples were collected for the micronucleus test, comet assay and tumor marker evaluation. Vital organs were removed for macroscopic and histopathological analyses. The crude extract did not induce mutagenic and genotoxic effects and no alterations were observed in important tumor markers. Finally, no detectable signs of injury through gross pathology or histopathological examinations were observed. Our results confirm the absence of crude extract toxicity, indicating its safety, even at prolonged exposure as proposed by the Brazilian Unified Health System.

  5. The IEA Model of Short-term Energy Security

    Energy Technology Data Exchange (ETDEWEB)



    Ensuring energy security has been at the centre of the IEA mission since its inception, following the oil crises of the early 1970s. While the security of oil supplies remains important, contemporary energy security policies must address all energy sources and cover a comprehensive range of natural, economic and political risks that affect energy sources, infrastructures and services. In response to this challenge, the IEA is currently developing a Model Of Short-term Energy Security (MOSES) to evaluate the energy security risks and resilience capacities of its member countries. The current version of MOSES covers short-term security of supply for primary energy sources and secondary fuels among IEA countries. It also lays the foundation for analysis of vulnerabilities of electricity and end-use energy sectors. MOSES contains a novel approach to analysing energy security, which can be used to identify energy security priorities, as a starting point for national energy security assessments and to track the evolution of a country's energy security profile. By grouping together countries with similar 'energy security profiles', MOSES depicts the energy security landscape of IEA countries. By extending the MOSES methodology to electricity security and energy services in the future, the IEA aims to develop a comprehensive policy-relevant perspective on global energy security. This Working Paper is intended for readers who wish to explore the MOSES methodology in depth; there is also a brochure which provides an overview of the analysis and results.

  6. Infrasound Sensor Models and Evaluations

    Energy Technology Data Exchange (ETDEWEB)



    Sandia National Laboratories has continued to evaluate the performance of infrasound sensors that are candidates for use by the International Monitoring System (IMS) for the Comprehensive Nuclear-Test-Ban Treaty Organization. The performance criteria against which these sensors are assessed are specified in "Operational Manual for Infrasound Monitoring and the International Exchange of Infrasound Data". This presentation includes the results of efforts concerning two of these sensors: (1) Chaparral Physics Model 5; and (2) CEA MB2000. Sandia is working with Chaparral Physics in order to improve the capability of the Model 5 (a prototype sensor) to be calibrated and evaluated. With the assistance of the Scripps Institution of Oceanography, Sandia is also conducting tests to evaluate the performance of the CEA MB2000. Sensor models based on theoretical transfer functions and manufacturer specifications for these two devices have been developed. This presentation will feature the results of coherence-based data analysis of signals from a huddle test, utilizing several sensors of both types, in order to verify the sensor performance.

  7. A model of competition between employed, short-term and long-term unemployed job searchers

    NARCIS (Netherlands)

    Broersma, Lourens


    This paper presents a model in which not only employed job search is endogenized, but also the phenomenon that the long-term unemployed may become discouraged and stop searching for a job. When this model is applied to Dutch flow data, we find that this discouragement particularly took place in the early

  8. Medium term hurricane catastrophe models: a validation experiment (United States)

    Bonazzi, Alessandro; Turner, Jessica; Dobbin, Alison; Wilson, Paul; Mitas, Christos; Bellone, Enrica


    Climate variability is a major source of uncertainty for the insurance industry underwriting hurricane risk. Catastrophe models provide their users with a stochastic set of events that expands the scope of the historical catalogue by including synthetic events that are likely to happen in a defined time-frame. The use of these catastrophe models is widespread in the insurance industry but it is only in recent years that climate variability has been explicitly accounted for. In the insurance parlance "medium term catastrophe model" refers to products that provide an adjusted view of risk that is meant to represent hurricane activity on a 1 to 5 year horizon, as opposed to long term models that integrate across the climate variability of the longest available time series of observations. In this presentation we discuss how a simple reinsurance program can be used to assess the value of medium term catastrophe models. We elaborate on similar concepts as discussed in "Potential Economic Value of Seasonal Hurricane Forecasts" by Emanuel et al. (2012, WCAS) and provide an example based on 24 years of historical data of the Chicago Mercantile Hurricane Index (CHI), an insured loss proxy. Profit and loss volatility of a hypothetical primary insurer are used to score medium term models versus their long term counterpart. Results show that medium term catastrophe models could help a hypothetical primary insurer to improve their financial resiliency to varying climate conditions.

  9. Pathologic evaluation of normal and perfused term placental tissue

    DEFF Research Database (Denmark)

    Maroun, Lisa Leth; Mathiesen, Line; Hedegaard, Morten


    This study reports for the 1st time the incidence and interobserver variation of morphologic findings in a series of 34 term placentas from pregnancies with normal outcome used for perfusion studies. Histologic evaluation of placental tissue is challenging, especially when it comes to defining...... and selected findings were tested against success parameters from the perfusions. Finally, the criteria for frequent lesions with fair to poor interobserver variation in the nonperfused tissue were revised and reanalyzed. In the perfused tissue, the perfusion artefact "trophoblastic vacuolization," which...... with addition of antibiotics to the medium. In the "normal" tissue, certain lesions were very frequent and showed only fair or poor interobserver agreement. Revised minimum criteria for these lesions were defined and found reproducible. This study has emphasized the value of pathologic examination...

  10. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.


    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  11. The cointegrated vector autoregressive model with general deterministic terms

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    In the cointegrated vector autoregression (CVAR) literature, deterministic terms have until now been analyzed on a case-by-case, or as-needed basis. We give a comprehensive unified treatment of deterministic terms in the additive model X(t)= Z(t) + Y(t), where Z(t) belongs to a large class...... of deterministic regressors and Y(t) is a zero-mean CVAR. We suggest an extended model that can be estimated by reduced rank regression and give a condition for when the additive and extended models are asymptotically equivalent, as well as an algorithm for deriving the additive model parameters from the extended...... model parameters. We derive asymptotic properties of the maximum likelihood estimators and discuss tests for rank and tests on the deterministic terms. In particular, we give conditions under which the estimators are asymptotically (mixed) Gaussian, such that associated tests are chi-squared distributed....

  12. A Neural Network Model of the Visual Short-Term Memory

    DEFF Research Database (Denmark)

    Petersen, Anders; Kyllingsbæk, Søren; Hansen, Lars Kai


    In this paper a neural network model of Visual Short-Term Memory (VSTM) is presented. The model links closely with Bundesen’s (1990) well-established mathematical theory of visual attention. We evaluate the model’s ability to fit experimental data from a classical whole and partial report study...

  13. Model for expressing leaf photosynthesis in terms of weather variables

    African Journals Online (AJOL)

    A theoretical mathematical model for describing photosynthesis in individual leaves in terms of weather variables is proposed. The model utilizes a series of efficiency parameters, each of which reflects the fraction of potential photosynthetic rate permitted by the different environmental elements. These parameters are useful ...

  14. Simple model for crop photosynthesis in terms of weather variables ...

    African Journals Online (AJOL)

    A theoretical mathematical model for describing crop photosynthetic rate in terms of the weather variables and crop characteristics is proposed. The model utilizes a series of efficiency parameters, each of which reflects the fraction of possible photosynthetic rate permitted by the different weather elements or crop architecture.
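The multiplicative efficiency-parameter structure described in these two entries can be sketched as a potential rate times a product of fractional efficiencies (a minimal illustration; parameter names and values are invented):

```python
def crop_photosynthesis(p_max, efficiencies):
    """Photosynthetic rate as the potential rate times a product of
    efficiency fractions, one per weather element or crop trait, following
    the multiplicative structure described above. Values are illustrative."""
    rate = p_max
    for name, e in efficiencies.items():
        assert 0.0 <= e <= 1.0, f"{name} must be a fraction in [0, 1]"
        rate *= e
    return rate

rate = crop_photosynthesis(
    p_max=40.0,                       # potential rate, e.g. umol CO2 m-2 s-1
    efficiencies={"light": 0.8, "temperature": 0.9, "water": 0.7},
)
print(rate)  # 40 * 0.8 * 0.9 * 0.7
```

With this structure, each environmental element can only reduce the rate, and the most limiting factors dominate the product.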

  15. A Team Building Model for Software Engineering Courses Term Projects (United States)

    Sahin, Yasar Guneri


    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  16. Exploring Term Dependences in Probabilistic Information Retrieval Model. (United States)

    Cho, Bong-Hyun; Lee, Changki; Lee, Gary Geunbae


    Describes a theoretic process to apply Bahadur-Lazarsfeld expansion (BLE) to general probabilistic models and the state-of-the-art 2-Poisson model. Through experiments on two standard document collections, one in Korean and one in English, it is demonstrated that incorporation of term dependences using BLE significantly contributes to performance…

  17. A Polynomial Term Structure Model with Macroeconomic Variables

    Directory of Open Access Journals (Sweden)

    José Valentim Vicente


    Recently, a myriad of factor models including macroeconomic variables have been proposed to analyze the yield curve. We present an alternative factor model where term structure movements are captured by Legendre polynomials mimicking the statistical factor movements identified by Litterman and Scheinkman (1991). We estimate the model with Brazilian Foreign Exchange Coupon data, adopting a Kalman filter, under two versions: the first uses only latent factors and the second includes macroeconomic variables. We study its ability to predict out-of-sample term structure movements, when compared to a random walk. We also discuss results on the impulse response function of macroeconomic variables.
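The Legendre-polynomial factor loadings can be sketched with NumPy by mapping maturities onto the polynomials' natural domain [-1, 1] (maturities and factor values below are invented, not the Brazilian FX coupon data):

```python
import numpy as np
from numpy.polynomial import legendre

# Legendre loadings for term-structure factors: P0, P1, P2 play the roles
# of level, slope, and curvature.
maturities = np.array([0.25, 0.5, 1.0, 2.0, 5.0, 10.0])   # years (illustrative)
x = 2.0 * (maturities - maturities.min()) / (maturities.max() - maturities.min()) - 1.0

loadings = np.column_stack([
    legendre.legval(x, [1]),        # P0: level
    legendre.legval(x, [0, 1]),     # P1: slope
    legendre.legval(x, [0, 0, 1]),  # P2: curvature
])

betas = np.array([0.12, -0.02, 0.01])   # hypothetical factor values
yields = loadings @ betas               # model-implied yields per maturity
print(yields.shape)  # (6,)
```

In a state-space estimation, the loadings matrix stays fixed while the factor values evolve over time and are filtered from observed yields.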

  18. Evaluation Model for Sentient Cities

    Directory of Open Access Journals (Sweden)

    Mª Florencia Fergnani Brion


    In this article we present research on Sentient Cities and propose an assessment model to analyse whether a city is, or could potentially be, considered one. It can be used to evaluate the current situation of a city before introducing urban policies based on citizen participation in hybrid environments (physical and digital). To that effect, we have developed evaluation grids with the main elements that form a Sentient City and their measurement values. The Sentient City is a variation of the Smart City, also based on technological progress and innovation, but where citizens are the principal agent. In this model, governments aim to have a participatory and sustainable system for achieving the Knowledge Society, the development of Collective Intelligence, and the city’s efficiency. They also increase communication channels between the Administration and citizens. In this new context, citizens are empowered because they have the opportunity to create a Local Identity and transform their surroundings through open and horizontal initiatives.

  19. Computer simulation of microgravity long-term effects and risk evaluation (United States)

    Perez-Poch, Antoni

    The objective of this work is to analyse and simulate possible long-term effects of microgravity on the human pulmonary function. We also study the efficacy of long-term regular exercise on relevant cardiovascular parameters when the human body is exposed to microgravity. Little is known today about what long-term effects microgravity may cause on pulmonary function. There is no complete explanation of the possible risks involved, although some experiments are under way on the ISS in order to evaluate them. Computer simulations are an important tool which may be used to predict and analyse these possible effects, and compare them with in-flight experiments. We based our study on a previous computer model (NELME: Numerical Evaluation of Long-term Microgravity Effects), which was developed in our laboratory and validated with the available data, focusing on the cardiovascular parameters affected by changes in gravity exposure. In previous work we simulated part of the cardiovascular system and applied it to evaluate risks of malfunction of the blood-forming organs. NELME is based on an electrical-like control-system model of the physiological changes that may occur when gravity changes are applied. The computer implementation has a modular architecture; hence, different output parameters, potential effects, organs and countermeasures can be easily implemented and evaluated. In this work we added a module to the system to analyse pulmonary function, with gravity and exposure time as input parameters. We then conducted a battery of simulations applying different values of g for long-term exposures. We found no significant evidence of changes, and no risks were foreseen. We also carried out an EVA simulation as a perturbation of the system (intense exercise, changes in breathed air) and studied the acute response. This is of great importance, as current mission requirements do not allow data collection immediately following real EVAs.
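The modular, electrical-like control-system structure described above can be illustrated with a minimal sketch. Everything here is invented for illustration (the module, gain, and time constant are not values from NELME): a single first-order block responding to a step change in effective gravity.

```python
# Illustrative sketch only: one first-order "electrical-like" block of the
# kind a modular physiological model might chain together.
def simulate_module(g_level, tau=10.0, gain=1.0, dt=0.1, t_end=100.0):
    """Euler integration of dx/dt = (gain * g_level - x) / tau,
    i.e. an RC-style response of a physiological variable to a step
    change in effective gravity g_level (in units of g)."""
    x, t, out = 0.0, 0.0, []
    while t < t_end:
        x += dt * (gain * g_level - x) / tau   # Euler step toward equilibrium
        t += dt
        out.append(x)
    return out

micro = simulate_module(g_level=0.0)    # microgravity: stays at baseline 0
ground = simulate_module(g_level=1.0)   # 1 g: settles toward the gain
print(round(ground[-1], 3))
```

In a modular architecture, blocks like this are composed so that the output of one (e.g. a cardiovascular variable) feeds the input of another (e.g. a pulmonary variable), and countermeasures enter as additional inputs.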

  20. A Parametric Factor Model of the Term Structure of Mortality

    DEFF Research Database (Denmark)

    Haldrup, Niels; Rosenskjold, Carsten Paysen T.

    The prototypical Lee-Carter mortality model is characterized by a single common time factor that loads differently across age groups. In this paper we propose a factor model for the term structure of mortality where multiple factors are designed to influence the age groups differently via...... procedure based on cross-section regressions together with a separate model to estimate the dynamics of the factors. Second, we suggest a fully specified model estimated by maximum likelihood via the Kalman filter recursions after the model is put on state space form. We demonstrate the methodology for US...... on the loading functions, the factors are not designed to be orthogonal but can be dependent and can possibly cointegrate when the factors have unit roots. We suggest two estimation procedures similar to the estimation of the dynamic Nelson-Siegel term structure model. First, a two-step nonlinear least squares...
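For context, the prototypical Lee-Carter structure that this factor model generalizes, log m(x,t) ≈ a(x) + b(x)k(t), can be fitted by SVD. A self-contained sketch on synthetic data (all values invented):

```python
import numpy as np

# Lee-Carter fit by SVD: log mortality ~ a(x) + b(x) * k(t).
rng = np.random.default_rng(1)
ages, years = 10, 40
a_true = np.linspace(-8.0, -2.0, ages)              # age profile
b_true = np.linspace(0.05, 0.15, ages)              # age loadings
k_true = np.linspace(5.0, -5.0, years)              # declining time index
log_m = a_true[:, None] + b_true[:, None] * k_true[None, :]
log_m += rng.normal(scale=0.01, size=log_m.shape)   # observation noise

a_hat = log_m.mean(axis=1)                          # a(x): row means
U, s, Vt = np.linalg.svd(log_m - a_hat[:, None], full_matrices=False)
b_hat = U[:, 0] / U[:, 0].sum()                     # normalize sum(b) = 1
k_hat = s[0] * Vt[0] * U[:, 0].sum()                # keeps b*k unchanged

recon = a_hat[:, None] + b_hat[:, None] * k_hat[None, :]
print(recon.shape)  # (10, 40)
```

The multi-factor term-structure extension in the entry above replaces the single b(x)k(t) product with several factors whose loadings are parametric functions of age.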

  1. Creating a Long-Term Diabetic Rabbit Model

    Directory of Open Access Journals (Sweden)

    Jianpu Wang


    This study aimed to create a long-term rabbit model of diabetes mellitus for medical studies of up to one year or longer and to evaluate the effects of chronic hyperglycemia on damage to major organs. A single dose of alloxan monohydrate (100 mg/kg) was given intravenously to 20 young New Zealand White rabbits. Another 12 age-matched normal rabbits were used as controls. Hyperglycemia developed within 48 hours after treatment with alloxan. Insulin was given daily after diabetes developed. All animals gained some body weight, but the gain was much less than in the age-matched nondiabetic rabbits. Hyperlipidemia and higher blood urea nitrogen and creatinine were found in the diabetic animals. Histologically, the pancreas showed marked beta cell damage. The kidneys showed significantly thickened afferent glomerular arterioles with narrowed lumens along with glomerular atrophy. Lipid accumulation in the cytoplasm of hepatocytes appeared as vacuoles. Full-thickness skin wound healing was delayed. In summary, with careful management, alloxan-induced diabetic rabbits can be maintained for one year or longer in reasonably good health for diabetic studies.

  2. Short-Termed Integrated Forecasting System: 1993 Model documentation report

    Energy Technology Data Exchange (ETDEWEB)


    The purpose of this report is to define the Short-Term Integrated Forecasting System (STIFS) and describe its basic properties. The Energy Information Administration (EIA) of the US Department of Energy (DOE) developed the STIFS model to generate short-term (up to 8 quarters), monthly forecasts of US supplies, demands, imports, exports, stocks, and prices of various forms of energy. The models that constitute STIFS generate forecasts for a wide range of possible scenarios, including the following ones done routinely on a quarterly basis: a base (mid) world oil price and medium economic growth; a low world oil price and high economic growth; a high world oil price and low economic growth. This report is written for persons who want to know how short-term energy market forecasts are produced by EIA. The report is intended as a reference document for model analysts, users, and the public.

  3. Decellularized mitral valve in a long-term sheep model. (United States)

    Iablonskii, Pavel; Cebotari, Serghei; Ciubotaru, Anatol; Sarikouch, Samir; Hoeffler, Klaus; Hilfiker, Andres; Haverich, Axel; Tudorache, Igor


    The objective of this study was to evaluate surgical handling, in vivo hemodynamic performance and morphological characteristics of decellularized mitral valves (DMVs) in a long-term sheep model. Ovine mitral valves were decellularized using detergents and β-mercaptoethanol. Orthotopic implantations were performed in 6-month-old sheep (41.3 ± 1.2 kg, n = 11) without annulus reinforcement. Commercially available stented porcine aortic valves [biological mitral valve (BMV), n = 3] were implanted conventionally and used as controls. Valve function was evaluated by transoesophageal echocardiography and explants were investigated by routine bright-field microscopy and immunofluorescent histology. During implantation, 2 DMVs required cleft closure of the anterior leaflet. All valves were competent on the water test and early postoperative transoesophageal echocardiography. Six animals (DMV, n = 4; BMV, n = 2) survived 12 months. Six animals died within the first 4 months due to valve-related complications. At 12 months, transoesophageal echocardiography revealed severe degeneration in all BMVs. Macroscopically, BMVs revealed calcification at the commissures and leaflet insertion area. Histological examination showed sporadic cells negative for endothelial nitric oxide synthase, von Willebrand factor and CD45 on their surface. In contrast, DMVs showed no calcification or stenosis, and the regurgitation was trivial to moderate in all animals. Fibrotic hardening occurred only along the suture line of the valve annulus; immunostaining revealed collagen IV covering the entire leaflet surface and a repopulation with endothelial cells. Surgical implantation of DMVs is feasible and results in good early graft function. Additional in vivo investigations are required to minimize the procedure-related complications and to increase the reproducibility of surgical implantation. The degenerative profile of allogeneic DMVs is superior to commercially available

  4. Long-term health-related and economic consequences of short-term outcomes in evaluation of perinatal interventions

    NARCIS (Netherlands)

    Teune, Margreet J.; van Wassenaer, Aleid G.; Mol, Ben Willem J.; Opmeer, Brent C.


    BACKGROUND: Many perinatal interventions are performed to improve long-term neonatal outcome. To evaluate the long-term effect of a perinatal intervention, follow-up of the child after discharge from the hospital is necessary because serious sequelae from perinatal complications

  5. Autoradiographic thyroid evaluation in short-term experimental diabetes mellitus

    Directory of Open Access Journals (Sweden)

    Nascimento-Saba C.C.A.


    Previous studies have shown that in vitro thyroid peroxidase (TPO) iodide oxidation activity is decreased and thyroid T4-5'-deiodinase activity is increased 15 days after induction of experimental diabetes mellitus (DM). In the present study we used thyroid histoautoradiography, an indirect assay of in vivo TPO activity, to determine the possible parallelism between the in vitro and in vivo changes induced by experimental DM. DM was induced in male Wistar rats (about 250 g body weight) by a single ip streptozotocin injection (45 mg/kg), while control (C) animals received a single injection of the vehicle. Seven and 30 days after diabetes induction, each diabetic and control animal was given ip a tracer dose of 125I (2 µCi) 2.5 h before thyroid excision. The glands were counted, weighed, fixed in Bouin's solution, embedded in paraffin and cut. The sections were stained with HE and exposed to NTB-2 emulsion (Kodak). The autohistograms were developed and the quantitative distribution of silver grains was evaluated with a computerized image-analyzer system. Thyroid radioiodine uptake was significantly decreased only after 30 days of DM (C: 0.38 ± 0.05 vs DM: 0.20 ± 0.04%/mg thyroid, P<0.05), while in vivo TPO activity was significantly decreased 7 and 30 days after DM induction (C: 5.3 and 4.5 grains/100 µm² vs DM: 2.9 and 1.6 grains/100 µm², respectively, P<0.05). These data suggest that insulin deficiency first reduces in vivo TPO activity during short-term experimental diabetes mellitus.

  6. Murine model of long-term obstructive jaundice. (United States)

    Aoki, Hiroaki; Aoki, Masayo; Yang, Jing; Katsuta, Eriko; Mukhopadhyay, Partha; Ramanathan, Rajesh; Woelfel, Ingrid A; Wang, Xuan; Spiegel, Sarah; Zhou, Huiping; Takabe, Kazuaki


    With the recent emergence of conjugated bile acids as signaling molecules in cancer, a murine model of obstructive jaundice by cholestasis with long-term survival is in need. Here, we investigated the characteristics of three murine models of obstructive jaundice. C57BL/6J mice were used for total ligation of the common bile duct (tCL), partial common bile duct ligation (pCL), and ligation of left and median hepatic bile duct with gallbladder removal (LMHL) models. Survival was assessed by the Kaplan-Meier method. Fibrotic change was determined by Masson-Trichrome staining and collagen expression. Overall, 70% (7 of 10) of tCL mice died by day 7, whereas the majority, 67% (10 of 15), of pCL mice survived with loss of jaundice. A total of 19% (3 of 16) of LMHL mice died; however, jaundice continued beyond day 14, with survival of more than a month. Compensatory enlargement of the right lobe was observed in both pCL and LMHL models. The pCL model demonstrated acute inflammation due to obstructive jaundice 3 d after ligation but jaundice rapidly decreased by day 7. The LMHL group developed portal hypertension and severe fibrosis by day 14 in addition to prolonged jaundice. The standard tCL model is too unstable, with high mortality, for long-term studies. pCL may be an appropriate model for acute inflammation with obstructive jaundice, but long-term survivors are no longer jaundiced. The LMHL model was identified to be the most feasible model to study the effect of long-term obstructive jaundice. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Long-Term Stability Evaluation and Pillar Design Criterion for Room-and-Pillar Mines

    Directory of Open Access Journals (Sweden)

    Yang Yu


    The collapse of abandoned room-and-pillar mines is often violent and unpredictable. Safety concerns have often resulted in mine closures with no post-mining stability evaluations. As a result, large amounts of land resources over room-and-pillar mines are wasted. This paper attempts to establish an understanding of the long-term stability issues of goafs (abandoned mines). Considering progressive pillar failures and the effect of single pillar failure on surrounding pillars, this paper proposes a pillar peeling model to evaluate the long-term stability of coal mines and the associated criteria for evaluating the long-term stability of room-and-pillar mines. The validity of the peeling model was verified by numerical simulation and by field data from 500 pillar cases from China, South Africa, and India. It is found that the damage level of pillar peeling is affected by the peel angle and pillar height and is controlled by the pillar width-height ratio.
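Pillar stability criteria of the width-height-ratio kind discussed above are commonly expressed as a factor of safety from an empirical power-law strength formula against tributary-area load. A generic sketch (using Salamon-Munro-style constants for illustration; these are not from the paper's peeling model):

```python
def pillar_factor_of_safety(w, h, depth, unit_weight=0.025, extraction=0.75,
                            k=7.2, alpha=0.46, beta=0.66):
    """Factor of safety for a single pillar: power-law strength
    S = k * w**alpha / h**beta (MPa, metres) over tributary-area stress.

    w, h: pillar width and height (m); depth: cover depth (m);
    unit_weight: overburden unit weight (MN/m^3); extraction: areal
    extraction ratio. Constants default to Salamon-Munro-style values
    for illustration only.
    """
    strength = k * w ** alpha / h ** beta
    stress = unit_weight * depth / (1.0 - extraction)   # tributary-area stress
    return strength / stress

fos = pillar_factor_of_safety(w=12.0, h=3.0, depth=100.0)
print(round(fos, 2))
```

Note how the width-height ratio enters through the strength term: widening a pillar or reducing its height raises the factor of safety, consistent with the paper's finding that the width-height ratio controls peeling damage.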

  8. Evaluating Translational Research: A Process Marker Model (United States)

    Trochim, William; Kane, Cathleen; Graham, Mark J.; Pincus, Harold A.


    Objective: We examine the concept of translational research from the perspective of evaluators charged with assessing translational efforts. One of the major tasks for evaluators involved in translational research is to help assess efforts that aim to reduce the time it takes to move research to practice and health impacts. Another is to assess efforts that are intended to increase the rate and volume of translation. Methods: We offer an alternative to the dominant contemporary tendency to define translational research in terms of a series of discrete “phases.” Results: We contend that this phased approach has been confusing and that it is insufficient as a basis for evaluation. Instead, we argue for the identification of key operational and measurable markers along a generalized process pathway from research to practice. Conclusions: This model provides a foundation for the evaluation of interventions designed to improve translational research and the integration of these findings into a field of translational studies. Clin Trans Sci 2011; Volume 4: 153–162. PMID: 21707944

  9. Risk factors and prognostic models for perinatal asphyxia at term

    NARCIS (Netherlands)

    Ensing, S.


    This thesis will focus on the risk factors and prognostic models for adverse perinatal outcome at term, with a special focus on perinatal asphyxia and obstetric interventions during labor to reduce adverse pregnancy outcomes. For the majority of the studies in this thesis we were allowed to use data

  10. Modelling the Long-term Periglacial Imprint on Mountain Landscapes

    DEFF Research Database (Denmark)

    Andersen, Jane Lund; Egholm, David Lundbek; Knudsen, Mads Faurschou

    Studies of periglacial processes usually focus on small-scale, isolated phenomena, leaving less explored questions of how such processes shape vast areas of Earth’s surface. Here we use numerical surface process modelling to better understand how periglacial processes drive large-scale, long-term...

  11. Viscous cosmological models with a variable cosmological term ...

    African Journals Online (AJOL)

    Einstein's field equations for a Friedmann-Lemaître-Robertson-Walker universe filled with a dissipative fluid and a variable cosmological term Λ, described by the full Israel-Stewart theory, are considered. General solutions to the field equations for the flat case have been obtained. The solution corresponds to the dust free model ...

  12. Term structure models : a perspective from the long rate

    NARCIS (Netherlands)

    Yao, Yong


    Term structure models resulting from dynamic asset pricing theory are discussed from the perspective of the long rate. This paper attempts to answer two questions about the long rate: in frictionless markets having no arbitrage, what should the behavior of the long rate be; and, in existing

  13. Model Performance Evaluation and Scenario Analysis (MPESA) (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).
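
    The MPESA tool itself is not reproduced here, but the kind of time-series goodness-of-fit statistics such a tool computes can be sketched in a few lines; the function choices and toy data below are illustrative, not taken from MPESA.

```python
# Illustrative sketch (not the MPESA code): two standard statistics for
# judging how well a model reproduces an observed time series.

def rmse(obs, sim):
    """Root-mean-square error between observed and simulated values."""
    n = len(obs)
    return (sum((o - s) ** 2 for o, s in zip(obs, sim)) / n) ** 0.5

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit; values <= 0 mean
    the model is no better than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

obs = [2.0, 3.5, 5.0, 4.0, 3.0]   # hypothetical observed flows
sim = [2.2, 3.4, 4.6, 4.3, 2.9]   # hypothetical model output
print(rmse(obs, sim), nse(obs, sim))
```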

  14. A model for Long-term Industrial Energy Forecasting (LIEF)

    Energy Technology Data Exchange (ETDEWEB)

    Ross, M. [Lawrence Berkeley Lab., CA (United States)]|[Michigan Univ., Ann Arbor, MI (United States). Dept. of Physics]|[Argonne National Lab., IL (United States). Environmental Assessment and Information Sciences Div.; Hwang, R. [Lawrence Berkeley Lab., CA (United States)


    The purpose of this report is to establish the content and structural validity of the Long-term Industrial Energy Forecasting (LIEF) model, and to provide estimates for the model's parameters. The model is intended to provide decision makers with a relatively simple, yet credible tool for forecasting the impacts of policies that affect long-term energy demand in the manufacturing sector. Particular strengths of this model are its relative simplicity, which facilitates both ease of use and understanding of results, and its inclusion of relevant causal relationships, which provide useful policy handles. The modeling approach of LIEF is intermediate between top-down econometric modeling and bottom-up technology models. It relies on the simple concept that trends in aggregate energy demand depend on three factors: (1) trends in total production; (2) sectoral or structural shift, that is, changes in the mix of industrial output from energy-intensive to energy non-intensive sectors; and (3) changes in real energy intensity due to technical change and energy-price effects, as measured by the amount of energy used per unit of manufacturing output (kBtu per constant $ of output). The manufacturing sector is first disaggregated into subsectors according to their historic output growth rates, energy intensities and recycling opportunities. Exogenous, macroeconomic forecasts of individual subsector growth rates and energy prices can then be combined with endogenous forecasts of real energy intensity trends to yield forecasts of overall energy demand. 75 refs.
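
    The three-factor accounting behind a LIEF-style forecast can be sketched as follows; the subsector names, growth rates, and intensities are invented for illustration and are not the model's calibrated parameters.

```python
# Sketch of the accounting identity behind LIEF-style forecasts: aggregate
# energy demand = sum over subsectors of (projected output) x (projected
# real energy intensity). All numbers are hypothetical.

# subsector: (base output, annual output growth, base intensity, annual intensity change)
subsectors = {
    "steel":     (100.0, 0.01, 20.0, -0.015),  # energy-intensive, slow growth
    "food":      ( 80.0, 0.02,  5.0, -0.010),
    "machinery": (120.0, 0.03,  3.0, -0.020),  # non-intensive, fast growth
}

def energy_demand(years):
    """Aggregate energy demand `years` ahead of the base year."""
    total = 0.0
    for output, g_out, intensity, g_int in subsectors.values():
        total += output * (1 + g_out) ** years * intensity * (1 + g_int) ** years
    return total

print(energy_demand(0), energy_demand(10))
```

Structural shift appears naturally in this toy setup: because the non-intensive subsector grows fastest while intensities decline, aggregate energy demand grows more slowly than total output.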

  15. Modeling Wettability Variation during Long-Term Water Flooding

    Directory of Open Access Journals (Sweden)

    Renyi Cao


    The surface properties of rock affect oil recovery during water flooding. Oil-wet polar substances adsorbed on the surface of the rock are gradually desorbed during water flooding, and the original reservoir wettability changes towards water-wet; this change reduces the residual oil saturation and improves the oil displacement efficiency. However, an accurate model of wettability alteration during long-term water flooding has been lacking, leading to difficulties in history matching and unreliable forecasts from reservoir simulators. This paper summarizes the mechanisms of wettability variation, characterizes the adsorption of polar substances during long-term flooding with injected or aquifer water, and relates the residual oil saturation and relative permeability to the amount of polar substance adsorbed on clay and to the pore volumes of flooding water. A mathematical model is presented to simulate long-term water flooding, and the model is validated with experimental results. Simulation results of long-term water flooding are also discussed.
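
    The paper's central coupling, wettability changing with the volume of water injected and feeding back into residual oil saturation, can be caricatured in a few lines. The exponential desorption law, the endpoint saturations, and the rate constant below are all invented for illustration; they are not the authors' correlation.

```python
import math

# Hypothetical sketch: a wettability index moves from oil-wet (0) toward
# water-wet (1) as polar substances desorb with injected pore volumes,
# and residual oil saturation is interpolated between endpoint values.

def wettability_index(pore_volumes, k=0.05):
    """Assumed exponential desorption with cumulative injected pore volumes."""
    return 1.0 - math.exp(-k * pore_volumes)

def residual_oil_saturation(w, sor_oil_wet=0.35, sor_water_wet=0.20):
    """Linear interpolation between oil-wet and water-wet endpoints."""
    return sor_oil_wet + w * (sor_water_wet - sor_oil_wet)

for pv in (0.0, 10.0, 100.0):
    w = wettability_index(pv)
    print(pv, round(w, 3), round(residual_oil_saturation(w), 3))
```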

  17. Designing and evaluating representations to model pedagogy

    Directory of Open Access Journals (Sweden)

    Elizabeth Masterman


    This article presents the case for a theory-informed approach to designing and evaluating representations for implementation in digital tools to support Learning Design, using the framework of epistemic efficacy as an example. This framework, which is rooted in the literature of cognitive psychology, is operationalised through dimensions of fit that attend to: (1) the underlying ontology of the domain, (2) the purpose of the task that the representation is intended to facilitate, (3) how best to support the cognitive processes of the users of the representations, (4) users' differing needs and preferences, and (5) the tool and environment in which the representations are constructed and manipulated. Through showing how epistemic efficacy can be applied to the design and evaluation of representations, the article presents the Learning Designer, a constructionist microworld in which teachers can both assemble their learning designs and model their pedagogy in terms of students' potential learning experience. Although the activity of modelling may add to the cognitive task of design, the article suggests that the insights thereby gained can additionally help a lecturer who wishes to reuse a particular learning design to make informed decisions about its value to their practice.

  18. Multivariate Term Structure Models with Level and Heteroskedasticity Effects

    DEFF Research Database (Denmark)

    Christiansen, Charlotte


    The paper introduces and estimates a multivariate level-GARCH model for the long rate and the term-structure spread, in which the conditional volatility is proportional to the γth power of the variable itself (level effects) and the conditional covariance matrix evolves according to a multivariate GARCH process (heteroskedasticity effects). The long-rate variance exhibits heteroskedasticity effects and level effects in accordance with the square-root model. The spread variance exhibits heteroskedasticity effects but no level effects. The level-GARCH model is preferred above the GARCH model and the level model. GARCH effects are more important than level effects. The results are robust to the maturity of the interest rates.

  19. Long term modeling of permafrost in the Alps (United States)

    Scherler, Martin; Hauck, Christian; Stähli, Manfred


    Air temperature, radiation balance, snow cover, and infiltration are known to be key factors in the thermal regime of permafrost. A modelling approach that investigates and quantifies the influence of changes in these factors could lead to a better understanding of the sensitivity of permafrost to changes in climatic conditions. Numerical models are well-suited instruments for analyzing thermal and hydrological processes in permafrost, and they have the potential to be used for predicting the reaction of permafrost to climate change. For this application, a well-calibrated model is crucial. The model used in this study is a one-dimensional coupled soil water and heat transfer model of the soil-snow-atmosphere boundary layer (COUP Model). It accounts for the accumulation and melt of a seasonal snow cover, as well as for the freezing and thawing of the soil. The model is driven by the following meteorological parameters: air temperature, relative humidity, wind speed, global radiation, and precipitation. A complete energy balance is calculated for the snow or soil surface, yielding a surface temperature that represents the upper thermal boundary condition of the soil profile. A constant geothermal heat flux determines the lower thermal boundary. The model has been applied to simulate ground temperatures together with the evolution of water and ice content at two high-altitude alpine permafrost sites in Switzerland: Schilthorn in the Bernese Oberland and Murtèl in the Engadin. The aim of the simulations was long-term modelling (9 years for Schilthorn and 6 years for Murtèl) and calibration of the model for the two study sites. The model is validated with borehole temperature data as well as soil moisture measurements conducted with a newly developed simplified soil moisture probe (SISOMOP). The simulated temperatures are in good agreement with the temperatures measured in the boreholes at both sites. The model results indicate that infiltration events

  20. A Simple Hybrid Model for Short-Term Load Forecasting

    Directory of Open Access Journals (Sweden)

    Suseelatha Annamareddi


    The paper proposes a simple hybrid model for forecasting electrical load data based on the wavelet transform technique and double exponential smoothing. The historical, noisy load series is decomposed into deterministic and fluctuation components using suitable wavelet coefficient thresholds and wavelet reconstruction. The variation characteristics of the resulting series are analyzed to arrive at reasonable thresholds that yield good denoising results. The constituent series are then forecasted using appropriate adaptive exponential smoothing models. A case study performed on California energy market data demonstrates that the proposed method can offer high forecasting precision for very short-term forecasts, considering a time horizon of two weeks.
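
    The double-exponential-smoothing stage of such a hybrid can be sketched as below; the wavelet denoising stage is omitted, and the smoothing constants are illustrative rather than the paper's tuned values.

```python
# Holt's linear (double) exponential smoothing applied to a load series.
# alpha smooths the level, beta smooths the trend; both values are illustrative.

def holt_forecast(series, alpha=0.5, beta=0.3, horizon=1):
    """Return the `horizon`-step-ahead forecast from the end of `series`."""
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend

load = [100.0, 102.0, 101.0, 105.0, 107.0, 106.0, 110.0]  # toy hourly loads
print(holt_forecast(load, horizon=2))
```

On a perfectly linear series the level and trend estimates stay exact, so the forecast continues the line; real load series benefit from the denoising step first.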

  1. Long term performance evaluation of the TACTIC imaging telescope ...

    Indian Academy of Sciences (India)


    Mar 7, 2014 ... of the full data analysis chain, the main aim of this work is to study the long term performance of the TACTIC ... instruments for the observation of γ-rays in the TeV energy range [1–3]. The γ-rays .... of the instrumentation aspects of the telescope can be seen in [20], we shall present here only the main design.

  2. Regime-based evaluation of cloudiness in CMIP5 models (United States)

    Jin, Daeho; Oreopoulos, Lazaros; Lee, Dongmin


    The concept of cloud regimes (CRs) is used to develop a framework for evaluating the cloudiness of 12 models from phase 5 of the Coupled Model Intercomparison Project (CMIP5). Reference CRs come from existing global International Satellite Cloud Climatology Project (ISCCP) weather states. The evaluation is made possible by the implementation in several CMIP5 models of the ISCCP simulator, which generates in each grid cell daily joint histograms of cloud optical thickness and cloud top pressure. Model performance is assessed with several metrics such as CR global cloud fraction (CF), CR relative frequency of occurrence (RFO), their product [the long-term average total cloud amount (TCA)], cross-correlations of CR RFO maps, and a metric of resemblance between model and ISCCP CRs. In terms of CR global RFO, arguably the most fundamental metric, the models perform unsatisfactorily overall, except for CRs representing thick storm clouds. Because model CR CF is internally constrained by our method, RFO discrepancies also yield substantial TCA errors. Our results support previous findings that CMIP5 models underestimate cloudiness. The multi-model mean performs well in matching observed RFO maps for many CRs, but is still not the best for this or other metrics. When overall performance across all CRs is assessed, some models, despite shortcomings, apparently outperform Moderate Resolution Imaging Spectroradiometer cloud observations evaluated against ISCCP as if they were another model's output. Lastly, contrasting cloud simulation performance with each model's equilibrium climate sensitivity, in order to gain insight into whether good cloud simulation pairs with particular values of this parameter, yields no clear conclusions.
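
    The basic regime metrics named above (RFO, CF, and their product, TCA) are simple to compute once each day has been assigned a regime; the toy regime labels and cloud fractions below are illustrative, not ISCCP data.

```python
from collections import defaultdict

# Toy daily (regime, cloud fraction) assignments for one grid cell.
days = [("storm", 0.95), ("cirrus", 0.40), ("storm", 0.90),
        ("clear", 0.05), ("cirrus", 0.50), ("clear", 0.10)]

counts, cf_sums = defaultdict(int), defaultdict(float)
for regime, cf_day in days:
    counts[regime] += 1
    cf_sums[regime] += cf_day

n = len(days)
rfo = {r: counts[r] / n for r in counts}          # relative frequency of occurrence
cf = {r: cf_sums[r] / counts[r] for r in counts}  # regime-mean cloud fraction
tca = sum(rfo[r] * cf[r] for r in rfo)            # long-term total cloud amount
print(rfo, cf, round(tca, 3))
```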

  3. Cosmological models with running cosmological term and decaying dark matter (United States)

    Szydłowski, Marek; Stachowski, Aleksander


    We investigate the dynamics of a generalized ΛCDM model in which the Λ term runs with cosmological time. Using the example model Λ(t) = Λ_bare + α²/t², we show the existence of a mechanism that modifies the scaling law for the energy density of dark matter: ρ_dm ∝ a^(-3+λ(t)). We use an approach developed by Urbanowski in which the properties of unstable vacuum states are analyzed from the point of view of the quantum theory of unstable states. We discuss the evolution of the Λ(t) term and point out that during the cosmic evolution there is a long phase in which this term is approximately constant. We also present a statistical analysis of both the Λ(t)CDM model, with dark energy and decaying dark matter, and the standard ΛCDM cosmological model, using data from Planck, SNIa, BAO, H(z), and the AP test. For the former we find that the best-fit value of the parameter Ω_α²,0 is negative (energy transfer is from the dark matter to the dark energy sector) and that Ω_α²,0 lies in the interval (-0.000040, -0.000383) at the 2σ level. The decay of dark matter lowers the mass of the dark matter particles, which are lighter than CDM particles and remain relativistic. The rate of the decay process is estimated. Our model is consistent with a decay mechanism producing unstable particles (e.g., sterile neutrinos), for which α² is negative.
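
    The modified scaling law quoted above follows from standard bookkeeping in running-Λ models; the derivation below is a sketch of the usual argument, not taken verbatim from the paper. With ρ_Λ = Λ(t)/(8πG), the Bianchi identity forces the energy gained or lost by the Λ sector to be balanced by dark matter:

```latex
\dot{\rho}_{\mathrm{dm}} + 3H\rho_{\mathrm{dm}} = -\dot{\rho}_{\Lambda},
\qquad
\rho_{\Lambda}(t) = \frac{\Lambda(t)}{8\pi G}
 = \frac{1}{8\pi G}\left(\Lambda_{\mathrm{bare}} + \frac{\alpha^{2}}{t^{2}}\right).
```

Solving this continuity equation yields the quoted ρ_dm ∝ a^(-3+λ(t)), with λ(t) determined by the transfer term; for α² < 0 the transfer runs from dark matter to dark energy, matching the sign found in the fits.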

  4. The perfused swine uterus model: long-term perfusion

    Directory of Open Access Journals (Sweden)

    Geisler Klaudija


    Background: It has previously been shown that the viability of swine uteri can be maintained within the physiological range in an open perfusion model for up to 8 hours. The aim of this study was to assess medium- to long-term perfusion of swine uteri using a modified Krebs-Ringer bicarbonate buffer solution (KRBB) in the established open perfusion model. Methods: In an experimental study at an infertility institute, 30 swine uteri were perfused: group 1 (n = 11), KRBB; group 2 (n = 8), modified KRBB with drainage of perfusate supernatant; group 3 (n = 11), modified KRBB with drainage of perfusate every 2 h and substitution with fresh medium. Modified and conventional KRBB were compared with regard to survival and contraction parameters: intrauterine pressure (IUP), area under the curve (AUC), and frequency of contractions (F). Results: Modified KRBB showed significantly higher IUP, AUC, and F values than perfusion with conventional KRBB. In group 3, the organ survival time of up to 17 h, with a 98% rate of effective contraction time, differed significantly from group 1. Conclusions: Using modified KRBB in combination with perfusate substitution improves the open model for perfusion of swine uteri with regard to survival time and quality of contraction parameters. This model can be used for medium- to long-term perfusion of swine uteri, allowing further metabolic ex vivo studies in a cost-effective way and with little logistic effort.

  5. Application of Multiple Evaluation Models in Brazil

    Directory of Open Access Journals (Sweden)

    Rafael Victal Saliba


    Based on two different samples, this article tests the performance of a number of value drivers commonly used by finance practitioners for evaluating companies, through simple cross-sectional regression models that estimate the parameters, denominated market multiples, associated with each value driver. We diagnose the behavior of several multiples over the period 1994-2004, with an outlook on the particularities of the economic activities performed by the sample companies (and their impacts on performance) through a subsequent analysis that segregates the sample companies by sector. Extrapolating the simple multiples valuation standards of analysts at the main financial institutions in Brazil, we find that adjusting the ratio formulation to allow for an intercept does not provide satisfactory results in terms of pricing error reduction. The results, in spite of evidencing a certain relative and absolute superiority among the multiples, may not be generically representative, given the samples' limitations.

  6. User's Manual for the SOURCE1 and SOURCE2 Computer Codes: Models for Evaluating Low-Level Radioactive Waste Disposal Facility Source Terms (Version 2.0)

    Energy Technology Data Exchange (ETDEWEB)

    Icenhour, A.S.; Tharp, M.L.


    The SOURCE1 and SOURCE2 computer codes calculate source terms (i.e. radionuclide release rates) for performance assessments of low-level radioactive waste (LLW) disposal facilities. SOURCE1 is used to simulate radionuclide releases from tumulus-type facilities. SOURCE2 is used to simulate releases from silo-, well-, well-in-silo-, and trench-type disposal facilities. The SOURCE codes (a) simulate the degradation of engineered barriers and (b) provide an estimate of the source term for LLW disposal facilities. This manual summarizes the major changes that have been effected since the codes were originally developed.


    Energy Technology Data Exchange (ETDEWEB)

    Gary M. Blythe; Richard McMillan


    The objective of this project is to demonstrate the use of alkaline reagents injected into the furnace of coal-fired boilers as a means of controlling sulfuric acid emissions. Sulfuric acid controls are becoming of increasing interest to utilities with coal-fired units for a number of reasons. Sulfuric acid is a Toxic Release Inventory species, a precursor to acid aerosol/condensable emissions, and can cause a variety of plant operation problems such as air heater plugging and fouling, back-end corrosion, and plume opacity. These issues will likely be exacerbated with the retrofit of SCR for NO{sub x} control on some coal-fired plants, as SCR catalysts are known to further oxidize a portion of the flue gas SO{sub 2} to SO{sub 3}. The project is testing the effectiveness of furnace injection of four different calcium- and/or magnesium-based alkaline sorbents on full-scale utility boilers. These reagents have been tested during four one- to two-week tests conducted on two FirstEnergy Bruce Mansfield Plant units. One of the sorbents tested was a magnesium hydroxide slurry produced from a wet flue gas desulfurization system waste stream, from a system that employs a Thiosorbic{reg_sign} Lime scrubbing process. The other three sorbents are available commercially and include dolomite, pressure-hydrated dolomitic lime, and commercial magnesium hydroxide. The dolomite reagent was injected as a dry powder through out-of-service burners, while the other three reagents were injected as slurries through air-atomizing nozzles into the front wall of the upper furnace, either across from the nose of the furnace or across from the pendant superheater tubes. After completing the four one- to two-week tests, the most promising sorbents were selected for longer-term (approximately 25-day) full-scale tests. The longer-term tests are being conducted to confirm the effectiveness of the sorbents over extended operation and to determine balance-of-plant impacts. This report presents the

  9. Evaluation of models for assessing groundwater vulnerability to ...

    African Journals Online (AJOL)

    This paper examines, based on a review and synthesis of available material, the presently most widely applied models for groundwater vulnerability assessment mapping. The approaches and the pros and cons of each method are evaluated in terms of both the conditions of their implementation and the results obtained. The paper ...

  10. Monte Carlo Euler approximations of HJM term structure financial models

    KAUST Repository

    Björk, Tomas


    We present Monte Carlo-Euler methods for a weak approximation problem related to the Heath-Jarrow-Morton (HJM) term structure model, based on Itô stochastic differential equations in infinite dimensional spaces, and prove strong and weak error convergence estimates. The weak error estimates are based on stochastic flows and discrete dual backward problems, and they can be used to identify different error contributions arising from time and maturity discretization as well as the classical statistical error due to finite sampling. Explicit formulas for efficient computation of sharp error approximation are included. Due to the structure of the HJM models considered here, the computational effort devoted to the error estimates is low compared to the work to compute Monte Carlo solutions to the HJM model. Numerical examples with known exact solution are included in order to show the behavior of the estimates. © 2012 Springer Science+Business Media Dordrecht.
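
    The paper works in an infinite-dimensional HJM setting; as a hedged finite-dimensional stand-in, the snippet below shows the basic Monte Carlo-Euler weak-approximation idea on a scalar SDE with a known mean, so the combined discretization and statistical error can be observed directly.

```python
import math, random

def euler_mc_mean(x0=1.0, mu=0.05, sigma=0.2, T=1.0, steps=50, paths=20000, seed=1):
    """Monte Carlo estimate of E[X_T] for dX = mu*X dt + sigma*X dW
    using the Euler-Maruyama scheme."""
    random.seed(seed)
    dt = T / steps
    total = 0.0
    for _ in range(paths):
        x = x0
        for _ in range(steps):
            x += mu * x * dt + sigma * x * random.gauss(0.0, math.sqrt(dt))
        total += x
    return total / paths

exact = math.exp(0.05)   # E[X_T] = x0 * exp(mu*T) for geometric Brownian motion
estimate = euler_mc_mean()
print(estimate, exact, abs(estimate - exact))
```

The gap between `estimate` and `exact` combines a weak discretization error of order dt with a statistical error of order 1/sqrt(paths), the two contributions that the paper's error estimates identify separately.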

  11. Long-term stake evaluations of waterborne copper systems (United States)

    Stan Lebow; Cherilyn Hatfield; Douglas Crawford; Bessie Woodward


    Limitations on the use of chromated copper arsenate (CCA) have heightened interest in use of arsenic-free copper-based alternatives. For decades, the USDA Forest Products Laboratory has been evaluating several of these systems in stake plots. Southern Pine 38- by 89- by 457-mm (1.5- by 3.5- by 18-inch) stakes were treated with varying concentrations of acid copper...

  12. Evaluation of effects of long term exposure on lethal toxicity with mammals. (United States)

    Verma, Vibha; Yu, Qiming J; Connell, Des W


    The relationship between median lethal exposure time (LT50) and lethal exposure concentration (LC50) has been evaluated over relatively long exposure times using a novel parameter, normal life expectancy (NLT), as a long-term toxicity endpoint. The model equation ln(LT50) = a·LC50^ν + b, where a, b and ν are constants, was evaluated by plotting ln(LT50) against LC50 using available inhalation-exposure toxicity data from 7 species of mammals. For each specific toxicant, a single consistent relationship was observed for all mammals; the relationship can then be extended to estimate toxicity at any exposure time for other mammals.
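
    As a worked illustration of the model equation, the snippet below evaluates LT50 at a few lethal concentrations; the constants a, b, and ν are invented for demonstration, not fitted values from the paper.

```python
import math

def lt50(lc50, a=-0.002, b=9.0, v=1.0):
    """Median lethal time from the model ln(LT50) = a * LC50**v + b,
    so LT50 = exp(a * LC50**v + b). Constants are hypothetical."""
    return math.exp(a * lc50 ** v + b)

# With a < 0, higher lethal concentrations give shorter median survival times:
for c in (100.0, 500.0, 1000.0):
    print(c, round(lt50(c), 1))
```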

  13. A Critique of Kirkpatrick's Evaluation Model (United States)

    Reio, Thomas G., Jr.; Rocco, Tonette S.; Smith, Douglas H.; Chang, Elegance


    Donald Kirkpatrick published a series of articles originating from his doctoral dissertation in the late 1950s describing a four-level training evaluation model. From its beginning, it was easily understood and became one of the most influential evaluation models impacting the field of HRD. While well received and popular, the Kirkpatrick model…

  14. Judging risk behaviour and risk preference: the role of the evaluative connotation of risk terms.

    NARCIS (Netherlands)

    van Schie, E.C.M.; van der Pligt, J.; van Baaren, K.


    Two experiments investigated the impact of the evaluative connotation of risk terms on the judgment of risk behavior and on risk preference. Exp 1 focused on the evaluation congruence of the risk terms with a general risk norm and with Ss' individual risk preference, and its effects on the extremity

  15. Evaluation Plan on In-vessel Source Term in PGSFR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Won; Ha, Kwi-Seok; Ahn, Sang June; Lee, Kwi Lim; Jeong, Taekyeong [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)


    This strategy requires nuclear plants to have features that prevent radionuclide release and multiple barriers to the escape from the plants of any radionuclides that are released despite preventive measures. Considerations of the ability to prevent and mitigate the release of radionuclides arise at numerous places in the safety regulation of nuclear plants, and the effectiveness of mitigative capabilities is subject to quantitative analysis. The radionuclide input to these quantitative analyses of effectiveness is the source term (ST). All features of the composition, magnitude, timing, and chemical and physical form of an accidental radionuclide release constitute the ST. The ST is also defined as the release of radionuclides from the fuel and coolant into the containment, and subsequently to the environment. The in-vessel STs of PGSFR will be estimated using the methodology of the ANL-ART-38 report in addition to the 4S methodology. The in-vessel STs are calculated in several phases: the inventory of each radionuclide is calculated with the ORIGEN-2 code using realistic burnup conditions; the ST released from the core to the primary sodium is calculated using the assumptions of the ANL methodology; and lastly, the ST released from the primary sodium to the cover gas space is calculated using equations and experimental data.

  16. Ability of the MACRO model to predict long-term leaching of metribuzin and diketometribuzin. (United States)

    Rosenbom, Annette E; Kjaer, Jeanne; Henriksen, Trine; Ullum, Marlene; Olsen, Preben


    In a regulatory context, numerical models are increasingly employed to quantify leaching of pesticides and their metabolites. Although the ability of these models to accurately simulate leaching of pesticides has been evaluated, little is known about their ability to accurately simulate long-term leaching of metabolites. A Danish study on the dissipation and sorption of metribuzin, involving both monitoring and batch experiments, concluded that desorption and degradation of metribuzin and leaching of its primary metabolite diketometribuzin continued for 5-6 years after application, posing a risk of groundwater contamination. That study provided a unique opportunity for evaluating the ability of the numerical model MACRO to accurately simulate long-term leaching of metribuzin and diketometribuzin. When calibrated and validated with respect to water and bromide balances and applied assuming equilibrium sorption and first-order degradation kinetics as recommended in the European Union pesticide authorization procedure, MACRO was unable to accurately simulate the long-term fate of metribuzin and diketometribuzin; the concentrations in the soil were underestimated by many orders of magnitude. By introducing alternative kinetics (a two-site approach), we captured the observed leaching scenario, thus underlining the necessity of accounting for the long-term sorption and dissipation characteristics when using models to predict the risk of groundwater contamination.
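
    The two-site kinetics the authors introduced can be caricatured with a generic sketch: one sorption site equilibrates instantly, the other exchanges slowly, so mass is retained in soil long after the solution concentration has declined. All rate constants below are invented, not the calibrated MACRO parameters.

```python
def simulate(c0=1.0, kd=0.5, alpha=0.01, frac_eq=0.4, k_deg=0.002, days=2000, dt=1.0):
    """Explicit-Euler integration of solute concentration c and the
    slowly exchanging sorbed pool s2; the fast pool is assumed to stay
    at equilibrium (s1 = frac_eq * kd * c) and is not tracked explicitly."""
    c, s2 = c0, 0.0
    for _ in range(int(days / dt)):
        # first-order exchange toward the kinetic site's equilibrium loading
        exchange = alpha * ((1 - frac_eq) * kd * c - s2)
        c += dt * (-k_deg * c - exchange)   # degradation acts on solution only
        s2 += dt * exchange
    return c, s2

c, s2 = simulate()
print(round(c, 4), round(s2, 4))  # the slow site still holds mass at day 2000
```

When the slow pool later desorbs back into solution, it sustains low concentrations for years, which is the qualitative behavior the first-order, equilibrium-only setup failed to capture.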

  17. Evaluating spatial patterns in hydrological modelling

    DEFF Research Database (Denmark)

    Koch, Julian

    The objective of this PhD study is to investigate possible ways towards a better integration of spatial observations into the modelling process via spatial pattern evaluation. It is widely recognized by the modelling community that the grand potential of readily available spatial observations is not fully exploited by current modelling frameworks due to the lack of suitable spatial performance metrics. Furthermore, the traditional model evaluation using discharge is found unsuitable to lay confidence on the predicted catchment-inherent spatial variability of hydrological processes in a fully... with respect to their capability to mimic human evaluations. This PhD thesis aims at expanding the standard toolbox of spatial model evaluation with innovative metrics that adequately compare spatial patterns. Driven by the rise of more complex model structures and the increase of suitable remote sensing...

  18. Long-Term Morphological Modeling of Barrier Island Tidal Inlets

    Directory of Open Access Journals (Sweden)

    Richard Styles


    Full Text Available The primary focus of this study is to apply a two-dimensional (2-D) coupled flow-wave-sediment modeling system to simulate the development and growth of idealized barrier island tidal inlets. The idealized systems are drawn from nine U.S. coastal inlets representing Pacific Coast, Gulf Coast and Atlantic Coast geographical and climatological environments. A morphological factor is used to effectively model 100 years of inlet evolution and the resulting morphological state is gauged in terms of the driving hydrodynamic processes. Overall, the model performs within the range of established theoretically predicted inlet cross-sectional areas. The model compares favorably to theoretical models of maximum inlet currents, which serve as a measure of inlet stability. Major morphological differences are linked to inlet geometry and tidal forcing. Narrower inlets develop channels that are more aligned with the inlet axis while wider inlets develop channels that appear as immature braided channel networks similar to tidal flats in regions with abundant sediment supply. Ebb shoals with strong tidal forcing extend further from shore and spread laterally, promoting multi-lobe development bisected by ebb shoal channels. Ebb shoals with moderate tidal forcing form crescent bars bracketing a single shore-normal channel. Longshore transport contributes to ebb shoal asymmetry and provides bed material to help maintain the sediment balance in the bay.

  19. Investigation of Transformational and Transactional Leadership Styles of School Principals, and Evaluation of Them in Terms of Educational Administration (United States)

    Avci, Ahmet


    The aim of this study is to investigate the transformational and transactional leadership styles of school principals, and to evaluate them in terms of educational administration. A descriptive survey model was used in the research. The data were obtained from a total of 1,117 teachers working in public and private schools subjected…

  20. Gauged baby Skyrme model with a Chern-Simons term (United States)

    Samoilenka, A.; Shnir, Ya.


    The properties of the multisoliton solutions of the (2+1)-dimensional Maxwell-Chern-Simons-Skyrme model are investigated numerically. Coupling to the Chern-Simons term allows for the existence of electrically charged solitons which may also carry magnetic fluxes. Two particular choices of the potential term are considered: (i) the weakly bounded potential and (ii) the double-vacuum potential. In the absence of gauge interaction, the individual constituents of the multisoliton configuration are well separated in the former case, while in the latter case the rotational invariance of the configuration remains unbroken. It is shown that coupling of the planar multi-Skyrmions to the electric and magnetic fields strongly affects the pattern of interaction between the constituents. We analyze the dependence of the structure of the solutions, the energies, angular momenta, and electric and magnetic fields of the configurations on the gauge coupling constant g and the electric potential. It is found that, generically, the coupling to the Chern-Simons term strongly affects the usual pattern of interaction between the Skyrmions; in particular, the electric repulsion between the solitons may break the multisoliton configuration into partons. We show that as the gauge coupling becomes strong, both the magnetic flux and the electric charge of the solutions become quantized, although they are not topological numbers.

  1. Audiometric evaluation short and medium term in cochlear implants. (United States)

    Alonso-Luján, Laura R; Gutiérrez-Farfán, Ileana; Luna-Reyes, Francisco A; Chamlati-Aguirre, Laura E; Durand Rivera, Alfredo


    Our purpose is to report the results of the cochlear implant program at this Institute, from our first surgery in November 2007 until December 2012. This was a cross-sectional, observational, descriptive study analyzing thresholds before and after implantation from patient files (diagnosis, onset of hearing loss, brainstem auditory evoked potentials (BAEP), computed tomography (CT), magnetic resonance imaging (MRI), implanted ear, brand and model of the cochlear implant (CI)) and audiometric studies before and after CI. We report the evolution of 68 patients, aged from 1 year 8 months to 39 years 3 months. 94% of patients (n = 64) had pre-lingual hearing loss, hereditary non-syndromic hearing loss being the most common etiology (29.4%). All patients had auditory brainstem responses showing bilateral profound hearing loss; type A tympanograms were obtained in 77.9% (Jerger's classification), and 100% had absent stapedial reflexes and otoacoustic emissions with low reproducibility. CT was reported as normal in 85.2% of patients; 5.8% had chronic mastoiditis changes, and other findings, each reported in 1.4% of patients, were: digastric right facial nerve, facial nerve canal dehiscence, enlarged vestibular aqueduct, occupation and poor pneumatization of the mastoid air cells, lateral semicircular canal agenesis, and incomplete partition of the cochlea with wide vestibule and vestibular aqueduct dilatation. The most frequent skull MRI finding with the cerebellopontine angle approach was vascular loops of the internal auditory canals, unilateral in 10.2%. 55.8% of patients (n = 38) were implanted in the right ear, and 56 (82.3%) received a CI from Advanced Bionics, HiRes 90K model, the remainder Cochlear Freedom and Nucleus 5 models. Evolution of CI results on audiometric testing: prior to placement, the average across the frequencies assessed was 106.2 dB; one month later, 62.4 dB; at 6 months, 44 dB; reaching a satisfactory threshold of 32.9 dB.


    Directory of Open Access Journals (Sweden)

    A Vaishya


    Full Text Available Background: The rising incidence of diabetes globally, and consequently of diabetic nephropathy, is a major concern. Being a chronic disease, it requires patients to be continuously monitored. Clinical improvement of major signs/symptoms over a short course of therapy may increase patient satisfaction and thereby improve compliance with treatment. Objective: To evaluate the signs/symptoms and GFR status of diabetic nephropathy patients over a short course of treatment and nutritional management. Material & Methods: All 170 incident cases of diabetic nephropathy (DN), identified on the basis of glomerular filtration rate and creatinine level, were registered on pre-fixed dates during May 2007 to May 2010; 127 met the inclusion criteria. Patients' demographic, biological and biochemical characteristics and presenting major signs/symptoms were recorded at registration, and presenting signs/symptoms were re-evaluated after six months of medicine and dietary intervention. Statistical Analysis: Statistical significance of associations was tested by the χ² (unrelated samples) and McNemar (related samples) tests, and differences in the number of signs/symptoms by the Mann-Whitney (unrelated samples) and Wilcoxon signed-rank (related samples) tests. Results: No statistical association was seen between GFR status and the presence of edema/swelling in any part of the body. After six months of treatment and dietary management, the edema/swelling present in 69.6% of cases was found in only 33.3%. Pedal edema was found in 43.1%, and either eyelid or facial swelling in 12.9% of cases, but after six months of drug treatment and dietary care these were present in only 18.6% and 3.9% of cases, respectively. The GFR status of one third of cases also improved after 6 months, while 57.8% were unchanged; very few (9.8%) deteriorated. The GFR improvement was greater in cases presenting with GFR 60 and above. Conclusion: Though patients' statements on drug compliance and dietary intake were taken on trust, after six

  3. Effects of Reducing Convective Acceleration Terms in Modelling Supercritical and Transcritical Flow Conditions

    Directory of Open Access Journals (Sweden)

    Yared Abayneh Abebe


    Full Text Available Modelling floods and flood-related disasters has become a priority for many researchers and practitioners. Currently, there are several options that can be used for modelling floods in urban areas, and the present work attempts to investigate the effectiveness of different model formulations in modelling supercritical and transcritical flow conditions. In our work, we use the following three methods for modelling one-dimensional (1D) flows: the MIKE 11 flow model, Kutija's method, and the Roe scheme. We use two methods for modelling two-dimensional (2D) flows: the MIKE21 flow model and a non-inertia 2D model. Apart from the MIKE11 and MIKE21 models, the code for all other models was developed and used for the purposes of the present work. The performance of the models was evaluated using hypothetical case studies with the intention of representing some configurations that can be found in urban floodplains. The present work does not go into the assessment of these models in modelling various topographical features that may be found on urban floodplains, but rather focuses on how they perform in simulating supercritical and transcritical flows. The overall finding is that simplified models which ignore the convective acceleration terms (CATs) in the momentum equations may be effectively used to model urban floodplains without a significant loss of accuracy.
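
    The role of the convective acceleration term (CAT) can be seen in a back-of-the-envelope comparison of steady-flow momentum balances. The channel values and Manning coefficient below are hypothetical, and this is not the MIKE 11 or Roe-scheme formulation:

```python
# Sketch contrasting the 1D momentum balance with and without the convective
# acceleration term (CAT) for steady open-channel flow. Values hypothetical.

import math

def froude(v, h, g=9.81):
    """Froude number; > 1 indicates supercritical flow."""
    return v / math.sqrt(g * h)

def friction_slope(v, h, n=0.03):
    """Manning friction slope for a wide rectangular channel (R ~ h)."""
    return (n * v) ** 2 / h ** (4.0 / 3.0)

# Water-surface slope needed to sustain the flow:
#   full momentum (steady):  S_w = S_f + (v/g) dv/dx   <- includes the CAT
#   non-inertia model:       S_w = S_f                  <- CAT dropped
v, h, dvdx = 3.0, 0.5, 0.01            # m/s, m, 1/s (hypothetical)
s_f = friction_slope(v, h)
s_full = s_f + (v / 9.81) * dvdx
s_noninertia = s_f

print(round(froude(v, h), 2))          # > 1, i.e. supercritical here
print(s_full - s_noninertia)           # error introduced by dropping the CAT
```

    For mild velocity gradients the CAT contribution is small relative to the friction slope, which is why the simplified formulations can remain accurate on urban floodplains.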

  4. A parameter model for dredge plume sediment source terms (United States)

    Decrop, Boudewijn; De Mulder, Tom; Toorman, Erik; Sas, Marc


    The presented model allows for fast simulations of the near-field behaviour of overflow dredging plumes. Overflow dredging plumes occur when dredging vessels employ a dropshaft release system to discharge the excess sea water which is pumped into the trailing suction hopper dredger (TSHD) along with the dredged sediments. The fine sediment fraction in the loaded water-sediment mixture does not fully settle before it reaches the overflow shaft. As a consequence, the released water contains a fine sediment fraction of time-varying concentration. The sediment grain size is in the range of clays, silt and fine sand; the sediment concentration varies roughly between 10 and 200 g/l in most cases, peaking at even higher values for short durations. In order to assess the environmental impact of the increased turbidity caused by this release, plume dispersion predictions are often carried out. These predictions are usually executed with a large-scale model covering a complete coastal zone, bay, or estuary. A source term of fine sediments is implemented in the hydrodynamic model to simulate the fine sediment dispersion. The large-scale model mesh resolution and governing equations, however, do not allow simulation of the near-field plume behaviour in the vicinity of the ship hull and propellers. Moreover, in the near field, these plumes are under the influence of buoyancy forces and air bubbles. The initial distribution of sediments therefore is unknown and at present has to be based on crude assumptions. The initial (vertical) distribution of the sediment source is indeed of great influence on the final far-field plume dispersion results. In order to study this near-field behaviour, a highly detailed computational fluid dynamics (CFD) model was developed. This model contains a realistic geometry of a dredging vessel, buoyancy effects, air bubbles and propeller action, and was validated earlier by comparison with field measurements.
A CFD model requires significant simulation times

  5. Evaluating survival model performance: a graphical approach. (United States)

    Mandel, M; Galai, N; Simchen, E


    In the last decade, many statistics have been suggested to evaluate the performance of survival models. These statistics evaluate the overall performance of a model ignoring possible variability in performance over time. Using an extension of measures used in binary regression, we propose a graphical method to depict the performance of a survival model over time. The method provides estimates of performance at specific time points and can be used as an informal test for detecting time varying effects of covariates in the Cox model framework. The method is illustrated on real and simulated data using Cox proportional hazard model and rank statistics. Copyright 2005 John Wiley & Sons, Ltd.
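
    The idea of depicting performance over time rather than as a single overall statistic can be sketched with a time-indexed concordance measure. The data, risk scores, and pair-comparability rule below are illustrative assumptions, not the authors' estimators:

```python
# Illustrative sketch of evaluating a survival model's discrimination at
# specific time points rather than as one overall statistic.
# Data and risk scores are hypothetical.

def concordance_at(t, times, events, risk):
    """Fraction of comparable pairs ordered correctly by the risk score.

    A pair (i, j) is comparable at horizon t if subject i has an observed
    event at or before t while subject j is still at risk beyond t.
    """
    conc = comp = 0
    n = len(times)
    for i in range(n):
        if not events[i] or times[i] > t:
            continue
        for j in range(n):
            if times[j] > t:
                comp += 1
                conc += 1 if risk[i] > risk[j] else 0
    return conc / comp if comp else float("nan")

times  = [2, 4, 5, 8, 10, 12]
events = [1, 1, 0, 1, 0, 1]
risk   = [0.9, 0.3, 0.4, 0.5, 0.2, 0.7]   # higher = predicted higher risk

for t in (3, 6, 9):
    print(t, round(concordance_at(t, times, events, risk), 2))
```

    Plotting such values against t gives the kind of performance-over-time curve the authors propose; a drop at later horizons can hint at time-varying covariate effects in a Cox-type model.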

  6. ComfortPower - System improvements and long-term evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Silversand, Fredrik [Catator AB, Lund (Sweden)


    Catator has previously developed a novel heating system, abbreviated ComfortPower, in an R&D programme supported by Catator, Swedish Gas Centre (SGC), Swedish Defence Materiel Administration (FMV), Skanska, Nibe and Alfa Laval. The ComfortPower unit comprises a multi-fuel reformer system tied to a high-temperature polymer electrolyte fuel cell (HT-PEM) and a heat pump system. Since the residual heat from the fuel cell system can be utilized very effectively, it is possible to reach high thermal efficiencies. Indeed, the thermal efficiency of the unit has previously been shown to reach values as high as 175-200% based on the lower heating value of the fuel. In addition to heat, ComfortPower can supply comfort cooling and surplus electricity. This project phase has focused on the following elements: 1. System improvements to further enhance the efficiency with the existing fuel cell (HT-PEM). 2. System simplifications (e.g. a DC-compressor system) to manage issues with start-up currents. 3. Tests with biogas qualities (various levels of CO₂) and biogas/air. 4. A long-term test with biogas quality (upgraded biogas). 5. Additional tests with liquid fuels (alcohols and diesel). 6. Mapping the need for cooling and heating in various applications. 7. Investigating how ComfortPower can reduce primary energy consumption and environmental impact. 8. Studying the possibility of an SOFC-based system with internal reforming. It was found that the Optiformer technology can be used to derive a suitable reformate gas for the HT-PEM unit from a wide range of fuels. Even if operation with fuel gases is the natural choice in most cases, it is possible also to use alcohols and other liquid fuels (e.g. in campus applications). The heat pump system was equipped with a 24 V DC-compressor provided by Nibe. The compressor could be directly powered by the accumulator system, and start-up currents, harmful to the inverter, could be avoided. Some improvements were made on the

  7. A Long-Term Mathematical Model for Mining Industries

    Energy Technology Data Exchange (ETDEWEB)

    Achdou, Yves, E-mail: [Univ. Paris Diderot, Sorbonne Paris Cité, Laboratoire Jacques-Louis Lions, UMR 7598, UPMC, CNRS (France); Giraud, Pierre-Noel [CERNA, Mines ParisTech (France); Lasry, Jean-Michel [Univ. Paris Dauphine (France); Lions, Pierre-Louis [Collège de France (France)


    A parsimonious long-term model is proposed for a mining industry. Knowing the dynamics of the global reserve, the strategy of each production unit consists of an optimal control problem with two controls: first, the flux invested into prospecting and the building of new extraction facilities; second, the production rate. In turn, the dynamics of the global reserve depends on the individual strategies of the producers, so the model leads to an equilibrium, which is described by low-dimensional systems of partial differential equations. The dimensionality depends on the number of technologies that a mining producer can choose. In some cases, the systems may be reduced to a Hamilton–Jacobi equation which is degenerate at the boundary and whose right-hand side may blow up at the boundary. A mathematical analysis is supplied. Then numerical simulations for models with one or two technologies are described. In particular, a numerical calibration of the model to fit the historical data is carried out.

  8. A new Expert Finding model based on Term Correlation Matrix

    Directory of Open Access Journals (Sweden)

    Ehsan Pornour


    Full Text Available Due to the enormous volume of unstructured information available on the Web and inside organizations, finding an answer to a knowledge need in a short time is difficult. For this reason, besides search engines, which do not consider users' individual characteristics, recommender systems were created, which use a user's previous activities and other individual characteristics to help users find needed knowledge. Recommender system usage is increasing every day. Expert finder systems, by introducing expert people instead of recommending information, provide the facility for users to ask their questions of experts. Having a relation with experts not only enables information transfer, but also, through the transfer of experience, enables knowledge transfer. In this paper we used university professors' academic resumes as expert profiles and proposed a new expert finding model that recommends experts for a user's query. We used a Term Correlation Matrix, the Vector Space Model and the PageRank algorithm and proposed a new hybrid model which outperforms conventional methods. This model can be used in Internet environments, organizations and universities that have expert resume datasets.
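
    A toy version of the proposed pipeline can be sketched as follows; the three-resume corpus, the co-occurrence correlation, and the query-expansion scoring are invented for illustration and omit the Vector Space Model weighting and PageRank steps:

```python
# Toy sketch of expert finding with a term-correlation matrix: the query is
# expanded with correlated terms before matching against expert "resumes".
# The corpus, correlation measure, and scoring are illustrative only.

from collections import Counter

resumes = {
    "prof_a": "machine learning neural networks deep learning",
    "prof_b": "databases query optimization indexing",
    "prof_c": "information retrieval ranking learning",
}

def correlation(t1, t2):
    """Co-occurrence correlation: fraction of resumes containing both terms."""
    docs = [set(r.split()) for r in resumes.values()]
    both = sum(1 for d in docs if t1 in d and t2 in d)
    either = sum(1 for d in docs if t1 in d or t2 in d)
    return both / either if either else 0.0

def rank_experts(query):
    q = Counter(query.split())
    # query expansion: add vocabulary terms correlated with any query term
    vocab = {t for r in resumes.values() for t in r.split()}
    for t in vocab:
        w = max(correlation(t, qt) for qt in query.split())
        if w > 0 and t not in q:
            q[t] = w
    scores = {
        name: sum(q[t] for t in set(r.split()) if t in q)
        for name, r in resumes.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

print(rank_experts("learning")[0])
```

    In the paper's model the correlation matrix, VSM similarity, and PageRank scores are combined into one hybrid ranking; this sketch shows only the expansion-then-match skeleton.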

  9. Evaluation of a new model of short-term palliative care for people severely affected with multiple sclerosis: a randomised fast-track trial to test timing of referral and how long the effect is maintained. (United States)

    Higginson, Irene J; Costantini, Massimo; Silber, Eli; Burman, Rachel; Edmonds, Polly


    In this randomised fast-track phase II trial, the authors examined (1) whether the timing of referral to short-term palliative care (PC) affected selected outcomes, and (2) the potential sustained effect of the short-term PC intervention (whether the effects were maintained over time after PC was withdrawn). PC comprised a multiprofessional PC team that provided, on average, three visits, with all care completed by 6 weeks. Recruitment commenced in August 2004 and continued for 1 year. Follow-up was performed for 6 months in both groups. Outcomes were a composite measure of five key symptoms (pain, nausea, vomiting, mouth problems and sleeping difficulty) using the Palliative care Outcome Scale-MS Symptom Scale, and caregiver burden was measured using the Zarit (Caregiver) Burden Interview (ZBI). 52 patients severely affected by multiple sclerosis were randomised to receive PC either immediately (fast-track group) or after 12 weeks (control group). Patients had a high level of disability (mean Expanded Disability Status Scale: 7.7; median: 8; SD: 1). Following PC, there was an improvement in Palliative care Outcome Scale-MS Symptom Scale score and ZBI score. A higher rate of improvement in ZBI score was seen in the fast-track group. After withdrawal of PC, effects were maintained at 12 weeks, but not at 24 weeks. Receiving PC earlier had a similar effect on reducing symptoms but a greater effect on reducing caregiver burden, compared to later referral. In this phase II trial, the authors lacked the power to detect small differences. The effect of PC is maintained for 6 weeks after withdrawal but then appears to wane.

  10. Short-term solar flare prediction using multi-model integration method (United States)

    Liu, Jin-Fu; Li, Fei; Wan, Jie; Yu, Da-Ren


    A multi-model integration method is proposed to develop a multi-source and heterogeneous model for short-term solar flare prediction. Different prediction models are constructed on the basis of predictors extracted from a pool of observation databases. The outputs of the base models are normalized first, because these established models extract predictors from many data resources using different prediction methods. Then a weighted integration of the base models is used to develop a multi-model integrated model (MIM). The set of weights assigned to the single models is optimized by a genetic algorithm. Seven base models and data from Solar and Heliospheric Observatory/Michelson Doppler Imager longitudinal magnetograms are used to construct the MIM, and its performance is then evaluated by cross validation. Experimental results showed that the MIM outperforms any individual model in nearly every data group, and that the richer the diversity of the base models, the better the performance of the MIM. Thus, integrating more diversified models, such as an expert system, a statistical model and a physical model, will greatly improve the performance of the MIM.
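
    The weighted-integration step can be sketched with a tiny genetic algorithm; the base-model outputs below are random stand-ins rather than real solar-flare predictors, and the GA operators are deliberately minimal:

```python
# Sketch of weighted multi-model integration with weights tuned by a tiny
# genetic algorithm. Base-model outputs are random stand-ins, not real
# solar-flare predictors.

import random
random.seed(0)

# normalized outputs of 3 base models on 20 events, plus true labels
truth = [random.random() < 0.4 for _ in range(20)]
models = [[0.7 * t + 0.3 * random.random() for t in truth] for _ in range(3)]

def fitness(w):
    """Negative squared error of the weight-normalized ensemble."""
    s = sum(w)
    wn = [x / s for x in w]
    preds = [sum(wk * m[i] for wk, m in zip(wn, models)) for i in range(len(truth))]
    return -sum((p - t) ** 2 for p, t in zip(preds, truth))

def ga(pop_size=20, gens=30):
    pop = [[random.random() for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                      # elitist selection
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = random.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover
            k = random.randrange(3)                       # point mutation
            child[k] = min(1.0, max(1e-6, child[k] + random.uniform(-0.1, 0.1)))
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = ga()
print([round(x, 3) for x in best])
```

    The real MIM optimizes weights over seven heterogeneous base models against cross-validated skill scores; the structure (normalize, weight, optimize by GA) is the same.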

  11. Short-term nitrogen dioxide modeling: currently available models and the applications and development needed for energy assessment

    Energy Technology Data Exchange (ETDEWEB)

    Chun, K.C.


    The rapid increase in US coal consumption projected for the near future is likely to result in, among other things, increased concentrations of nitrogen dioxide and other nitrogenous pollutants in the ambient atmosphere. The design and siting of new coal utilization facilities could be constrained by the potential promulgation of a short-term ambient air quality standard for nitrogen dioxide and by other regulations. To assess the extent and pattern of such constraints, appropriate air quality models for short-term nitrogen dioxide levels are needed. As an initial step in developing models capable of treating emissions from coal-burning electric utility and industrial point sources, this report: (1) discusses the multiple interdependent factors that affect local short-term concentrations of nitrogen dioxide, such as meteorology, air quality, and the characteristics and distribution of emission sources of nitrogen dioxide precursors; (2) evaluates the utility and limitations of existing air quality models for nitrogen dioxide, including empirical, mechanistic and empirico-mechanistic models; and (3) suggests an approach for applying and developing relatively simple models for predicting short-term concentrations of nitrogen dioxide in assessments of regional and national energy development.

  12. Short-Term Wind Speed Prediction Using EEMD-LSSVM Model

    Directory of Open Access Journals (Sweden)

    Aiqing Kang


    Full Text Available A hybrid Ensemble Empirical Mode Decomposition (EEMD) and Least Square Support Vector Machine (LSSVM) model is proposed to improve short-term wind speed forecasting precision. The EEMD is first utilized to decompose the original wind speed time series into a set of subseries. Then LSSVM models are established to forecast these subseries. The partial autocorrelation function is adopted to analyze the inner relationships within the historical wind speed series in order to determine the input variables of the LSSVM models for prediction of each subseries. Finally, the superposition principle is employed to sum the predicted values of the subseries as the final wind speed prediction. The performance of the hybrid model is evaluated based on six metrics. Compared with LSSVM, Back Propagation Neural Networks (BP), Auto-Regressive Integrated Moving Average (ARIMA), the combination of Empirical Mode Decomposition (EMD) with LSSVM, and hybrid EEMD with ARIMA models, the wind speed forecasting results show that the proposed hybrid model outperforms these models in terms of the six metrics. Furthermore, scatter diagrams of predicted versus actual wind speed and histograms of prediction errors are presented to verify the superiority of the hybrid model in short-term wind speed prediction.
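
    The decompose-forecast-superpose structure can be sketched with stand-ins: a moving average in place of EEMD and a lag-1 least-squares fit in place of LSSVM. All numbers are illustrative:

```python
# Structural sketch of the decompose-forecast-superpose idea: the series is
# split into subseries, each is forecast separately, and the forecasts are
# summed. A moving average stands in for EEMD and a lag-1 regression for
# LSSVM; the wind speeds are made up.

def moving_average(x, w=3):
    out = []
    for i in range(len(x)):
        window = x[max(0, i - w + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

def lag1_forecast(x):
    """One-step forecast from a least-squares fit of x[t] on x[t-1]."""
    xs, ys = x[:-1], x[1:]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = sum((a - mx) ** 2 for a in xs) or 1.0
    return my + (num / den) * (x[-1] - mx)

wind = [5.1, 5.8, 6.2, 5.9, 6.5, 7.1, 6.8, 7.4, 7.9, 7.6]   # m/s, made up
trend = moving_average(wind)
residual = [a - b for a, b in zip(wind, trend)]

# superposition principle: final forecast = sum of the subseries forecasts
forecast = lag1_forecast(trend) + lag1_forecast(residual)
print(round(forecast, 2))
```

    EEMD replaces the two-way trend/residual split with a data-adaptive set of intrinsic mode functions, and LSSVM replaces the lag-1 fit with a kernel regression whose inputs are chosen via the partial autocorrelation function.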

  13. Evaluation of the long-term energy analysis program used for the 1978 EIA Administrator's Report to Congress

    Energy Technology Data Exchange (ETDEWEB)

    Peelle, R. W.; Weisbin, C. R.; Alsmiller, Jr., R. G.


    An evaluation of the Long-Term Energy Analysis Program (LEAP), a computer model of the energy portion of the US economy that was used for the 1995-2020 projections in its 1978 Annual Report to Congress, is presented. An overview of the 1978 version, LEAP Model 22C, is followed by an analysis of the important results needed by its users. The model is then evaluated on the basis of: (1) the adequacy of its documentation; (2) the local experience in operating the model; (3) the adequacy of the numerical techniques used; (4) the soundness of the economic and technical foundations of the model equations; and (5) the degree to which the computer program has been verified. To show which parameters strongly influence the results and to approach the question of whether the model can project important results with sufficient accuracy to support qualitative conclusions, the numerical sensitivities of some important results to model input parameters are described. The input data are categorized and discussed, and uncertainties are given for some parameters as examples. From this background and from the relation of LEAP to other available approaches for long-term energy modeling, an overall evaluation is given of the model's suitability for use by the EIA.

  14. Evaluation of green house gas emissions models. (United States)


    The objective of the project is to evaluate the GHG emissions models used by transportation agencies and industry leaders. Factors in the vehicle operating environment that may affect modal emissions, such as external conditions, vehicle fleet c...

  15. Modeling procedure and surgical times for current procedural terminology-anesthesia-surgeon combinations and evaluation in terms of case-duration prediction and operating room efficiency: a multicenter study. (United States)

    Stepaniak, Pieter S; Heij, Christiaan; Mannaerts, Guido H H; de Quelerij, Marcel; de Vries, Guus


    Gains in operating room (OR) scheduling may be obtained by using accurate statistical models to predict surgical and procedure times. The 3 main contributions of this article are the following: (i) the validation of Strum's results on the statistical distribution of case durations, including surgeon effects, using OR databases of 2 European hospitals, (ii) the use of expert prior expectations to predict durations of rarely observed cases, and (iii) the application of the proposed methods to predict case durations, with an analysis of the resulting increase in OR efficiency. We retrospectively reviewed all recorded surgical cases of 2 large European teaching hospitals from 2005 to 2008, involving 85,312 cases and 92,099 h in total. Surgical times tended to be skewed and bounded by some minimally required time. We compared the fit of the normal distribution with that of 2- and 3-parameter lognormal distributions for case durations of a range of Current Procedural Terminology (CPT)-anesthesia combinations, including possible surgeon effects. For cases with very few observations, we investigated whether supplementing the data information with surgeons' prior guesses helps to obtain better duration estimates. Finally, we used best fitting duration distributions to simulate the potential efficiency gains in OR scheduling. The 3-parameter lognormal distribution provides the best results for the case durations of CPT-anesthesia (surgeon) combinations, with an acceptable fit for almost 90% of the CPTs when segmented by the factor surgeon. The fit is best for surgical times and somewhat less for total procedure times. Surgeons' prior guesses are helpful for OR management to improve duration estimates of CPTs with very few (<10) observations. 
    Compared with the standard way of case scheduling, using the mean of the 3-parameter lognormal distribution reduces the mean overreserved OR time per case by up to 11.9 (11.8-12.0) min (55.6%) and the mean underreserved
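
    The 3-parameter lognormal idea, a location parameter for the minimally required time plus a lognormal for the remainder, can be sketched on synthetic durations; the simple minimum-based location estimate is an illustrative assumption, not the authors' fitting procedure:

```python
# Sketch of fitting a 3-parameter (shifted) lognormal to case durations:
# a location parameter models the minimally required time, and
# log(x - loc) is treated as normal. Durations are synthetic, not
# hospital data, and the estimator is deliberately naive.

import math, random
random.seed(1)

loc_true, mu, sigma = 20.0, 3.0, 0.4      # minutes; hypothetical
durations = [loc_true + random.lognormvariate(mu, sigma) for _ in range(2000)]

# naive estimator: put loc just below the sample minimum, then fit the
# normal parameters of log(x - loc)
loc_hat = min(durations) - 0.5
logs = [math.log(x - loc_hat) for x in durations]
mu_hat = sum(logs) / len(logs)
sigma_hat = math.sqrt(sum((v - mu_hat) ** 2 for v in logs) / len(logs))

# scheduling estimate: mean of the fitted shifted lognormal
mean_hat = loc_hat + math.exp(mu_hat + sigma_hat ** 2 / 2)
print(round(loc_hat, 1), round(mu_hat, 2), round(sigma_hat, 2))
```

    Because the minimum-based location estimate is biased, production use would rely on maximum-likelihood or quantile-based fitting; the point of the sketch is only the role of the third (shift) parameter.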

  16. Inclusion of Relevance Information in the Term Discrimination Model. (United States)

    Biru, Tesfaye; And Others


    Discusses the effect of including relevance data on the calculation of term discrimination values in bibliographic databases. Algorithms that calculate the ability of index terms to discriminate between relevant and non-relevant documents are described and tested. The results are discussed in terms of the relationship between term frequency and…
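
    A term discrimination value can be illustrated on a toy corpus as the change in document-space density when a term is removed from the index; the three documents and weights are invented, and the relevance information that is the article's focus is omitted:

```python
# Toy computation of a term discrimination value: the change in average
# document-centroid similarity when a term is removed from the index.
# The three-document corpus is illustrative only.

def cosine(a, b):
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def density(docs):
    """Average similarity of each document to the collection centroid."""
    centroid = {}
    for d in docs:
        for t, v in d.items():
            centroid[t] = centroid.get(t, 0) + v / len(docs)
    return sum(cosine(d, centroid) for d in docs) / len(docs)

docs = [{"cat": 2, "pet": 1}, {"dog": 2, "pet": 1}, {"fish": 2, "pet": 1}]

def discrimination_value(term):
    without = [{t: v for t, v in d.items() if t != term} for d in docs]
    return density(without) - density(docs)

# "pet" occurs in every document, so it is a poor discriminator: removing
# it spreads the space apart (density drops), giving a negative value.
# The rarer term "cat" scores higher.
print(discrimination_value("pet"), discrimination_value("cat"))
```

    The article's contribution is to compute such values against relevant versus non-relevant document sets rather than the whole collection; that split is not reproduced here.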

  17. Ecological interpretation of short-term toxicity results: Development of a population model for Arbacia

    Energy Technology Data Exchange (ETDEWEB)

    Munns, W.R. Jr.; Nacci, D.E. [SAIC, Narragansett, RI (United States); Walker, H.A. [Environmental Protection Agency, Narragansett, RI (United States); Johnston, R.K. [NCCOSC, Narragansett, RI (United States). RDTE Division


    The Arbacia punctulata fertilization and larval development tests are used extensively in regulatory and research programs to evaluate toxicity associated with contaminants in aqueous media. These short-term assays are inexpensive, easy to use, and provide information regarding the effects of environmental contaminants on critical life history stages of the sea urchin. Despite substantial consideration of the precision of assay methods, and a clear understanding of the statistical significance of treatment differences, an appreciation of the ecological significance of treatment effects is lacking. To address this problem, a stage classified population projection model was developed to relate short-term test endpoints to potential effects at the population level. The model was applied to evaluate population-level effects using short-term toxicity data obtained in an estuarine ecological risk assessment conducted for Portsmouth Naval Shipyard, Kittery, Maine. The model also was used to examine which test endpoints provide useful information relative to population growth dynamics. Population modeling approaches can be extremely valuable in extrapolating single species toxicity information to higher level ecological endpoints and for identifying appropriate measurement endpoints during toxicity test development.
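
    The core of a stage-classified projection model is a matrix whose dominant eigenvalue gives the population growth rate; scaling vital rates by toxicity test results then translates short-term endpoints into population-level effects. All vital rates below are hypothetical, not Arbacia data:

```python
# Minimal stage-classified (Lefkovitch-type) projection sketch: a
# toxicity-induced reduction in larval survival scales a matrix entry, and
# the dominant eigenvalue (population growth rate) is found by power
# iteration. All vital rates are hypothetical, not Arbacia data.

def growth_rate(matrix, iters=200):
    """Dominant eigenvalue of a non-negative projection matrix."""
    n = [1.0] * len(matrix)
    lam = 1.0
    for _ in range(iters):
        n = [sum(row[j] * n[j] for j in range(len(n))) for row in matrix]
        lam = sum(n)                     # growth of the total population
        n = [x / lam for x in n]         # renormalize the stage vector
    return lam

# stages: larva, juvenile, adult; entry [i][j] = contribution of stage j
# to stage i over one projection interval
def urchin_matrix(larval_survival):
    return [
        [0.0, 0.0, 100.0],                    # adult fecundity -> larvae
        [0.01 * larval_survival, 0.6, 0.0],   # larva -> juvenile
        [0.0, 0.3, 0.8],                      # juvenile -> adult; adult survival
    ]

baseline = growth_rate(urchin_matrix(1.0))
stressed = growth_rate(urchin_matrix(0.5))   # 50% larval survival in the test
print(round(baseline, 3), round(stressed, 3))
```

    In this framing, a fertilization or larval development endpoint enters as a multiplier on the corresponding matrix entry, and its population-level significance is read off the resulting change in growth rate.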

  18. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)


    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., the amount, timing, and pathway of released radionuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed, including issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP be connected to a fast-running source term prediction code, e.g., MARS, with the possibility of updating source terms based on real-time observations. (Author)
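
    The Bayesian updating at the heart of the BBN approach can be reduced to a toy example: plant observations update a probability distribution over scenarios mapped to pre-calculated source terms. The network, states, and probabilities below are invented for illustration, not RASTEP's actual model:

```python
# Toy sketch of the BBN idea behind RASTEP: an observation updates a
# probability distribution over accident scenarios, each mapped to a
# pre-calculated source term. Scenarios and probabilities are invented.

# prior over accident scenarios (each linked to a pre-calculated source term)
prior = {"intact_containment": 0.7, "filtered_release": 0.2, "bypass": 0.1}

# P(high radiation reading at the stack | scenario) -- hypothetical likelihoods
likelihood_high_reading = {
    "intact_containment": 0.05,
    "filtered_release": 0.4,
    "bypass": 0.9,
}

def posterior(prior, likelihood):
    """Bayes' rule over the discrete scenario set."""
    unnorm = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

post = posterior(prior, likelihood_high_reading)
best = max(post, key=post.get)
print(best, round(post[best], 2))
```

    A real BBN chains many such conditional tables over plant-state nodes; the output, a probability over pre-calculated source terms, has the same form as this two-node toy.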

  19. Modelling the long-term vertical dynamics of salt marshes (United States)

    Zoccarato, Claudia; Teatini, Pietro


    Salt marshes are vulnerable environments hosting complex interactions between physical and biological processes with a strong influence on the dynamics of the marsh evolution. The estimation and prediction of the elevation of a salt-marsh platform is crucial to forecast the marsh growth or regression under different scenarios considering, for example, the potential climate changes. The long-term vertical dynamics of a salt marsh is predicted with the aid of an original finite-element (FE) numerical model accounting for the marsh accretion and compaction and for the variation rates of the relative sea level rise, i.e., land subsidence of the marsh basement and eustatic rise of the sea level. The accretion term considers the vertical sedimentation of organic and inorganic material over the marsh surface, whereas the compaction reflects the progressive consolidation of the porous medium under the increasing load of the overlying younger deposits. The modelling approach is based on a 2D groundwater flow simulator, which provides the pressure evolution within a compacting/accreting vertical cross-section of the marsh assuming that the groundwater flow obeys the relative Darcy's law, coupled to a 1D vertical geomechanical module following Terzaghi's principle of effective intergranular stress. Soil porosity, permeability, and compressibility may vary with the effective intergranular stress according to empirically based relationships. The model also takes into account the geometric non-linearity arising from the consideration of large solid grain movements by using a Lagrangian approach with an adaptive FE mesh. The element geometry changes in time to follow the deposit consolidation and the element number increases in time to follow the sedimentation of new material. 
The numerical model is tested on different realistic configurations considering the influence of (i) the spatial distribution of the sedimentation rate in relation to the distance from the marsh margin, (ii
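
    The coupling of Darcy flow with Terzaghi's effective-stress principle reduces, in the simplest one-dimensional case, to a diffusion equation for the excess pore pressure u(z, t). The explicit finite-difference sketch below (illustrative parameter values, not calibrated to any marsh) shows how the pressure induced by a newly deposited layer dissipates, which in turn drives consolidation:

```python
import numpy as np

# Explicit FD scheme for 1D Terzaghi consolidation: du/dt = cv * d2u/dz2.
def consolidate(u0, cv, dz, dt, steps):
    u = u0.copy()
    r = cv * dt / dz ** 2          # explicit scheme is stable for r <= 0.5
    assert r <= 0.5
    for _ in range(steps):
        u[0] = 0.0                 # drained marsh surface
        u[1:-1] += r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        u[-1] = u[-2]              # impermeable base (no-flow boundary)
    return u

# 1 m column, 21 nodes; 10 kPa of excess pressure from a fresh deposit.
u0 = np.full(21, 10.0)
u = consolidate(u0, cv=1e-7, dz=0.05, dt=1.0e4, steps=500)
# Terzaghi: effective stress = total stress - u, so load is transferred to
# the grain skeleton as u decays, compacting the deposit.
```

    In the record's model this is solved on a moving finite-element mesh, so compaction, accretion of new elements, and stress-dependent porosity and permeability evolve together.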

  20. Evaluation of the Effect of Host Immune Status on Short-Term Yersinia pestis Infection in Fleas With Implications for the Enzootic Host Model for Maintenance of Y. pestis During Interepizootic Periods (United States)



    Plague, a primarily flea-borne disease caused by Yersinia pestis, is characterized by rapidly spreading epizootics separated by periods of quiescence. Little is known about how and where Y. pestis persists between epizootics. It is commonly proposed, however, that Y. pestis is maintained during interepizootic periods in enzootic cycles involving flea vectors and relatively resistant host populations. According to this model, while susceptible individuals serve as infectious sources for feeding fleas and subsequently die of infection, resistant hosts survive infection, develop antibodies to the plague bacterium, and continue to provide bloodmeals to infected fleas. For Y. pestis to persist under this scenario, fleas must remain infected after feeding on hosts carrying antibodies to Y. pestis. Studies of other vector-borne pathogens suggest that host immunity may negatively impact pathogen survival in the vector. Here, we report infection rates and bacterial loads for fleas (both Xenopsylla cheopis (Rothschild) and Oropsylla montana (Baker)) that consumed an infectious bloodmeal and subsequently fed on an immunized or age-matched naive mouse. We demonstrate that neither the proportion of infected fleas nor the bacterial loads in infected fleas were significantly lower within 3 d of feeding on immunized versus naive mice. Our findings thus provide support for one assumption underlying the enzootic host model of interepizootic maintenance of Y. pestis. PMID:25276941

  1. Newfoundland and Labrador: 80/20 staffing model pilot in a long-term care facility. (United States)

    Stuckless, Trudy; Power, Margaret


    This project, based in Newfoundland and Labrador's Central Regional Health Authority, is the first application of an 80/20 staffing model to a long-term care facility in Canada. The model allows nurse participants to spend 20% of their paid time pursuing a professional development activity instead of providing direct patient care. Newfoundland and Labrador has the highest aging demographic in Canada owing, in part, to the out-migration of younger adults. Recruiting and retaining nurses to work in long-term care in the province is difficult; at the same time, the increasing acuity of long-term care residents and their complex care needs mean that nurses must assume greater leadership roles in these facilities. This project set out to increase capacity for registered nurse (RN) leadership, training and support and to enhance the profile of long-term care as a place to work. Six RNs and one licensed practical nurse (LPN) participated and engaged in a range of professional development activities. Several of the participants are now pursuing further nursing educational activities. Central Health plans to continue a 90/10 model for one RN and one LPN per semester, with the timeframe to be determined. The model will be evaluated and, if it is deemed successful, the feasibility of implementing it in other sites throughout the region will be explored.

  2. Dynamic Hybrid Model for Short-Term Electricity Price Forecasting

    Directory of Open Access Journals (Sweden)

    Marin Cerjan


    Accurate forecasting tools are essential in the operation of electric power systems, especially in deregulated electricity markets. Electricity price forecasting is necessary for all market participants to optimize their portfolios. In this paper we propose a hybrid approach for short-term hourly electricity price forecasting. The paper combines statistical techniques for pre-processing of data and a multi-layer perceptron (MLP) neural network for forecasting electricity price and price spike detection. Based on statistical analysis, days are arranged into several categories. Similar days are examined by correlation significance of the historical data. Factors impacting electricity price forecasting, including historical price factors, load factors and wind production factors, are discussed. A price spike index (CWI) is defined for spike detection and forecasting. Using the proposed approach we created several forecasting models of diverse complexity. The method is validated using European Energy Exchange (EEX) electricity price data records. Finally, results are discussed with respect to price volatility, with emphasis on price forecasting accuracy.
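
    A minimal sketch of the MLP forecasting stage, assuming only lagged prices as inputs (the actual hybrid method adds statistical pre-processing, day categorization, and load and wind inputs; the synthetic series and network size are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly price series with a daily cycle (stand-in for market data).
prices = 50.0 + 10.0 * np.sin(np.arange(300) * 2.0 * np.pi / 24.0) \
         + rng.normal(0.0, 1.0, 300)

lags = 24                                        # previous 24 hours as inputs
X = np.array([prices[i:i + lags] for i in range(len(prices) - lags)])
y = prices[lags:]
mu, sigma = X.mean(), X.std()
Xn, yn = (X - mu) / sigma, (y - mu) / sigma      # normalize inputs and targets

# One hidden layer of 8 tanh units, trained by plain gradient descent on MSE.
W1 = rng.normal(0.0, 0.1, (lags, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.1, 8); b2 = 0.0
lr = 0.01
for _ in range(2000):
    h = np.tanh(Xn @ W1 + b1)                    # hidden activations
    err = h @ W2 + b2 - yn                       # prediction error
    gW2 = h.T @ err / len(yn); gb2 = err.mean()
    gh = np.outer(err, W2) * (1.0 - h ** 2)      # backprop through tanh
    gW1 = Xn.T @ gh / len(yn); gb1 = gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# Next-hour forecast from the most recent 24 hours, mapped back to price units.
forecast = (np.tanh(Xn[-1] @ W1 + b1) @ W2 + b2) * sigma + mu
```

    A real implementation would add a held-out validation set and the separate spike-detection output described above; the sketch only shows the regression core.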

  3. Modelling substorm chorus events in terms of dispersive azimuthal drift

    Directory of Open Access Journals (Sweden)

    A. B. Collier


    The Substorm Chorus Event (SCE) is a radio phenomenon observed on the ground after the onset of the substorm expansion phase. It consists of a band of VLF chorus with rising upper and lower cutoff frequencies. These emissions are thought to result from Doppler-shifted cyclotron resonance between whistler mode waves and energetic electrons which drift into a ground station's field of view from an injection site around midnight. The increasing frequency of the emission envelope has been attributed to the combined effects of energy dispersion due to gradient and curvature drifts, and the modification of resonance conditions and variation of the half-gyrofrequency cutoff resulting from the radial component of the E×B drift. A model is presented which accounts for the observed features of the SCE in terms of the growth rate of whistler mode waves due to anisotropy in the electron distribution. This model provides an explanation for the increasing frequency of the SCE lower cutoff, as well as reproducing the general frequency-time signature of the event. In addition, the results place some restrictions on the injected particle source distribution which might lead to an SCE. Key words. Space plasma physics (wave-particle interaction) – Magnetospheric physics (plasma waves and instabilities; storms and substorms)
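
    The Doppler-shifted cyclotron resonance invoked here is usually written as follows (a standard textbook form, not quoted from the record):

```latex
\omega - k_{\parallel} v_{\parallel} = \frac{\Omega_e}{\gamma}
```

    where \(\omega\) is the wave frequency, \(k_{\parallel}\) and \(v_{\parallel}\) are the field-aligned wavenumber and electron velocity, \(\Omega_e\) is the electron gyrofrequency, and \(\gamma\) is the Lorentz factor. Since whistler-mode waves have \(\omega < \Omega_e\), the condition is met by electrons streaming opposite to the wave (\(k_{\parallel} v_{\parallel} < 0\)), so the Doppler term raises the wave frequency seen by the electron up to its gyrofrequency.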


  5. Analysis and evaluation of collaborative modeling processes

    NARCIS (Netherlands)

    Ssebuggwawo, D.


    Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But, such collaborative efforts, which often consist of the

  6. GATEWAY Report Brief: SSL Demonstration: Long-Term Evaluation of Indoor Field Performance

    Energy Technology Data Exchange (ETDEWEB)

    None, None


    Report brief summarizing a GATEWAY program evaluation of the long-term performance characteristics (chromaticity change, maintained illuminance, and operations and maintenance) of LED lighting systems in four field installations previously documented in separate DOE GATEWAY reports.

  7. An Evaluation of Query Expansion by the Addition of Clustered Terms for a Document Retrieval System (United States)

    Minker, Jack; And Others


    An evaluation of graph theoretical clusters of index terms which can be extracted from an automatically indexed document collection, and the effects of employing such clusters in automatic document retrieval are described. (19 references) (Author)

  8. The Evaluation of Arab Political Leaders’ Speeches during the Arab Spring in Terms of Semantics

    National Research Council Canada - National Science Library

    Ayşe İSPİR


    .... In this study, Arab political leaders’ public-directed statements in these speeches during the Arab Spring, which was sparked by society and in which society played an active role, have been evaluated in terms of semantics...

  9. GATEWAY Demonstrations: Long-Term Evaluation of SSL Field Performance in Select Interior Projects

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Tess E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Davis, Robert G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wilkerson, Andrea M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    The GATEWAY program evaluated the long-term performance characteristics (chromaticity change, maintained illuminance, and operations and maintenance) of LED lighting systems in four field installations previously documented in separate DOE GATEWAY reports.

  10. Animal models of bronchopulmonary dysplasia. The preterm and term rabbit models. (United States)

    D'Angio, Carl T; Ryan, Rita M


    Bronchopulmonary dysplasia (BPD) is an important lung developmental pathophysiology that affects many premature infants each year. Newborn animal models employing both premature and term animals have been used over the years to study various components of BPD. This review describes some of the neonatal rabbit studies that have contributed to the understanding of BPD, including those using term newborn hyperoxia exposure models, premature hyperoxia models, and a term newborn hyperoxia model with recovery in moderate hyperoxia, all designed to emulate aspects of BPD in human infants. Some investigators perturbed these models to include exposure to neonatal infection/inflammation or postnatal malnutrition. The similarities to lung injury in human premature infants include an acute inflammatory response with the production of cytokines, chemokines, and growth factors that have been implicated in human disease, abnormal pulmonary function, disordered lung architecture, and alveolar simplification, development of fibrosis, and abnormal vascular growth factor expression. Neonatal rabbit models have the drawback of limited access to reagents as well as the lack of readily available transgenic models but, unlike smaller rodent models, are able to be manipulated easily and are significantly less expensive than larger animal models. Copyright © 2014 the American Physiological Society.


    Energy Technology Data Exchange (ETDEWEB)

    Joseph H. Hartman


    Great Plains, northern hemisphere, and elsewhere. Finally these data can be integrated into a history of climate change and predictive climate models. This is not a small undertaking. The goals of researchers and the methods used vary considerably. The primary task of this project was literature research to (1) evaluate existing methodologies used in geologic climate change studies and evidence for short-term cycles produced by these methodologies and (2) evaluate late Holocene climate patterns and their interpretations.

  12. Evaluation of constitutive models for crushed salt

    Energy Technology Data Exchange (ETDEWEB)

    Callahan, G.D.; Loken, M.C. [RE/SPEC, Inc., Rapid City, SD (United States); Hurtado, L.D.; Hansen, F.D.


    Three constitutive models are recommended as candidates for describing the deformation of crushed salt. These models are generalized to three-dimensional states of stress to include the effects of mean and deviatoric stress and modified to include effects of temperature, grain size, and moisture content. A database including hydrostatic consolidation and shear consolidation tests conducted on Waste Isolation Pilot Plant (WIPP) and southeastern New Mexico salt is used to determine material parameters for the models. To evaluate the capability of the models, parameter values obtained from fitting the complete database are used to predict the individual tests. Finite element calculations of a WIPP shaft with emplaced crushed salt demonstrate the model predictions.

  13. Long-term strategic asset allocation: An out-of-sample evaluation

    NARCIS (Netherlands)

    Diris, B.F.; Palm, F.C.; Schotman, P.C.

    We evaluate the out-of-sample performance of a long-term investor who follows an optimized dynamic trading strategy. Although the dynamic strategy is able to benefit from predictability out-of-sample, a short-term investor using a single-period market timing strategy would have realized an almost

  14. Evaluation of short-term weather forecasts in South Africa | Banitz ...

    African Journals Online (AJOL)

    In this paper a brief overview is given of the reasons for evaluating short-term weather forecasts, as well as the methodology thereof. Short-term weather forecasts are defined as forecasts valid for the current day and the next day - in other words, up to 48 h ahead. Results are given for South African ...

  15. Long-term evaluation of a social robot in real homes

    NARCIS (Netherlands)

    de Graaf, M.M.A.; Ben Allouch, Soumaya; van Dijk, Johannes A.G.M.


    As the employment of robots for long-term evaluations in home settings is just starting to be robust enough for research purposes, our study aims at contributing to human-robot interaction research by adding longitudinal findings to a limited number of long-term social robotics home studies. We

  16. Evaluation-Function-based Model-free Adaptive Fuzzy Control

    Directory of Open Access Journals (Sweden)

    Agus Naba


    Designs of adaptive fuzzy controllers (AFC) are commonly based on the Lyapunov approach, which requires a known model of the controlled plant, and they need to consider a Lyapunov function candidate as an evaluation function to be minimized. In this study these drawbacks were handled by designing a model-free adaptive fuzzy controller (MFAFC) using an approximate evaluation function defined in terms of the current state, the next state, and the control action. MFAFC considers the approximate evaluation function as an evaluative control performance measure similar to the state-action value function in reinforcement learning. The simulation results of applying MFAFC to the inverted pendulum benchmark verified the proposed scheme’s efficacy.

  17. Modeling for Green Supply Chain Evaluation

    Directory of Open Access Journals (Sweden)

    Elham Falatoonitoosi


    Green supply chain management (GSCM) has become a practical approach to developing environmental performance. Under strict regulations and stakeholder pressures, enterprises need to enhance and improve GSCM practices, which are influenced by both traditional and green factors. This study developed a causal evaluation model to guide selection of qualified suppliers by prioritizing various criteria and mapping causal relationships to find effective criteria for improving the green supply chain. The aim of the case study was to model and examine the influential and important main GSCM practices, namely, green logistics, organizational performance, green organizational activities, environmental protection, and green supplier evaluation. In the case study, the decision-making trial and evaluation laboratory technique is applied to test the developed model. The result of the case study shows that only the “green supplier evaluation” and “green organizational activities” criteria of the model are in the cause group and the other criteria are in the effect group.
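
    The decision-making trial and evaluation laboratory technique mentioned in the abstract (commonly abbreviated DEMATEL) splits criteria into cause and effect groups from a matrix of pairwise influence scores. A minimal sketch with invented scores (not the study's data) that happens to reproduce the reported split:

```python
import numpy as np

criteria = ["green_logistics", "org_performance", "green_activities",
            "env_protection", "supplier_eval"]

# Direct-influence matrix D: D[i, j] = how strongly criterion i influences
# criterion j on a 0-4 scale. These scores are illustrative only.
D = np.array([
    [0, 1, 1, 2, 1],
    [1, 0, 2, 2, 1],
    [3, 3, 0, 3, 2],
    [1, 2, 1, 0, 1],
    [2, 3, 2, 2, 0],
], dtype=float)

N = D / D.sum(axis=1).max()                  # normalize by the largest row sum
T = N @ np.linalg.inv(np.eye(len(D)) - N)    # total-relation matrix: N + N^2 + ...
R, C = T.sum(axis=1), T.sum(axis=0)          # influence given vs. received
cause_group = [c for c, d in zip(criteria, R - C) if d > 0]   # net influencers
```

    Criteria with R − C > 0 are net drivers (the cause group); the rest form the effect group.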

  18. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia


    Taking into account the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended considering the enterprise’s needs when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  19. Considering extraction constraints in long-term oil price modelling

    Energy Technology Data Exchange (ETDEWEB)

    Rehrl, Tobias; Friedrich, Rainer; Voss, Alfred


    Apart from divergence about the remaining global oil resources, the peak oil discussion can be reduced to a dispute about the time rate at which these resources can be supplied. On the one hand, it is problematic to project oil supply trends without taking both prices and supply costs explicitly into account. On the other hand, supply cost estimates are themselves heavily dependent on the underlying extraction rates and are actually only valid within a certain business-as-usual extraction rate scenario (which is itself what has to be determined). In fact, even after enhanced recovery technologies have been applied, the rate at which an oil field can be exploited is quite restricted. Above a certain level, an additional increase in the extraction rate can be achieved only at much higher marginal cost and at the risk of losses in the overall recoverable amount of the oil reservoir. This inflexibility in extraction can be overcome in principle by access to new oil fields, which indicates why the discovery trend may roughly form the long-term oil production curve, at least for price-taking suppliers. The long-term oil discovery trend itself can be described as a logistic process with the two opposed effects of learning and depletion. This leads to the well-known Hubbert curve. Several attempts have been made to incorporate economic variables econometrically into the Hubbert model. With this work we follow a somewhat inverse approach and integrate Hubbert curves into our Long-term Oil Price and EXtraction model LOPEX. In LOPEX we assume that non-OPEC oil production - as long as the oil can be profitably discovered and extracted - is restricted to follow self-regulative discovery trends described by Hubbert curves. Non-OPEC production in LOPEX therefore consists of those Hubbert cycles that are profitable, depending on supply cost and price. Endogenous and exogenous technical progress is additionally integrated in different ways. LOPEX determines extraction and price
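
    A single Hubbert cycle is the time derivative of a logistic cumulative-production curve; a short sketch with illustrative parameters (not LOPEX values):

```python
import math

def hubbert_rate(t, urr, peak_year, width):
    """Annual production of one Hubbert cycle: derivative of the logistic
    cumulative curve Q(t) = urr / (1 + exp(-(t - peak_year) / width))."""
    x = math.exp(-(t - peak_year) / width)
    return (urr / width) * x / (1.0 + x) ** 2

# Illustrative cycle: 2000 units ultimately recoverable, peaking in 2010.
total = sum(hubbert_rate(y, urr=2000.0, peak_year=2010, width=15.0)
            for y in range(1900, 2200))
```

    Summed over the whole cycle, annual production recovers the ultimately recoverable resource (URR), which is why the discovery trend constrains the long-run production profile; in the paper's framing, only the cycles that are profitable at the prevailing price contribute to non-OPEC supply.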

  20. Variable Renewable Energy in Long-Term Planning Models: A Multi-Model Perspective

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Frew, Bethany [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sun, Yinong [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bistline, John [Electric Power Research Inst. (EPRI), Knoxville, TN (United States); Blanford, Geoffrey [Electric Power Research Inst. (EPRI), Knoxville, TN (United States); Young, David [Electric Power Research Inst. (EPRI), Knoxville, TN (United States); Marcy, Cara [U.S. Energy Information Administration, Washington, DC (United States); Namovicz, Chris [U.S. Energy Information Administration, Washington, DC (United States); Edelman, Risa [US Environmental Protection Agency (EPA), Washington, DC (United States); Meroney, Bill [US Environmental Protection Agency (EPA), Washington, DC (United States); Sims, Ryan [US Environmental Protection Agency (EPA), Washington, DC (United States); Stenhouse, Jeb [US Environmental Protection Agency (EPA), Washington, DC (United States); Donohoo-Vallett, Paul [Dept. of Energy (DOE), Washington DC (United States)


    Long-term capacity expansion models of the U.S. electricity sector have long been used to inform electric sector stakeholders and decision-makers. With the recent surge in variable renewable energy (VRE) generators — primarily wind and solar photovoltaics — the need to appropriately represent VRE generators in these long-term models has increased. VRE generators are especially difficult to represent for a variety of reasons, including their variability, uncertainty, and spatial diversity. This report summarizes the analyses and model experiments that were conducted as part of two workshops on modeling VRE for national-scale capacity expansion models. It discusses the various methods for treating VRE among four modeling teams from the Electric Power Research Institute (EPRI), the U.S. Energy Information Administration (EIA), the U.S. Environmental Protection Agency (EPA), and the National Renewable Energy Laboratory (NREL). The report reviews the findings from the two workshops and emphasizes the areas where there is still need for additional research and development on analysis tools to incorporate VRE into long-term planning and decision-making. This research is intended to inform the energy modeling community on the modeling of variable renewable resources, and is not intended to advocate for or against any particular energy technologies, resources, or policies.

  1. A framework for evaluating forest landscape model predictions using empirical data and knowledge (United States)

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson; William D. Dijak; Qia. Wang


    Evaluation of forest landscape model (FLM) predictions is indispensable to establish the credibility of predictions. We present a framework that evaluates short- and long-term FLM predictions at site and landscape scales. Site-scale evaluation is conducted through comparing raster cell-level predictions with inventory plot data whereas landscape-scale evaluation is...


    Directory of Open Access Journals (Sweden)

    Altair Borgert


    This study’s purpose was to build a cost evaluation model with a view to providing managers and decision makers with information to support the resolution process. From a strategic positioning standpoint, the pondering of variables involved in a cost system is key to corporate success. To this extent, overall consideration was given to contemporary cost approaches – the Theory of Constraints, Balanced Scorecard and Strategic Cost Management – and cost evaluation was analysed. It is understood that this is a relevant factor and that it ought to be taken into account when taking corporate decisions. Furthermore, considering that the MCDA methodology is recommended for the construction of cost evaluation models, some of its aspects were emphasised. Finally, the construction of the model itself complements this study. At this stage, cost variables for the three approaches were compiled. Thus, a repository of several variables was created, and its use and combination is subject to the interests and needs of those responsible for its structuring within corporations. In so proceeding, the number of variables to ponder follows the complexity of the issue and of the required solution. Once meetings were held with the study groups, the model was built, revised and reconstructed until consensus was reached. Thereafter, the conclusion was that a cost evaluation model, when built according to the characteristics and needs of each organization, might become the groundwork ensuring accounting becomes increasingly useful at companies. Keywords: Cost evaluation. Cost measurement. Strategy.

  3. Long-term surface temperature modeling of Pluto (United States)

    Earle, Alissa M.; Binzel, Richard P.; Young, Leslie A.; Stern, S. A.; Ennico, K.; Grundy, W.; Olkin, C. B.; Weaver, H. A.; New Horizons Geology and Geophysics Imaging Team


    NASA's New Horizons' reconnaissance of the Pluto system has revealed at high resolution the striking albedo contrasts from polar to equatorial latitudes on Pluto, as well as the sharpness of boundaries for longitudinal variations. These contrasts suggest that Pluto must undergo dynamic evolution that drives the redistribution of volatiles. Using the New Horizons results as a template, we explore the surface temperature variations driven seasonally on Pluto considering multiple timescales. These timescales include the current orbit (248 years) as well as the timescales for obliquity precession (peak-to-peak amplitude of 23° over 3 million years) and regression of the orbital longitude of perihelion (3.7 million years). These orbital variations create epochs of "Extreme Seasons" where one pole receives a short, relatively warm summer and long winter, while the other receives a much longer, but less intense summer and short winter. We use thermal modeling to build upon the long-term insolation history model described by Earle and Binzel (2015) and investigate how these seasons couple with Pluto's albedo contrasts to create temperature effects. From this study we find that a bright region at the equator, once established, can become a site for net deposition. We see the region informally known as Sputnik Planitia as an example of this, and find it will be able to perpetuate itself as an "always available" cold trap, thus having the potential to survive on million-year or substantially longer timescales. Meanwhile darker, low-albedo regions near the equator will remain relatively warm and generally not attract volatile deposition. We argue that the equatorial region is a "preservation zone" for whatever albedo is seeded there. This offers insight as to why the equatorial band of Pluto displays the planet's greatest albedo contrasts.

  4. Saphire models and software for ASP evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sattison, M.B. [Idaho National Engineering Lab., Idaho Falls, ID (United States)


    The Idaho National Engineering Laboratory (INEL) over the past three years has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.

  5. Evaluation of Model Recognition for Grammar-Based Automatic 3d Building Model Reconstruction (United States)

    Yu, Qian; Helmholz, Petra; Belton, David


    In recent years, 3D city models have been in high demand by many public and private organisations, and demand is steadily growing in both quality and quantity. The quality evaluation of these 3D models is a relevant issue from both the scientific and practical points of view. In this paper, we present a method for the quality evaluation of 3D building models which are reconstructed automatically from terrestrial laser scanning (TLS) data based on an attributed building grammar. The entire evaluation process has been performed in all three dimensions in terms of completeness and correctness of the reconstruction. Six quality measures are introduced and applied to four datasets of reconstructed building models in order to describe the quality of the automatic reconstruction, and they are also assessed for their validity from the evaluation point of view.
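
    Completeness and correctness reduce to recall and precision of reconstructed elements against a reference. A minimal sketch with the models represented as sets of occupied voxels (a deliberate simplification; the paper's six measures are richer):

```python
def completeness(reconstructed, reference):
    """Fraction of the reference captured by the reconstruction (recall)."""
    return len(reconstructed & reference) / len(reference)

def correctness(reconstructed, reference):
    """Fraction of the reconstruction that matches the reference (precision)."""
    return len(reconstructed & reference) / len(reconstructed)

# Toy reference building vs. a reconstruction that misses one wall slice
# and contains one spurious voxel.
reference = {(x, y, z) for x in range(10) for y in range(10) for z in range(3)}
reconstructed = ({(x, y, z) for x in range(9) for y in range(10) for z in range(3)}
                 | {(20, 0, 0)})
```

    Reporting both numbers matters: a reconstruction can be highly complete yet incorrect (hallucinated geometry), or highly correct yet incomplete (missing parts).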

  6. Model evaluation using grouped or individual data. (United States)

    Cohen, Andrew L; Sanborn, Adam N; Shiffrin, Richard M


    Analyzing the data of individuals has several advantages over analyzing the data combined across the individuals (the latter we term group analysis): Grouping can distort the form of data, and different individuals might perform the task using different processes and parameters. These factors notwithstanding, we demonstrate conditions in which group analysis outperforms individual analysis. Such conditions include those in which there are relatively few trials per subject per condition, a situation that sometimes introduces distortions and biases when models are fit and parameters are estimated. We employed a simulation technique in which data were generated from each of two known models, each with parameter variation across simulated individuals. We examined how well the generating model and its competitor each fared in fitting (both sets of) the data, using both individual and group analysis. We examined the accuracy of model selection (the probability that the correct model would be selected by the analysis method). Trials per condition and individuals per experiment were varied systematically. Three pairs of cognitive models were compared: exponential versus power models of forgetting, generalized context versus prototype models of categorization, and the fuzzy logical model of perception versus the linear integration model of information integration. We show that there are situations in which small numbers of trials per condition cause group analysis to outperform individual analysis. Additional tables and figures may be downloaded from the Psychonomic Society Archive of Norms, Stimuli, and Data,
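
    The simulation logic can be sketched as follows: generate forgetting curves from a known model with parameter variation across simulated subjects, then compare candidate models' fits to the averaged (group) data. The models, parameter ranges, and grid-search fit below are illustrative choices, not the paper's exact procedure:

```python
import math
import random

random.seed(1)
lags = [1, 2, 4, 8, 16]                      # retention intervals

def exp_model(t, a, b):                      # exponential forgetting
    return a * math.exp(-b * t)

def pow_model(t, a, b):                      # power-law forgetting
    return a * (t + 1.0) ** (-b)

def best_sse(model, data, grid):
    """Crude grid-search fit: smallest summed squared error over (a, b)."""
    return min(sum((model(t, a, b) - y) ** 2 for t, y in zip(lags, data))
               for a in grid for b in grid)

grid = [i / 20.0 for i in range(1, 21)]      # parameter grid 0.05 .. 1.00

# Data truly come from the exponential model, with per-subject rate variation.
subjects = []
for _ in range(20):
    b = random.uniform(0.2, 0.4)             # this subject's forgetting rate
    subjects.append([exp_model(t, 0.9, b) + random.gauss(0.0, 0.02)
                     for t in lags])

group = [sum(s[i] for s in subjects) / len(subjects) for i in range(len(lags))]
exp_wins = best_sse(exp_model, group, grid) < best_sse(pow_model, group, grid)
```

    Averaging exponential curves with different rates yields a curve that is no longer exactly exponential, which is precisely the grouping distortion described above; extending the sketch to per-subject fits lets one compare model-selection accuracy under group versus individual analysis.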

  7. Modeling energy and development: an evaluation of models and concepts

    NARCIS (Netherlands)

    van Ruijven, B.J.; Urban, F.; Benders, R.J.M.; Moll, H.C.; van der Sluijs, J.P.; de Vries, B.; van Vuuren, D.P.


    Most global energy models are developed by institutes from developed countries, focusing primarily on issues that are important in industrialized countries. Evaluation of the results for Asia of the IPCC/SRES models shows that broad concepts of energy and development, the energy ladder and the

  8. A standard telemental health evaluation model: the time is now. (United States)

    Kramer, Greg M; Shore, Jay H; Mishkind, Matt C; Friedl, Karl E; Poropatich, Ronald K; Gahm, Gregory A


    The telehealth field has advanced historic promises to improve access, cost, and quality of care. However, the extent to which it is delivering on its promises is unclear, as the scientific evidence needed to justify success is still emerging. Many have identified the need to advance the scientific knowledge base to better quantify success. One method for advancing that knowledge base is a standard telemental health evaluation model. Telemental health is defined here as the provision of mental health services using live, interactive video-teleconferencing technology. Evaluation in the telemental health field largely consists of descriptive and small pilot studies, is often defined by the individual goals of the specific programs, and is typically focused on only one outcome. The field should adopt new evaluation methods that consider the co-adaptive interaction between users (patients and providers), healthcare costs and savings, and the rapid evolution in communication technologies. Acceptance of a standard evaluation model will improve perceptions of telemental health as an established field, promote development of a sounder empirical base, promote interagency collaboration, and provide a framework for more multidisciplinary research that integrates measurement of the technology's impact with the overall healthcare context. We suggest that consideration of a standard model is timely given the field's current stage of scientific progress. We broadly recommend some elements that such a standard evaluation model might include for telemental health and suggest a way forward for adopting such a model.

  9. Variable Renewable Energy in Long-Term Planning Models: A Multi-Model Perspective

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Frew, Bethany A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mai, Trieu T. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sun, Yinong [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bistline, John [Electric Power Research Inst., Palo Alto, CA (United States); Blanford, Geoffrey [Electric Power Research Inst., Palo Alto, CA (United States); Young, David [Electric Power Research Inst., Palo Alto, CA (United States); Marcy, Cara [Energy Information Administration, Washington, DC (United States); Namovicz, Chris [Energy Information Administration, Washington, DC (United States); Edelman, Risa [Environmental Protection Agency, Washington, DC (United States); Meroney, Bill [Environmental Protection Agency]; Sims, Ryan [Environmental Protection Agency]; Stenhouse, Jeb [Environmental Protection Agency]; Donohoo-Vallett, Paul [U.S. Department of Energy]


    Long-term capacity expansion models of the U.S. electricity sector have long been used to inform electric sector stakeholders and decision makers. With the recent surge in variable renewable energy (VRE) generators - primarily wind and solar photovoltaics - the need to appropriately represent VRE generators in these long-term models has increased. VRE generators are especially difficult to represent for a variety of reasons, including their variability, uncertainty, and spatial diversity. To assess current best practices, share methods and data, and identify future research needs for VRE representation in capacity expansion models, four capacity expansion modeling teams from the Electric Power Research Institute, the U.S. Energy Information Administration, the U.S. Environmental Protection Agency, and the National Renewable Energy Laboratory conducted two workshops on VRE modeling for national-scale capacity expansion models. The workshops covered a wide range of VRE topics, including transmission and VRE resource data, VRE capacity value, dispatch and operational modeling, distributed generation, and temporal and spatial resolution. The objectives of the workshops were both to better understand these topics and to improve the representation of VRE across the suite of models. Given these goals, each team incorporated model updates and performed additional analyses between the first and second workshops. This report summarizes the analyses and model 'experiments' that were conducted as part of these workshops as well as the various methods for treating VRE among the four modeling teams. The report also reviews the findings and learnings from the two workshops. We emphasize the areas where there is still a need for additional research and development on analysis tools to incorporate VRE into long-term planning and decision-making.

  10. [Evaluation of the Dresden Tympanoplasty Model (DTM)]. (United States)

    Beleites, T; Neudert, M; Lasurashvili, N; Kemper, M; Offergeld, C; Hofmann, G; Zahnert, T


    The training of microsurgical motor skills is essential for surgical education if the interests of the patient are to be safeguarded. In otosurgery, the complex anatomy of the temporal bone and its variations necessitate special training before performing surgery on a patient. We therefore developed and evaluated a simplified middle ear model for acquiring basic microsurgical skills in tympanoplasty. The simplified tympanoplasty model consists of the outer ear canal and a tympanic cavity. A stapes model is placed in projection of the upper posterior tympanic membrane quadrant at the medial wall of the simulated tympanic cavity. To imitate the flexibility of the annular ligament, the stapes is fixed on a soft plastic pad. 41 subjects evaluated the model's anatomical analogy, its comparability to the real surgical situation and the general model properties using a special questionnaire. The tympanoplasty model was rated very highly by all participants. It is a reasonably priced model and a useful tool in microsurgical skills training, closing the gap between theoretical training and real operating conditions. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Long-term functional outcomes and correlation with regional brain connectivity by MRI diffusion tractography metrics in a near-term rabbit model of intrauterine growth restriction. (United States)

    Illa, Miriam; Eixarch, Elisenda; Batalle, Dafnis; Arbat-Plana, Ariadna; Muñoz-Moreno, Emma; Figueras, Francesc; Gratacos, Eduard


    Intrauterine growth restriction (IUGR) affects 5-10% of all newborns and is associated with increased risk of memory, attention and anxiety problems in late childhood and adolescence. The neurostructural correlates of long-term abnormal neurodevelopment associated with IUGR are unknown. Thus, the aim of this study was to provide a comprehensive description of the long-term functional and neurostructural correlates of abnormal neurodevelopment associated with IUGR in a near-term rabbit model (delivered at 30 days of gestation) and evaluate the development of quantitative imaging biomarkers of abnormal neurodevelopment based on diffusion magnetic resonance imaging (MRI) parameters and connectivity. At +70 postnatal days, 10 cases and 11 controls were functionally evaluated with the Open Field Behavioral Test, which evaluates anxiety and attention, and the Object Recognition Task, which evaluates short-term memory and attention. Subsequently, brains were collected, fixed and a high resolution MRI was performed. Differences in diffusion parameters were analyzed by means of voxel-based and connectivity analysis measuring the number of fibers reconstructed within anxiety, attention and short-term memory networks over the total fibers. The results of the neurobehavioral and cognitive assessment showed a significantly higher degree of anxiety, attention and memory problems in cases compared to controls in most of the variables explored. Voxel-based analysis (VBA) revealed significant differences between groups in multiple brain regions mainly in grey matter structures, whereas connectivity analysis demonstrated lower ratios of fibers within the networks in cases, reaching statistical significance only in the left hemisphere for both networks. Finally, VBA and connectivity results were also correlated with functional outcome. 
The rabbit model used reproduced long-term functional impairments and their neurostructural correlates of abnormal neurodevelopment associated with IUGR

  12. Long-term functional outcomes and correlation with regional brain connectivity by MRI diffusion tractography metrics in a near-term rabbit model of intrauterine growth restriction.

    Directory of Open Access Journals (Sweden)

    Miriam Illa

    Full Text Available BACKGROUND: Intrauterine growth restriction (IUGR) affects 5-10% of all newborns and is associated with increased risk of memory, attention and anxiety problems in late childhood and adolescence. The neurostructural correlates of long-term abnormal neurodevelopment associated with IUGR are unknown. Thus, the aim of this study was to provide a comprehensive description of the long-term functional and neurostructural correlates of abnormal neurodevelopment associated with IUGR in a near-term rabbit model (delivered at 30 days of gestation) and evaluate the development of quantitative imaging biomarkers of abnormal neurodevelopment based on diffusion magnetic resonance imaging (MRI) parameters and connectivity. METHODOLOGY: At +70 postnatal days, 10 cases and 11 controls were functionally evaluated with the Open Field Behavioral Test, which evaluates anxiety and attention, and the Object Recognition Task, which evaluates short-term memory and attention. Subsequently, brains were collected, fixed and a high resolution MRI was performed. Differences in diffusion parameters were analyzed by means of voxel-based and connectivity analysis measuring the number of fibers reconstructed within anxiety, attention and short-term memory networks over the total fibers. PRINCIPAL FINDINGS: The results of the neurobehavioral and cognitive assessment showed a significantly higher degree of anxiety, attention and memory problems in cases compared to controls in most of the variables explored. Voxel-based analysis (VBA) revealed significant differences between groups in multiple brain regions mainly in grey matter structures, whereas connectivity analysis demonstrated lower ratios of fibers within the networks in cases, reaching statistical significance only in the left hemisphere for both networks. Finally, VBA and connectivity results were also correlated with functional outcome. CONCLUSIONS: The rabbit model used reproduced long-term functional impairments and their


    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo


    Full Text Available The evaluation model projected by the LOMCE is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation placed at the center of the teaching-learning processes. The planning is poor, since the theory that justifies the model is not developed into coherent proposals; there is an excessive concern with excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.

  14. Challenges in integrating short-term behaviour in a mixed-fishery Management Strategies Evaluation framework: a case study of the North Sea flatfish fishery

    NARCIS (Netherlands)

    Andersen, B.S.; Vermard, Y.; Ulrich, C.; Hutton, T.; Poos, J.J.


    This study presents a fleet-based bioeconomic simulation model applied to the international mixed flatfish fishery in the North Sea. The model uses a Management Strategies Evaluation framework, including a discrete choice model that accounts for short-term temporal changes in effort allocation across fisheries.

  15. Short-Term and Medium-Term Reliability Evaluation for Power Systems With High Penetration of Wind Power

    DEFF Research Database (Denmark)

    Ding, Yi; Singh, Chanan; Goel, Lalit


    The expanding share of the fluctuating and less predictable wind power generation can introduce complexities in power system reliability evaluation and management. This entails a need for the system operator to assess the system status more accurately for securing real-time balancing. The existing reliability evaluation techniques for power systems are well developed. These techniques are more focused on steady-state (time-independent) reliability evaluation and have been successfully applied in power system planning and expansion. In the operational phase, however, they may be too rough an approximation of the time-varying behavior of power systems with high penetration of wind power. This paper proposes a time-varying reliability assessment technique. Time-varying reliability models for wind farms, conventional generating units, and rapid start-up generating units are developed and represented …

  16. A model of late long-term potentiation simulates aspects of memory maintenance.

    Directory of Open Access Journals (Sweden)

    Paul Smolen

    Full Text Available Late long-term potentiation (L-LTP) denotes long-lasting strengthening of synapses between neurons. L-LTP appears essential for the formation of long-term memory, with memories at least partly encoded by patterns of strengthened synapses. How memories are preserved for months or years, despite molecular turnover, is not well understood. Ongoing recurrent neuronal activity, during memory recall or during sleep, has been hypothesized to preferentially potentiate strong synapses, preserving memories. This hypothesis has not been evaluated in the context of a mathematical model representing ongoing activity and biochemical pathways important for L-LTP. In this study, ongoing activity was incorporated into two such models - a reduced model that represents some of the essential biochemical processes, and a more detailed published model. The reduced model represents synaptic tagging and gene induction simply and intuitively, and the detailed model adds activation of essential kinases by Ca(2+). Ongoing activity was modeled as continual brief elevations of Ca(2+). In each model, two stable states of synaptic strength/weight resulted. Positive feedback between synaptic weight and the amplitude of ongoing Ca(2+) transients underlies this bistability. A tetanic or theta-burst stimulus switches a model synapse from a low basal weight to a high weight that is stabilized by ongoing activity. Bistability was robust to parameter variations in both models. Simulations illustrated that prolonged periods of decreased activity reset synaptic strengths to low values, suggesting a plausible forgetting mechanism. However, episodic activity with shorter inactive intervals maintained strong synapses. Both models support experimental predictions. Tests of these predictions are expected to further understanding of how neuronal activity is coupled to maintenance of synaptic strength. 
Further investigations that examine the dynamics of activity and synaptic maintenance can be
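
The bistability mechanism described above can be sketched with a toy weight equation in which the Ca2+ transient amplitude grows with synaptic weight and feeds a steep (Hill-type) potentiation term. The functional forms and constants below are invented for illustration and are not the paper's models:

```python
def ca_amplitude(w):
    # assumption: ongoing-activity Ca2+ transient amplitude grows with weight
    return 0.1 + 0.9 * w

def dwdt(w, activity=1.0):
    ca = activity * ca_amplitude(w)
    # steep (Hill-type) Ca2+ dependence of the potentiation drive
    drive = ca ** 4 / (ca ** 4 + 0.5 ** 4)
    return -0.1 * w + 0.12 * drive  # passive decay vs activity-driven growth

def settle(w0, activity=1.0, dt=0.01, steps=100_000):
    """Euler-integrate the weight until it settles at a fixed point."""
    w = w0
    for _ in range(steps):
        w += dt * dwdt(w, activity)
    return w

low = settle(0.05)               # basal synapse stays weak
high = settle(1.0)               # tetanized synapse stays potentiated
off = settle(1.0, activity=0.3)  # prolonged reduced activity -> forgetting
print(f"low state: {low:.3f}, high state: {high:.3f}, after low activity: {off:.3f}")
```

The same equation thus yields two stable weights under normal activity (the bistability), while a sustained drop in activity collapses the high state, mirroring the forgetting mechanism the abstract suggests.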

  17. Spectral model for long-term computation of thermodynamics and potential evaporation in shallow wetlands (United States)

    de la Fuente, Alberto; Meruane, Carolina


    Altiplanic wetlands are unique ecosystems located in the elevated plateaus of Chile, Argentina, Peru, and Bolivia. These ecosystems are under threat due to changes in land use, groundwater extractions, and climate change that will modify the water balance through changes in precipitation and evaporation rates. Long-term prediction of the fate of aquatic ecosystems imposes computational constraints that make finding a solution impossible in some cases. In this article, we present a spectral model for long-term simulations of the thermodynamics of shallow wetlands in the limit case when the water depth tends to zero. This spectral model solves for water and sediment temperature, as well as heat, momentum, and mass exchanged with the atmosphere. The parameters of the model (water depth, thermal properties of the sediments, and surface albedo) and the atmospheric downscaling were calibrated using the MODIS product of the land surface temperature. Moreover, the performance of the daily evaporation rates predicted by the model was evaluated against daily pan evaporation data measured between 1964 and 2012. The spectral model was able to correctly represent both seasonal fluctuation and climatic trends observed in daily evaporation rates. It is concluded that the spectral model presented in this article is a suitable tool for assessing the global climate change effects on shallow wetlands whose thermodynamics is forced by heat exchanges with the atmosphere and modulated by the heat-reservoir role of the sediments.

  18. Critical Evaluation of the EJB Transaction Model


    Silaghi, Raul; Strohmeier, Alfred


    Enterprise JavaBeans is a widely-used technology that aims at supporting distributed component-based applications written in Java. One of the key features of the Enterprise JavaBeans architecture is the support of declarative distributed transactions, without requiring explicit coding. In this paper, after a brief introduction of the concepts and mechanisms related to the EJB Transaction Model, we provide guidelines for their consistent use. We then evaluate the EJB Transaction Model on an Au...

  19. 76 FR 72423 - Bridging the Idea Development Evaluation Assessment and Long-Term Initiative and Total Product... (United States)


    .... FDA-2011-N-0780] Bridging the Idea Development Evaluation Assessment and Long-Term Initiative and... entitled ``Bridging the Idea Development Evaluation Assessment and Long-Term Initiative and Total Product...

  20. Evaluation of potential crushed-salt constitutive models

    Energy Technology Data Exchange (ETDEWEB)

    Callahan, G.D.; Loken, M.C.; Sambeek, L.L. Van; Chen, R.; Pfeifle, T.W.; Nieland, J.D. [RE/SPEC Inc., Rapid City, SD (United States); Hansen, F.D. [Sandia National Labs., Albuquerque, NM (United States). Repository Isolation Systems Dept.


    Constitutive models describing the deformation of crushed salt are presented in this report. Ten constitutive models with potential to describe the phenomenological and micromechanical processes for crushed salt were selected from a literature search. Three of these ten constitutive models, termed the Sjaardema-Krieg, Zeuch, and Spiers models, were adopted as candidate constitutive models. The candidate constitutive models were generalized in a consistent manner to three-dimensional states of stress and modified to include the effects of temperature, grain size, and moisture content. A database including hydrostatic consolidation and shear consolidation tests conducted on Waste Isolation Pilot Plant and southeastern New Mexico salt was used to determine material parameters for the candidate constitutive models. Nonlinear least-squares model fitting to data from the hydrostatic consolidation tests, the shear consolidation tests, and a combination of the shear and hydrostatic tests produces three sets of material parameter values for the candidate models. The change in material parameter values from test group to test group indicates the empirical nature of the models. To evaluate the predictive capability of the candidate models, each parameter value set was used to predict each of the tests in the database. Based on the fitting statistics and the ability of the models to predict the test data, the Spiers model appeared to perform slightly better than the other two candidate models. The work reported here is a first-of-its-kind evaluation of constitutive models for reconsolidation of crushed salt. Questions remain to be answered. Deficiencies in models and databases are identified and recommendations for future work are made. 85 refs.

  1. Model evaluation methodology applicable to environmental assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Shaeffer, D.L.


    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes.
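
Two of the methodological points above, multiplicative chain models with statistically independent lognormal inputs and Latin hypercube sampling, can be illustrated together. The three-factor "dose" chain and its parameters below are hypothetical, and the sketch only verifies the textbook property that a product of independent lognormals is itself lognormal:

```python
import math
import random

random.seed(0)

# Hypothetical three-factor multiplicative chain, dose = f1 * f2 * f3,
# with each factor lognormal; (mu, sigma) are parameters of log(factor).
factors = [(0.0, 0.5), (-1.0, 0.3), (0.5, 0.4)]

def norm_ppf(p):
    """Standard-normal inverse CDF by bisection on math.erf (dependency-free)."""
    lo, hi = -8.0, 8.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def latin_hypercube(n, dims):
    """One stratified draw per equal-probability slice, per dimension."""
    cols = []
    for _ in range(dims):
        col = [(i + random.random()) / n for i in range(n)]
        random.shuffle(col)
        cols.append(col)
    return list(zip(*cols))  # n points in [0, 1)^dims

n = 2000
products = []
for point in latin_hypercube(n, len(factors)):
    prod = 1.0
    for (mu, sigma), u in zip(factors, point):
        prod *= math.exp(mu + sigma * norm_ppf(u))
    products.append(prod)

# Analytically, the product is lognormal with mu = sum of mus and
# sigma^2 = sum of sigma^2s; the LHS estimate recovers the log-mean.
mu_sum = sum(mu for mu, _ in factors)
log_mean = sum(math.log(x) for x in products) / n
print(f"analytic mean of log(dose): {mu_sum:.3f}; LHS estimate: {log_mean:.3f}")
```

Because each dimension is stratified, the LHS estimate of the log-mean converges much faster than plain Monte Carlo with the same sample count.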

  2. Evaluating model assumptions in item response theory

    NARCIS (Netherlands)

    Tijmstra, J.


    This dissertation deals with the evaluation of model assumptions in the context of item response theory. Item response theory, also known as modern test theory, provides a statistical framework for the measurement of psychological constructs that cannot be observed directly, such as intelligence or

  3. Evaluation of Usability Utilizing Markov Models (United States)

    Penedo, Janaina Rodrigues; Diniz, Morganna; Ferreira, Simone Bacellar Leal; Silveira, Denis S.; Capra, Eliane


    Purpose: The purpose of this paper is to analyze the usability of a remote learning system in its initial development phase, using a quantitative usability evaluation method through Markov models. Design/methodology/approach: The paper opted for an exploratory study. The data of interest of the research correspond to the possible accesses of users…
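
A hedged sketch of the kind of Markov-model usability analysis the abstract describes: the navigation states and transition probabilities below are invented for illustration, and the expected number of steps to task completion comes from the standard absorbing-chain equation E = 1 + QE, where Q is the transition matrix restricted to transient states:

```python
# Hypothetical remote-learning task: rows are transient states, columns
# give transition probabilities (last column leads to the absorbing "done").
states = ["home", "course", "lesson"]
P = [
    [0.1, 0.7, 0.1, 0.1],  # from home
    [0.2, 0.1, 0.6, 0.1],  # from course
    [0.1, 0.2, 0.2, 0.5],  # from lesson
]

def expected_steps(P):
    """Solve (I - Q) E = 1 by Gauss-Jordan elimination with partial pivoting,
    where Q is P restricted to the transient columns."""
    n = len(P)
    A = [[(1.0 if i == j else 0.0) - P[i][j] for j in range(n)] + [1.0]
         for i in range(n)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(n):
            if r != col and A[r][col]:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][n] / A[i][i] for i in range(n)]

E = expected_steps(P)
for s, e in zip(states, E):
    print(f"expected steps from {s!r} to completion: {e:.2f}")
```

States with high expected step counts flag where users get stuck, which is the quantitative signal such an evaluation extracts from access logs.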

  4. Evaluating Performances of Traffic Noise Models | Oyedepo ...

    African Journals Online (AJOL)

    Traffic noise levels in dB(A) were measured at six locations using a 407780A Integrating Sound Level Meter, while spot speed and traffic volume were collected with a cine-camera. The predicted sound exposure level (SEL) was evaluated using the Burgess, British and FHWA models. The average noise levels obtained are 77.64 ...

  5. Evaluation of HELP model replacement codes

    Energy Technology Data Exchange (ETDEWEB)

    Whiteside, Tad [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hang, Thong [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Flach, Gregory [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)


    This work evaluates the computer codes that are proposed to be used to predict percolation of water through the closure-cap and into the waste containment zone at the Department of Energy closure sites. This work compares the currently used water-balance code (HELP) with newly developed computer codes that use unsaturated flow (Richards’ equation). It provides a literature review of the HELP model and the proposed codes, which result in two recommended codes for further evaluation: HYDRUS-2D3D and VADOSE/W. This further evaluation involved performing actual simulations on a simple model and comparing the results of those simulations to those obtained with the HELP code and the field data. From the results of this work, we conclude that the new codes perform nearly the same, although moving forward, we recommend HYDRUS-2D3D.

  6. Measuring success in obesity prevention: a synthesis of Health Promotion Switzerland's long-term monitoring and evaluation strategy. (United States)

    Ackermann, Günter; Kirschner, Michael; Guggenbühl, Lisa; Abel, Bettina; Klohn, Axel; Mattig, Thomas


    Since 2007, Health Promotion Switzerland has implemented a national priority program for a healthy body weight. This article provides insight into the methodological challenges and results of the program evaluation. Evaluation of the long-term program required targeted monitoring and evaluation projects addressing different outcome levels. The evaluation was carried out according to the Swiss Model for Outcome Classification (SMOC), a model designed to classify the effects of health promotion and prevention efforts. The results presented in this article emphasize both content and methods. The national program successfully achieved outcomes on many different levels within complex societal structures. The evaluation system built around the SMOC enabled assessment of program progress and the development of key indicators. However, it is not possible to determine definitively to what extent the national program helped stabilize the prevalence of obesity in Switzerland. The model has shown its utility in providing a basis for evaluation and monitoring of the national program. Continuous analysis of data from evaluation and monitoring has made it possible to check the plausibility of suspected causal relationships as well as to establish an overall perspective and assessment of effectiveness supported by a growing body of evidence. © 2015 S. Karger GmbH, Freiburg.

  7. Measuring Success in Obesity Prevention: A Synthesis of Health Promotion Switzerland's Long-Term Monitoring and Evaluation Strategy

    Directory of Open Access Journals (Sweden)

    Günter Ackermann


    Full Text Available Aims: Since 2007, Health Promotion Switzerland has implemented a national priority program for a healthy body weight. This article provides insight into the methodological challenges and results of the program evaluation. Methods: Evaluation of the long-term program required targeted monitoring and evaluation projects addressing different outcome levels. The evaluation was carried out according to the Swiss Model for Outcome Classification (SMOC), a model designed to classify the effects of health promotion and prevention efforts. Results: The results presented in this article emphasize both content and methods. The national program successfully achieved outcomes on many different levels within complex societal structures. The evaluation system built around the SMOC enabled assessment of program progress and the development of key indicators. However, it is not possible to determine definitively to what extent the national program helped stabilize the prevalence of obesity in Switzerland. Conclusion: The model has shown its utility in providing a basis for evaluation and monitoring of the national program. Continuous analysis of data from evaluation and monitoring has made it possible to check the plausibility of suspected causal relationships as well as to establish an overall perspective and assessment of effectiveness supported by a growing body of evidence.

  8. Evaluating the TD model of classical conditioning. (United States)

    Ludvig, Elliot A; Sutton, Richard S; Kehoe, E James


    The temporal-difference (TD) algorithm from reinforcement learning provides a simple method for incrementally learning predictions of upcoming events. Applied to classical conditioning, TD models suppose that animals learn a real-time prediction of the unconditioned stimulus (US) on the basis of all available conditioned stimuli (CSs). In the TD model, similar to other error-correction models, learning is driven by prediction errors--the difference between the change in US prediction and the actual US. With the TD model, however, learning occurs continuously from moment to moment and is not artificially constrained to occur in trials. Accordingly, a key feature of any TD model is the assumption about the representation of a CS on a moment-to-moment basis. Here, we evaluate the performance of the TD model with a heretofore unexplored range of classical conditioning tasks. To do so, we consider three stimulus representations that vary in their degree of temporal generalization and evaluate how the representation influences the performance of the TD model on these conditioning tasks.
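
One way to make the representation issue concrete is a toy TD(0) simulation with a complete serial compound (CSC) representation, in which each time step since CS onset is a separate feature with its own weight (one of the representations commonly considered for such models). The constants and trial structure below are assumptions for illustration:

```python
ALPHA, GAMMA = 0.1, 0.98
T = 20        # time steps per trial; CS onset at t = 0
US_TIME = 15  # US delivered at step 15

w = [0.0] * T  # one weight per CSC feature (time step since CS onset)

def run_trial(w):
    for t in range(T - 1):
        v_now = w[t]                                   # prediction at t
        v_next = w[t + 1] if t + 1 < US_TIME else 0.0  # prediction at t + 1
        us = 1.0 if t + 1 == US_TIME else 0.0
        delta = us + GAMMA * v_next - v_now            # moment-by-moment TD error
        w[t] += ALPHA * delta

for _ in range(500):
    run_trial(w)

# After training the prediction ramps up toward the time of the US,
# approaching GAMMA ** (US_TIME - 1 - t) at each pre-US step.
print([round(w[t], 2) for t in range(US_TIME)])
```

Because learning occurs at every moment rather than once per trial, the US prediction propagates backward through the CSC chain across trials, producing the characteristic ramp of anticipation.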

  9. Bayesian model evidence as a model evaluation metric (United States)

    Guthke, Anneli; Höge, Marvin; Nowak, Wolfgang


    When building environmental systems models, we are typically confronted with the questions of how to choose an appropriate model (i.e., which processes to include or neglect) and how to measure its quality. Various metrics have been proposed that shall guide the modeller towards a most robust and realistic representation of the system under study. Criteria for evaluation often address aspects of accuracy (absence of bias) or of precision (absence of unnecessary variance) and need to be combined in a meaningful way in order to address the inherent bias-variance dilemma. We suggest using Bayesian model evidence (BME) as a model evaluation metric that implicitly performs a tradeoff between bias and variance. BME is typically associated with model weights in the context of Bayesian model averaging (BMA). However, it can also be seen as a model evaluation metric in a single-model context or in model comparison. It combines a measure for goodness of fit with a penalty for unjustifiable complexity. Unjustifiable refers to the fact that the appropriate level of model complexity is limited by the amount of information available for calibration. Derived in a Bayesian context, BME naturally accounts for measurement errors in the calibration data as well as for input and parameter uncertainty. BME is therefore perfectly suitable to assess model quality under uncertainty. We will explain in detail and with schematic illustrations what BME measures, i.e. how complexity is defined in the Bayesian setting and how this complexity is balanced with goodness of fit. We will further discuss how BME compares to other model evaluation metrics that address accuracy and precision such as the predictive logscore or other model selection criteria such as the AIC, BIC or KIC. Although computationally more expensive than other metrics or criteria, BME represents an appealing alternative because it provides a global measure of model quality. 
Even if not applicable to each and every case, we aim
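
A rough illustration of what BME measures, using brute-force Monte Carlo integration of the likelihood over the prior (one standard way to estimate it, not necessarily the authors'). The data, the two competing models, and the uniform priors are all synthetic:

```python
import math
import random

random.seed(42)

# Synthetic calibration data: a straight line plus Gaussian noise.
xs = [i / 10 for i in range(21)]
SIGMA = 0.2  # assumed-known measurement error
data = [1.0 + 0.5 * x + random.gauss(0, SIGMA) for x in xs]

def log_likelihood(pred):
    return sum(-0.5 * ((d - p) / SIGMA) ** 2
               - math.log(SIGMA * math.sqrt(2 * math.pi))
               for d, p in zip(data, pred))

def linear(x, th):
    return th[0] + th[1] * x

def cubic(x, th):
    return th[0] + th[1] * x + th[2] * x ** 2 + th[3] * x ** 3

def bme(model, n_params, n_draws=20000):
    """Brute-force BME: average the likelihood over draws from a uniform prior."""
    total = 0.0
    for _ in range(n_draws):
        theta = [random.uniform(-2, 2) for _ in range(n_params)]
        total += math.exp(log_likelihood([model(x, theta) for x in xs]))
    return total / n_draws

bme_lin = bme(linear, 2)
bme_cub = bme(cubic, 4)
# The cubic fits at least as well at its optimum, but averaging over its
# larger prior volume penalizes the unjustified extra parameters.
print(f"BME linear: {bme_lin:.3g}  BME cubic: {bme_cub:.3g}")
```

The simpler model wins not because it fits better at its best parameters, but because a larger fraction of its prior mass yields good fits, which is exactly the built-in bias-variance tradeoff the abstract describes.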

  10. Use of apparent water age at groundwater production wells to evaluate maps of long term average groundwater recharge (United States)

    Selle, B.; Shakked, N.; Rink, K.; Kolditz, O.


    The long-term average groundwater recharge typically varies greatly in space due to differences in climate, soil, landuse, topography and groundwater levels. Long-term average recharge is a key input to steady-state groundwater models. Models of this type are frequently used to determine the long-term safe yield of aquifer systems and to predict the fate of human-derived contaminants in capture zones of groundwater production wells. An accurate quantification of the spatial distribution of recharge is therefore important but difficult to achieve, as it cannot be directly measured at regional scales. Several methods have been applied to indirectly estimate the spatial distribution of long-term average groundwater recharge, including spatially distributed hydrological modelling and the regionalisation of catchment-scale recharge estimated from long-term average baseflows. In the context of a groundwater model, it is difficult to evaluate the reliability of these recharge maps if groundwater levels are the only data set available for model calibration. Concentrations of environmental tracers measured at groundwater wells can be used to interpret the apparent groundwater age, which in turn is significantly influenced by the magnitude and spatial distribution of recharge contributing groundwater to the well. Thus, the apparent groundwater age at wells may be useful to assess the reliability of maps of long-term average groundwater recharge. To test this hypothesis, a steady-state groundwater model was set up for a 180 km2 area in SW Germany, which is extensively used for groundwater production from a confined limestone aquifer. We tested three maps of long-term average recharge that were all computed using different methods. For each recharge map, the steady-state groundwater model was calibrated against water levels and mean groundwater ages were calculated for groundwater production wells from particle tracking methods. 
These computed groundwater ages were subsequently
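The particle-tracking step described above reduces, in its simplest advective form, to a flux-weighted average of particle travel times at the well. A minimal sketch (all names and values are illustrative, not taken from the study):

```python
import numpy as np

# Hypothetical illustration: flux-weighted mean groundwater age at a well
# from backward particle tracking. Each particle carries a travel time
# (its advective age) and a flux weight (the share of well discharge it
# represents).
travel_times_yr = np.array([12.0, 30.0, 55.0, 90.0])  # per-particle ages (years)
flux_weights = np.array([0.4, 0.3, 0.2, 0.1])         # fractions of well discharge

# Mean age = sum of (weight * age) normalized by total weight.
mean_age = np.sum(flux_weights * travel_times_yr) / np.sum(flux_weights)
print(mean_age)
```

A recharge map that concentrates inflow near the well shortens the travel times and therefore lowers this mean age, which is what makes the age a useful consistency check on the map.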

  11. Marine and Coastal Morphology: medium term and long-term area modelling

    DEFF Research Database (Denmark)

    Kristensen, Sten Esbjørn

    evolution model and apply them to problems concerning coastal protection strategies (both hard and soft measures). The applied coastal protection strategies involve morphological impact of detached shore parallel segmented breakwaters and shore normal impermeable groynes in groyne fields, and morphological...... solution has a two dimensional nature. 1.5D shoreline model A so-called “1.5D” implementation which introduces redistribution of sediment within a coastal profile in response to horizontal 2D currents makes it possible to simulate the morphological development in areas where 2D evolution occurs....... The 1.5D model is seen to produce reasonable results when subject to cases with detached breakwaters and groynes. The computational efficiency of the model is however reduced compared to the 1D model, because the increased freedom of the model reduces the maximum stable morphological time step. Bar...

  12. Evaluating software architecture using fuzzy formal models

    Directory of Open Access Journals (Sweden)

    Payman Behbahaninejad


Unified Modeling Language (UML) is recognized as one of the most popular techniques for describing the static and dynamic aspects of software systems. One of the primary issues in designing software packages is the uncertainty associated with such models. Fuzzy-UML describes software architecture from both the static and the dynamic perspective simultaneously. Evaluating software architecture during the design phase always helps uncover additional requirements, which helps reduce the cost of design. In this paper, we use a fuzzy data model to describe the static aspects of software architecture and a fuzzy sequence diagram to illustrate its dynamic aspects. We also transform these diagrams into Petri nets and evaluate the reliability of the architecture. A web-based hotel reservation system is studied as an illustrative example.

  13. Building self-consistent, short-term earthquake probability (STEP) models: improved strategies and calibration procedures

    Directory of Open Access Journals (Sweden)

    Damiano Monelli


We present here two self-consistent implementations of a short-term earthquake probability (STEP) model that produces daily seismicity forecasts for the area of the Italian national seismic network. Both implementations combine a time-varying and a time-invariant contribution, for which we assume that the instrumental Italian earthquake catalog provides the best information. For the time-invariant contribution, the catalog is declustered using the clustering technique of the STEP model; the smoothed seismicity model is generated from the declustered catalog. The time-varying contribution is what distinguishes the two implementations: (1) for one implementation (STEP-LG), the original model parameterization and estimation is used; (2) for the other (STEP-NG), the mean abundance method is used to estimate aftershock productivity. In the STEP-NG implementation, earthquakes with magnitudes up to ML = 6.2 are expected to be less productive than in the STEP-LG implementation, whereas larger earthquakes are expected to be more productive. We have retrospectively tested the performance of these two implementations and applied likelihood tests to evaluate their consistency with observed earthquakes. Both implementations were consistent with the observed earthquake data in space; STEP-NG performed better than STEP-LG in terms of forecast rates. More generally, we found that testing earthquake forecasts issued at regular intervals does not test the full power of clustering models, and future experiments should allow for more frequent forecasts starting at the times of triggering events.
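STEP-type forecasts typically build the time-varying contribution on a Reasenberg-Jones aftershock rate. A minimal sketch, using the generic parameter values from the original Reasenberg-Jones formulation rather than the estimates from either implementation:

```python
# Reasenberg-Jones aftershock rate: expected rate of aftershocks with
# magnitude >= m_min at time t (days) after a mainshock of magnitude m_main.
# Default constants are the widely cited generic values (a=-1.67, b=0.91,
# c=0.05, p=1.08), used here purely for illustration.
def reasenberg_jones_rate(t_days, m_min, m_main,
                          a=-1.67, b=0.91, c=0.05, p=1.08):
    return 10.0 ** (a + b * (m_main - m_min)) * (t_days + c) ** (-p)

# Rate of M >= 3.0 events one day after an ML 6.2 mainshock.
rate_day1 = reasenberg_jones_rate(t_days=1.0, m_min=3.0, m_main=6.2)
print(rate_day1)
```

The productivity difference between STEP-LG and STEP-NG corresponds to how the magnitude-dependent prefactor is estimated, not to the functional form itself.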

  14. Human models of migraine - short-term pain for long-term gain

    DEFF Research Database (Denmark)

    Ashina, Messoud; Hansen, Jakob Møller; Á Dunga, Bára Oladóttir


    of molecular pathways that are responsible for initiation of migraine attacks. Combining experimental human models with advanced imaging techniques might help to identify biomarkers of migraine, and in the ongoing search for new and better migraine treatments, human models will have a key role in the discovery...


    Directory of Open Access Journals (Sweden)

    A. Cilek


The development and application of rainfall-runoff models have been a cornerstone of hydrological research for many decades. The amount of rainfall and its intensity and variability control the generation of runoff and the erosional processes operating at different scales. These interactions can be highly variable in Mediterranean catchments with marked hydrological fluctuations. The aim of the study was to evaluate the performance of a rainfall-runoff model for rainfall-runoff simulation in a Mediterranean subcatchment. The Pan-European Soil Erosion Risk Assessment (PESERA), a simplified hydrological process-based approach, was used in this study to combine hydrological surface runoff factors. In total, 128 input layers, derived from data sets covering climate, topography, land use, crop type, planting date and soil characteristics, are required to run the model. Initial ground cover was estimated from Landsat ETM data provided by ESA. The hydrological model was evaluated in terms of its performance in the Goksu River watershed, Turkey, located in the central eastern Mediterranean basin of the country. The area is approximately 2000 km2. The landscape is dominated by bare ground, agricultural land and forests. The average annual rainfall is 636.4 mm. This study is of significant importance for evaluating different model performances in a complex Mediterranean basin. The results provided comprehensive insight, including the advantages and limitations of modelling approaches in the Mediterranean environment.

  16. Evaluation of articulation simulation system using artificial maxillectomy models. (United States)

    Elbashti, M E; Hattori, M; Sumita, Y I; Taniguchi, H


Acoustic evaluation is valuable for guiding the treatment of maxillofacial defects and determining the effectiveness of rehabilitation with an obturator prosthesis. Model simulations are important in terms of pre-surgical planning and pre- and post-operative speech function. This study aimed to evaluate the acoustic characteristics of voice generated by an articulation simulation system using a vocal tract model with or without artificial maxillectomy defects. More specifically, we aimed to establish a speech simulation system for maxillectomy defect models that both surgeons and maxillofacial prosthodontists can use to guide treatment planning. Artificially simulated maxillectomy defects were prepared according to Aramany's classification (Classes I-VI) in a three-dimensional vocal tract plaster model of a subject uttering the vowel /a/. Formant and nasalance acoustic data were analysed using the Computerized Speech Lab and the Nasometer, respectively. Formants and nasalance of simulated /a/ sounds were successfully detected and analysed. Values of Formants 1 and 2 for the non-defect model were 675.43 and 976.64 Hz, respectively. Median values of Formants 1 and 2 for the defect models were 634.36 and 1026.84 Hz, respectively. Nasalance was 11% in the non-defect model, whereas median nasalance was 28% in the defect models. The results suggest that an articulation simulation system can be used to help surgeons and maxillofacial prosthodontists plan post-surgical defects that will facilitate maxillofacial rehabilitation. © 2015 John Wiley & Sons Ltd.

  17. Medium-term hydrologic forecasting in mountain basins using forecasting of a mesoscale numerical weather model (United States)

    Castro Heredia, L. M.; Suarez, F. I.; Fernandez, B.; Maass, T.


For water resources forecasting, numerical weather model outputs provide a valuable source of information that is available online. Compared to traditional ground-based meteorological gauges, weather forecast data offer spatially and temporally continuous coverage not yet evaluated and used for forecasting water resources in the mountainous regions of Chile. Nevertheless, the use of these non-conventional data has been limited or nil in developing countries, mainly because of their spatial resolution, despite their high potential for water resources management. The adequate incorporation of these data into hydrological models requires their evaluation, taking into account the features of river basins in mountainous regions. This work presents an integrated forecasting system that represents a radical change in the way streamflow forecasts are made in Chile, where the snowmelt forecast is the principal component of water resources management. The integrated system is composed of a physically based hydrological model, which is the prediction tool itself, together with a methodology for remote sensing data gathering that allows the hydrological model to be fed in real time, and meteorological forecasts from NCEP-CFSv2. Before incorporation of the meteorological forecasts into the hydrological model, the weather outputs were evaluated and downscaled using statistical downscaling methods. The hydrological forecasts were evaluated in two mountain basins in Chile over a term of six months covering the snowmelt period. Each month, an assimilation process was performed and the hydrological forecast was improved: the snow cover area (from remote sensing) and the observed streamflow were used to assimilate the model parameters in order to improve the next hydrological forecast using the meteorological forecasts. The operation of the system in real time shows good agreement between the observed streamflow and snow cover area. The hydrological model and the weather

  18. A long term model of circulation. [human body (United States)

    White, R. J.


A quantitative approach to modeling human physiological function, with a view toward ultimate application to long-duration space flight experiments, was undertaken. Data were obtained on the effect of weightlessness on certain aspects of human physiological function during 1-3 month periods. Modifications to the Guyton model are reviewed. Design considerations for bilateral interface models are discussed. Construction of a functioning whole-body model was studied, as well as the testing of the model against available data.

  19. Performance Evaluation and Modelling of Container Terminals (United States)

    Venkatasubbaiah, K.; Rao, K. Narayana; Rao, M. Malleswara; Challa, Suresh


The present paper evaluates and analyzes the performance of 28 container terminals in South East Asia through data envelopment analysis (DEA), principal component analysis (PCA) and a hybrid DEA-PCA method. The DEA technique is utilized to identify efficient decision-making units (DMUs) and to rank DMUs in a peer-appraisal mode. PCA is a multivariate statistical method used to evaluate the performance of container terminals. In the hybrid method, DEA is integrated with PCA to arrive at the ranking of container terminals. Based on the composite ranking, performance modelling and optimization of the container terminals are carried out through response surface methodology (RSM).
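As a rough illustration of the PCA side of the hybrid approach, a composite score per terminal can be taken from the first principal component of standardized indicators. The data below are random stand-ins, and a real application must also fix the arbitrary sign of the component so that higher scores mean better performance:

```python
import numpy as np

# Illustrative sketch: rank decision-making units (container terminals)
# by the first principal component of standardized performance indicators.
rng = np.random.default_rng(0)
X = rng.normal(size=(28, 5))                 # 28 terminals x 5 made-up indicators

Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each indicator
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # eigh returns ascending eigenvalues
pc1 = eigvecs[:, -1]                         # eigenvector of largest eigenvalue
scores = Z @ pc1                             # composite score per terminal
ranking = np.argsort(-scores)                # highest-scoring terminal first
print(ranking[:5])
```

The full hybrid method would combine these scores with DEA efficiency results before ranking, which this sketch omits.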

  20. Probabilistic evaluation of competing climate models (United States)

    Braverman, Amy; Chatterjee, Snigdhansu; Heyman, Megan; Cressie, Noel


    Climate models produce output over decades or longer at high spatial and temporal resolution. Starting values, boundary conditions, greenhouse gas emissions, and so forth make the climate model an uncertain representation of the climate system. A standard paradigm for assessing the quality of climate model simulations is to compare what these models produce for past and present time periods, to observations of the past and present. Many of these comparisons are based on simple summary statistics called metrics. In this article, we propose an alternative: evaluation of competing climate models through probabilities derived from tests of the hypothesis that climate-model-simulated and observed time sequences share common climate-scale signals. The probabilities are based on the behavior of summary statistics of climate model output and observational data over ensembles of pseudo-realizations. These are obtained by partitioning the original time sequences into signal and noise components, and using a parametric bootstrap to create pseudo-realizations of the noise sequences. The statistics we choose come from working in the space of decorrelated and dimension-reduced wavelet coefficients. Here, we compare monthly sequences of CMIP5 model output of average global near-surface temperature anomalies to similar sequences obtained from the well-known HadCRUT4 data set as an illustration.
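The pseudo-realization idea can be sketched as follows, substituting a moving-average smooth for the wavelet-based signal/noise partition and an AR(1) fit for the noise model (both simplifying assumptions, not the authors' exact procedure):

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(240)
series = 0.01 * t + rng.normal(scale=0.3, size=t.size)  # toy anomaly series

# "Signal": a simple moving-average smooth (stand-in for the wavelet step).
kernel = np.ones(12) / 12
signal = np.convolve(series, kernel, mode="same")
noise = series - signal

# Fit an AR(1) noise model: n_t = phi * n_{t-1} + e_t.
phi = np.dot(noise[1:], noise[:-1]) / np.dot(noise[:-1], noise[:-1])
sigma_e = np.std(noise[1:] - phi * noise[:-1])

def pseudo_realization():
    """Draw one bootstrap noise sequence and add it back to the signal."""
    e = rng.normal(scale=sigma_e, size=t.size)
    n = np.zeros(t.size)
    for i in range(1, t.size):
        n[i] = phi * n[i - 1] + e[i]
    return signal + n

boot = pseudo_realization()
print(boot.shape)
```

Repeating the draw many times yields the ensemble over which the summary statistics, and hence the hypothesis-test probabilities, are evaluated.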

  1. Evaluation of parameters in hydrodynamic model

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Tae-Hoon [Hanyang University, Seoul (Korea); Lee, Jong-Wook [Korea Institute of Construction Technology, Koyang (Korea); Jegal, Sun-Dong [Kumho Engineering Company, Anyang (Korea)


Generally speaking, a hydrodynamic model needs a friction coefficient (Manning or Chezy coefficient) and an eddy viscosity. For a numerical solution, the coefficients are usually determined by recursive calculations. The eddy viscosity in a numerical model represents physical diffusion in the flow and also acts as numerical viscosity; hence its value influences the stability of the numerical solution, and for these reasons a consistent evaluation procedure is needed. Using records of stage and discharge in the downstream reach of the Han River, estimated values of the Manning coefficient and an empirical equation for the eddy viscosity are presented for 1-D models (HEC-2 and NETWORK) and a 2-D model (SMS). The computed results are verified against the recorded flow elevation data. (author). 26 refs., 7 tabs., 14 figs.
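For reference, the Manning friction relation underlying the calibrated coefficient is straightforward to evaluate; the channel values below are illustrative, not the Han River estimates:

```python
# Manning's equation for mean velocity in open-channel flow (SI units):
#   V = (1/n) * R^(2/3) * S^(1/2)
# where n is the Manning coefficient, R the hydraulic radius (m), and
# S the channel slope (dimensionless).
def manning_velocity(n, hydraulic_radius_m, slope):
    return (1.0 / n) * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

# Illustrative values: n = 0.03 (natural channel), R = 2 m, S = 0.0005.
v = manning_velocity(0.03, 2.0, 0.0005)
print(round(v, 3))  # mean velocity in m/s
```

In calibration, n is adjusted until velocities (and hence stages) computed this way reproduce the recorded stage-discharge data.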

  2. Modeling the long-term evolution of space debris (United States)

    Nikolaev, Sergei; De Vries, Willem H.; Henderson, John R.; Horsley, Matthew A.; Jiang, Ming; Levatin, Joanne L.; Olivier, Scot S.; Pertica, Alexander J.; Phillion, Donald W.; Springer, Harry K.


    A space object modeling system that models the evolution of space debris is provided. The modeling system simulates interaction of space objects at simulation times throughout a simulation period. The modeling system includes a propagator that calculates the position of each object at each simulation time based on orbital parameters. The modeling system also includes a collision detector that, for each pair of objects at each simulation time, performs a collision analysis. When the distance between objects satisfies a conjunction criterion, the modeling system calculates a local minimum distance between the pair of objects based on a curve fitting to identify a time of closest approach at the simulation times and calculating the position of the objects at the identified time. When the local minimum distance satisfies a collision criterion, the modeling system models the debris created by the collision of the pair of objects.
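The closest-approach refinement can be sketched by fitting a parabola to the squared separation sampled at coarse simulation times and reading off the vertex; this is a simplified stand-in for the system's curve-fitting step, with a made-up relative trajectory:

```python
import numpy as np

def time_of_closest_approach(times, sq_separations):
    """Fit d^2(t) ~ a*t^2 + b*t + c and return the vertex time -b/(2a)."""
    a, b, c = np.polyfit(times, sq_separations, deg=2)
    return -b / (2.0 * a)

# Toy case: relative position p(t) = (t - 3, 1), so the true closest
# approach is at t = 3 with minimum distance 1.
t = np.array([0.0, 1.0, 2.0, 4.0, 5.0, 6.0])   # coarse simulation times
d = np.sqrt((t - 3.0) ** 2 + 1.0)              # sampled separations
tca = time_of_closest_approach(t, d ** 2)      # fit squared distance (exactly quadratic here)
print(tca)
```

Evaluating both objects' positions at `tca` then gives the local minimum distance checked against the collision criterion.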

  3. Evaluation of Life Sciences and Social Sciences Course Books in Term of Societal Sexuality (United States)

    Aykac, Necdet


    This study aims to evaluate primary school Life Sciences (1st, 2nd, and 3rd grades) and Social Sciences (4th, 5th, and 6th grades) course books in terms of gender discrimination. This study is a descriptive study aiming to evaluate the primary school Life Sciences (1st, 2nd, 3rd grades) and Social Sciences (4th, 5th, and 6th grades) course books…

  4. Evaluation of oxidative status in short-term exercises of adolescent athletes


    Karacabey, K.; Atas, A; D Zeyrek; A Cakmak; R Kurkcu; F Yamaner


The aim of the study was to evaluate the effects of short-term exercise on total antioxidant status (TAS), lipid hydroperoxides (LOOHs), total oxidative status (TOS) and the oxidative stress index (OSI) in adolescent athletes. A total of 62 adolescents participated in the study. Athletes trained regularly 3 days a week for 2 hours. All subjects followed a circuit exercise program. Blood samples were collected just before and immediately after the exercise program. Antioxidant status was evalu...

  5. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)



Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
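One common way a top-level model can combine per-technology detection probabilities is to assume independent sensing channels; this is an illustrative assumption for the sketch below, not necessarily IVSEM's exact method, and the probabilities are made up:

```python
# P(at least one technology detects) = 1 - prod(1 - p_i),
# assuming the technologies detect independently.
def combined_detection(probabilities):
    p_miss = 1.0
    for p in probabilities:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

# Illustrative per-technology detection probabilities for one event:
# seismic, infrasound, radionuclide, hydroacoustic.
p = combined_detection([0.9, 0.5, 0.3, 0.2])
print(round(p, 4))
```

The integrated probability exceeds any single channel's, which is the "synergy among the technologies" the model lets users explore.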

  6. Evaluating spatial patterns in hydrological modelling

    DEFF Research Database (Denmark)

    Koch, Julian

    is not fully exploited by current modelling frameworks due to the lack of suitable spatial performance metrics. Furthermore, the traditional model evaluation using discharge is found unsuitable to lay confidence on the predicted catchment inherent spatial variability of hydrological processes in a fully...... the contiguous United States (10^6 km2). To this end, the thesis at hand applies a set of spatial performance metrics on various hydrological variables, namely land-surface-temperature (LST), evapotranspiration (ET) and soil moisture. The inspiration for the applied metrics is found in related fields...

  7. Comment on ``θ-term renormalization in (2+1)-dimensional CPN-1 model with a θ term'' (United States)

    Kondrashuk, I. N.; Kotikov, A. V.


It is found that in a recent paper by Park, the first coefficient of the nonzero β function for the Chern-Simons term in the 1/N expansion of the CPN-1 model is numerically incorrect. The correct result is given. The main conclusions of Park's paper are unchanged.

  8. Effects of mid-term student evaluations of teaching as measured by end-of-term evaluations: An empirical study of course evaluations

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder; Sliusarenko, Tamara; Ersbøll, Bjarne Kjær


    Universities have varying policies on how and when to perform student evaluations of courses and teachers. More empirical evidence of the consequences of such policies on quality enhancement of teaching and learning is needed. A study (35 courses at the Technical University of Denmark....... The evaluations generally showed positive improvements over the semester for courses with access, and negative improvements for those without access. Improvements related to: Student learning, student satisfaction, teaching activities, and communication showed statistically significant average differences of 0.......1-0.2 points between the two groups. These differences are relatively large compared to the standard deviation of the scores when student effect is removed (approximately 0.7). We conclude that university policies on course evaluations seem to have an impact on the development of the teaching and learning...

  9. Large Engine Technology Program. Task 21: Rich Burn Liner for Near Term Experimental Evaluations (United States)

    Hautman, D. J.; Padget, F. C.; Kwoka, D.; Siskind, K. S.; Lohmann, R. P.


    The objective of the task reported herein, which was conducted as part of the NASA sponsored Large Engine Technology program, was to define and evaluate a near-term rich-zone liner construction based on currently available materials and fabrication processes for a Rich-Quench-Lean combustor. This liner must be capable of operation at the temperatures and pressures of simulated HSCT flight conditions but only needs sufficient durability for limited duration testing in combustor rigs and demonstrator engines in the near future. This must be achieved at realistic cooling airflow rates since the approach must not compromise the emissions, performance, and operability of the test combustors, relative to the product engine goals. The effort was initiated with an analytical screening of three different liner construction concepts. These included a full cylinder metallic liner and one with multiple segments of monolithic ceramic, both of which incorporated convective cooling on the external surface using combustor airflow that bypassed the rich zone. The third approach was a metallic platelet construction with internal convective cooling. These three metal liner/jacket combinations were tested in a modified version of an existing Rich-Quench-Lean combustor rig to obtain data for heat transfer model refinement and durability verification.

  10. Eye growth in term- and preterm-born eyes modeled from magnetic resonance images. (United States)

    Munro, Robert J; Fulton, Anne B; Chui, Toco Y P; Moskowitz, Anne; Ramamirtham, Ramkumar; Hansen, Ronald M; Prabhu, Sanjay P; Akula, James D


We generated a model of eye growth and tested it against an eye known to develop abnormally, one with a history of retinopathy of prematurity (ROP). We reviewed extant magnetic resonance images (MRIs) from term- and preterm-born patients for suitable images (n = 129). We binned subjects for analysis based upon postmenstrual age at birth (in weeks) and ROP history ("Term" ≥ 37, "Premature" ≤ 32 with no ROP, "ROP" ≤ 32 with ROP). We measured the axial positions and curvatures of the cornea, anterior and posterior lens, and inner retinal surface. We fit anterior chamber depth (ACD), posterior segment depth (PSD), axial length (AL), and corneal and lenticular curvatures with logistic growth curves that we then evaluated for significant differences. We also measured the length of rays from the centroid to the surface of the eye at 5° intervals, and described the length versus age relationship of each ray, L(ray)(x), using the same logistic growth curve. We determined the rate of ray elongation, E(ray)(x), as the derivative of L(ray)(x) with respect to age. Then, we estimated the scleral growth that accounted for E(ray)(x), G(x), at every age and position. Relative to Term, development of ACD, PSD, AL, and corneal and lenticular curvatures was delayed in ROP eyes, but not Premature eyes. In Term infants, G(x) was fast and predominantly equatorial; in age-matched ROP eyes, maximal G(x) was offset by approximately 90°. We produced a model of normal eye growth in term-born subjects. Relative to normal, the ROP eye is characterized by delayed, abnormal growth.
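A logistic growth curve like the ones fitted above can be estimated with a simple least-squares grid search; the data and parameter values below are synthetic stand-ins, not the study's measurements:

```python
import numpy as np

# Logistic growth curve: value rises from ~0 to an upper asymptote.
def logistic(x, upper, rate, midpoint):
    return upper / (1.0 + np.exp(-rate * (x - midpoint)))

# Synthetic "axial length vs. postmenstrual age" data around known parameters.
ages = np.linspace(30, 80, 26)                       # weeks
true = logistic(ages, upper=24.0, rate=0.15, midpoint=40.0)
rng = np.random.default_rng(1)
obs = true + rng.normal(scale=0.1, size=ages.size)   # noisy measurements

# Coarse grid search minimizing the sum of squared errors.
best = None
for upper in np.linspace(20, 28, 33):
    for rate in np.linspace(0.05, 0.30, 26):
        for mid in np.linspace(35, 45, 21):
            sse = np.sum((obs - logistic(ages, upper, rate, mid)) ** 2)
            if best is None or sse < best[0]:
                best = (sse, upper, rate, mid)

_, upper_hat, rate_hat, mid_hat = best
print(upper_hat, rate_hat, mid_hat)
```

In practice a nonlinear least-squares routine would replace the grid search, but the fitted-parameter comparison between groups (Term vs. ROP) works the same way.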

  11. The Usage Evaluation of Official Computer Terms in Bahasa Indonesia in Indonesian Government Official Websites (United States)

    Amalia, A.; Gunawan, D.; Lydia, M. S.; Charlie, C.


According to Undang-Undang Dasar Republik Indonesia 1945 Pasal 36 (Article 36 of the 1945 Constitution of the Republic of Indonesia), Bahasa Indonesia is the national language of Indonesia. This means Bahasa Indonesia must be used as an official language at all levels, ranging from government to education, as well as in the development of science and technology. The Government of the Republic of Indonesia, as the highest and formal authority, must use official Bahasa Indonesia in its activities, including its official websites. Therefore, the government issued a regulation called Instruksi Presiden (Inpres) No. 2 Tahun 2001 to govern the usage of official computer terms in Bahasa Indonesia. The purpose of this research is to evaluate the usage of official computer terms in Bahasa Indonesia compared to the corresponding computer terms in English. The data are obtained from government official websites in Indonesia. The method consists of data gathering, template detection, string extraction and data analysis. The evaluation classifies the usage of official computer terms in Bahasa Indonesia into three categories: good, moderate and poor. The number of websites in the good category is 281, in the moderate category 512 and in the poor category 290. The authorized institution may use this result as additional information to evaluate the implementation of official information technology terms in Bahasa Indonesia.
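The final categorization step amounts to thresholding the share of official-term usage per website; the cutoff values in this sketch are assumptions, since the paper's exact thresholds are not given here:

```python
# Categorize a website by the fraction of computer terms that use the
# official Bahasa Indonesia form rather than the English one.
# good_cutoff and moderate_cutoff are illustrative assumptions.
def categorize(official_count, total_terms,
               good_cutoff=0.7, moderate_cutoff=0.4):
    ratio = official_count / total_terms if total_terms else 0.0
    if ratio >= good_cutoff:
        return "good"
    if ratio >= moderate_cutoff:
        return "moderate"
    return "poor"

print(categorize(18, 20))  # mostly official terms
print(categorize(9, 20))   # mixed usage
print(categorize(2, 20))   # mostly English terms
```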

  12. Evaluation of CNN as anthropomorphic model observer (United States)

    Massanes, Francesc; Brankov, Jovan G.


Model observers (MOs) are widely used in medical imaging to act as surrogates of human observers in task-based image quality evaluation, frequently towards optimization of reconstruction algorithms. In this paper, we explore the use of convolutional neural networks (CNNs) as MOs. We compare the CNN MO to alternative MOs currently proposed and used, such as the relevance-vector-machine-based MO and the channelized Hotelling observer (CHO). As the success of CNNs, and other deep learning approaches, is rooted in the availability of large data sets, which is rarely the case in task-performance evaluation of medical imaging systems, we evaluate CNN performance on both large and small training data sets.
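For context, the CHO baseline mentioned above can be written in a few lines: channel the images, pool the class covariances, and apply the Hotelling template. Everything below (channel profiles, images, dimensions) is synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)
n_pix, n_ch, n_train = 64, 4, 200

channels = rng.normal(size=(n_pix, n_ch))      # stand-in channel profiles
signal = np.zeros(n_pix)
signal[20:28] = 1.0                            # known signal profile

absent = rng.normal(size=(n_train, n_pix))     # signal-absent images (1-D here)
present = absent + signal                      # signal-known-exactly pairs

v0 = absent @ channels                         # channel outputs, class 0
v1 = present @ channels                        # channel outputs, class 1
S = 0.5 * (np.cov(v0, rowvar=False) + np.cov(v1, rowvar=False))
w = np.linalg.solve(S, v1.mean(axis=0) - v0.mean(axis=0))  # Hotelling template

t0, t1 = v0 @ w, v1 @ w                        # decision statistics
# Detectability index d' from the two decision-statistic distributions.
d_prime = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t0.var() + t1.var()))
print(d_prime > 0)
```

A CNN MO replaces the fixed channels and linear template with learned features, which is exactly where training-set size becomes the limiting factor.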

  13. Implicit moral evaluations: A multinomial modeling approach. (United States)

    Cameron, C Daryl; Payne, B Keith; Sinnott-Armstrong, Walter; Scheffer, Julian A; Inzlicht, Michael


    Implicit moral evaluations-i.e., immediate, unintentional assessments of the wrongness of actions or persons-play a central role in supporting moral behavior in everyday life. Yet little research has employed methods that rigorously measure individual differences in implicit moral evaluations. In five experiments, we develop a new sequential priming measure-the Moral Categorization Task-and a multinomial model that decomposes judgment on this task into multiple component processes. These include implicit moral evaluations of moral transgression primes (Unintentional Judgment), accurate moral judgments about target actions (Intentional Judgment), and a directional tendency to judge actions as morally wrong (Response Bias). Speeded response deadlines reduced Intentional Judgment but not Unintentional Judgment (Experiment 1). Unintentional Judgment was stronger toward moral transgression primes than non-moral negative primes (Experiments 2-4). Intentional Judgment was associated with increased error-related negativity, a neurophysiological indicator of behavioral control (Experiment 4). Finally, people who voted for an anti-gay marriage amendment had stronger Unintentional Judgment toward gay marriage primes (Experiment 5). Across Experiments 1-4, implicit moral evaluations converged with moral personality: Unintentional Judgment about wrong primes, but not negative primes, was negatively associated with psychopathic tendencies and positively associated with moral identity and guilt proneness. Theoretical and practical applications of formal modeling for moral psychology are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
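The multinomial-model idea can be illustrated with a simple processing-tree equation; the tree order and the exact use of the parameters below are my own illustration, not the paper's published equations:

```python
# Illustrative multinomial-processing-tree algebra: probability of a
# "morally wrong" response to a transgression prime, assuming the
# processes apply in the order Intentional Judgment (I), Unintentional
# Judgment (U), then Response Bias (B). This ordering is an assumption.
def p_judge_wrong(I, U, B):
    # Either intentional judgment succeeds (I), or it fails and the
    # unintentional evaluation drives the response (U), or both fail
    # and the directional response bias (B) produces "wrong" anyway.
    return I + (1 - I) * U + (1 - I) * (1 - U) * B

p = p_judge_wrong(I=0.6, U=0.5, B=0.2)
print(round(p, 2))
```

Fitting such a model means choosing I, U, and B so that equations like this reproduce the observed response proportions across prime and target conditions.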

  14. Evaluating two model reduction approaches for large scale hedonic models sensitive to omitted variables and multicollinearity

    DEFF Research Database (Denmark)

    Panduro, Toke Emil; Thorsen, Bo Jellesmark


    Hedonic models in environmental valuation studies have grown in terms of number of transactions and number of explanatory variables. We focus on the practical challenge of model reduction, when aiming for reliable parsimonious models, sensitive to omitted variable bias and multicollinearity. We...... evaluate two common model reduction approaches in an empirical case. The first relies on a principal component analysis (PCA) used to construct new orthogonal variables, which are applied in the hedonic model. The second relies on a stepwise model reduction based on the variance inflation index and Akaike......’s information criteria. Our empirical application focuses on estimating the implicit price of forest proximity in a Danish case area, with a dataset containing 86 relevant variables. We demonstrate that the estimated implicit price for forest proximity, while positive in all models, is clearly sensitive...

  15. Evaluation of anastomotic strength and drug safety after short-term sunitinib administration in rabbits. (United States)

    Fallon, Erica M; Nehra, Deepika; Carlson, Sarah J; Brown, David W; Nedder, Arthur P; Rueda, Bo R; Puder, Mark


    Sunitinib (Sutent) is a Food and Drug Administration-approved receptor tyrosine kinase inhibitor found to reduce postoperative adhesion formation in animal models. The objective of the present study was to evaluate anastomotic healing and potential drug-related toxicities after short-term sunitinib administration in New Zealand White rabbits. Under an approved study protocol, 40 rabbits underwent a laparotomy followed by colonic transection and anastomosis. Animals were randomly assigned to treatment with oral sunitinib (10 mg/kg/d) or placebo, received one preoperative dose followed by 10 postoperative doses, and were divided into two groups following the procedure: group I animals were euthanized on completion of drug treatment and group II animals were euthanized 30 d after completion of treatment. Prior to study completion, animals underwent an echocardiogram and laboratory test results were obtained. At necropsy, intestinal bursting strength (in mmHg) was evaluated. All animals survived until designated euthanasia. There was no evidence of intra-abdominal sepsis or intestinal obstruction. Sunitinib-treated animals were found to have lower intestinal anastomotic strength compared with placebo-treated animals, as measured by bursting pressure at euthanasia, and a greater percentage of bursting at the anastomosis. On echocardiography, all ejection and shortening fractions were within established normal reference values. There were no significant differences in liver enzymes between animals. There were no wound infections, dehiscence, or delayed wound healing in any animal. These results caution against the administration of sunitinib in cases involving intestinal anastomoses because of the elevated risk of anastomotic leak. No evidence of cardiotoxicity, hepatotoxicity, or detrimental effect on wound healing was found in any animal. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Long-term evaluation of eccentric viewing spectacles in patients with bilateral central scotomas.

    NARCIS (Netherlands)

    Verezen, C.A.; Meulendijks, C.F.M.; Hoyng, C.B.; Klevering, B.J.


    PURPOSE: Yoked prism spectacles (eccentric viewing spectacles [EVS]) facilitate eccentric viewing in patients with bilateral central scotomas. This study was conducted to evaluate the long-term success and patient satisfaction of this type of low-vision aid. METHODS. In this retrospective

  17. Methodological Considerations in Evaluating Long-Term Systems Change: A Case Study From Eastern Nepal (United States)

    Koleros, Andrew; Jupp, Dee; Kirwan, Sean; Pradhan, Meeta S.; Pradhan, Pushkar K.; Seddon, David; Tumbahangfe, Ansu


    This article presents discussion and recommendations on approaches to retrospectively evaluating development interventions in the long term through a systems lens. It is based on experiences from the implementation of an 18-month study to investigate the impact of development interventions on economic and social change over a 40-year period in the…

  18. Long term treatment with gabapentin in an animal model of chronic neuropathic pain

    DEFF Research Database (Denmark)

    Baastrup, C. S.; Andrews, N.; Wegener, Gregers


    In preclinical animal pain research potential efficacy of a drug is often evaluate after a single exposure, which is in contrast to the long lasting treatment needed in chronic neuropathic pain (CNP) patients. Gabapentin remains one of the most efficacious drugs in the treatment of CNP. The aims...... of the study were to evaluate the spinal cord contusion (SCC) model and 2 different measures of painlike behaviour using a long term treatment schedule with gabapentin. Furthermore the effect on mobility and on anxiety, a pain-related behaviour, was included. 40 Female SD rats with a T13 SCC and sham animals...... was measured after initial dose, 1 and 6 weeks of treatment. Preliminary results show that saline-treated SCC animals (N=10) have significantly lower MST with supra-spinal responses on the thorax compared to saline-treated shams (N=10), and gabapentin-treated SCC (N=10) and sham animals (N=10) throughout...

  19. A Multi-Stage Maturity Model for Long-Term IT Outsourcing Relationship Success (United States)

    Luong, Ming; Stevens, Jeff


    The Multi-Stage Maturity Model for Long-Term IT Outsourcing Relationship Success, a theoretical stages-of-growth model, explains long-term success in IT outsourcing relationships. Research showed the IT outsourcing relationship life cycle consists of four distinct, sequential stages: contract, transition, support, and partnership. The model was…

  20. Infrahumanization in children: An evaluation of 70 terms relating to humanity. (United States)

    Betancor Rodríguez, Verónica; Chas Villar, Alexandra; Rodríguez-Pérez, Armando; Delgado Rodríguez, Naira


    Research on infrahumanization has revealed that individuals attribute more secondary emotions to the in-group than to the out-group, whose capacity to experience them is denied or restricted. When this bias has been examined in children, researchers have used the same taxonomy of affective terms as that used with adults. The aim of this research is to conduct a normative study that will equip researchers with a taxonomy of humanity attributed to emotional terms specifically for children. Three hundred and sixty-three children aged between 11 and 12 responded to several questionnaires containing a total of 70 emotional terms, evaluated in eight dimensions associated with the perception of humanity. Principal component analysis shows that children categorize implicit dimensions associated with humanity differently to adults. This normative study enables the selection of graded emotional terms in humanity perceived by a child sample, in order to overcome current limitations in research on infrahumanization in children.

  1. Loss terms in free-piston Stirling engine models (United States)

    Gordon, Lloyd B.


    Various models for free piston Stirling engines are reviewed. Initial models were developed primarily for design purposes and to predict operating parameters, especially efficiency. More recently, however, such models have been used to predict engine stability. Free piston Stirling engines have no kinematic constraints and stability may not only be sensitive to the load, but also to various nonlinear loss and spring constraints. The present understanding is reviewed of various loss mechanisms for free piston Stirling engines and how they have been incorporated into engine models is discussed.

  2. A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands (United States)

    Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.


    Ecosystem processes are influenced by climatic trends at multiple temporal scales including diel patterns and other mid-term climatic modes, such as interannual and seasonal variability. Because interactions between biophysical components of ecosystem processes are complex, it is important to test how models perform in frequency (e.g. hours, days, weeks, months, years) and time (i.e. day of the year) domains in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation using wavelet time series analysis of model performance in seven Mediterranean Oak Woodlands that encompass three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) on measured variables of gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g. weeks to months) likely because these models do not represent the water pulse dynamics that influence GPP and ET at these Mediterranean systems. To improve the performance of a model it is critical to identify first where and when the model fails. Only by identifying where a model fails we can improve the model performance and use them as prognostic tools and to generate further hypotheses that can be tested by new experiments and measurements.

  3. RTMOD: Real-Time MODel evaluation

    DEFF Research Database (Denmark)

    Graziani, G.; Galmarini, S.; Mikkelsen, Torben


    The 1998 - 1999 RTMOD project is a system based on an automated statistical evaluation for the inter-comparison of real-time forecasts produced by long-range atmospheric dispersion models for national nuclear emergency predictions of cross-boundaryconsequences. The background of RTMOD was the 1994...... ETEX project that involved about 50 models run in several Institutes around the world to simulate two real tracer releases involving a large part of the European territory. In the preliminary phase ofETEX, three dry runs (i.e. simulations in real-time of fictitious releases) were carried out...... would be recalculated to include the influence by all available predictions. The new web-based RTMOD concept has proven useful as a practical decision-making tool for real-time communicationbetween dispersion modellers around the World and for fast and standardised information exchange on the most...

  4. Multivariate time series modeling of short-term system scale irrigation demand (United States)

    Perera, Kushan C.; Western, Andrew W.; George, Biju; Nawarathna, Bandara


    Travel time limits the ability of irrigation system operators to react to short-term irrigation demand fluctuations that result from variations in weather, including very hot periods and rainfall events, as well as the various other pressures and opportunities that farmers face. Short-term system-wide irrigation demand forecasts can assist in system operation. Here we developed a multivariate time series (ARMAX) model to forecast irrigation demands with respect to aggregated service points flows (IDCGi, ASP) and off take regulator flows (IDCGi, OTR) based across 5 command areas, which included area covered under four irrigation channels and the study area. These command area specific ARMAX models forecast 1-5 days ahead daily IDCGi, ASP and IDCGi, OTR using the real time flow data recorded at the service points and the uppermost regulators and observed meteorological data collected from automatic weather stations. The model efficiency and the predictive performance were quantified using the root mean squared error (RMSE), Nash-Sutcliffe model efficiency coefficient (NSE), anomaly correlation coefficient (ACC) and mean square skill score (MSSS). During the evaluation period, NSE for IDCGi, ASP and IDCGi, OTR across 5 command areas were ranged 0.98-0.78. These models were capable of generating skillful forecasts (MSSS ⩾ 0.5 and ACC ⩾ 0.6) of IDCGi, ASP and IDCGi, OTR for all 5 lead days and IDCGi, ASP and IDCGi, OTR forecasts were better than using the long term monthly mean irrigation demand. Overall these predictive performance from the ARMAX time series models were higher than almost all the previous studies we are aware. Further, IDCGi, ASP and IDCGi, OTR forecasts have improved the operators' ability to react for near future irrigation demand fluctuations as the developed ARMAX time series models were self-adaptive to reflect the short-term changes in the irrigation demand with respect to various pressures and opportunities that farmers' face, such as

  5. Mesoscale to microscale wind farm flow modeling and evaluation: Mesoscale to Microscale Wind Farm Models

    Energy Technology Data Exchange (ETDEWEB)

    Sanz Rodrigo, Javier [National Renewable Energy Centre (CENER), Sarriguren Spain; Chávez Arroyo, Roberto Aurelio [National Renewable Energy Centre (CENER), Sarriguren Spain; Moriarty, Patrick [National Renewable Energy Laboratory (NREL), Golden CO USA; Churchfield, Matthew [National Renewable Energy Laboratory (NREL), Golden CO USA; Kosović, Branko [National Center for Atmospheric Research (NCAR), Boulder CO USA; Réthoré, Pierre-Elouan [Technical University of Denmark (DTU), Roskilde Denmark; Hansen, Kurt Schaldemose [Technical University of Denmark (DTU), Lyngby Denmark; Hahmann, Andrea [Technical University of Denmark (DTU), Roskilde Denmark; Mirocha, Jeffrey D. [Lawrence Livermore National Laboratory, Livermore CA USA; Rife, Daran [DNV GL, San Diego CA USA


    The increasing size of wind turbines, with rotors already spanning more than 150 m diameter and hub heights above 100 m, requires proper modeling of the atmospheric boundary layer (ABL) from the surface to the free atmosphere. Furthermore, large wind farm arrays create their own boundary layer structure with unique physics. This poses significant challenges to traditional wind engineering models that rely on surface-layer theories and engineering wind farm models to simulate the flow in and around wind farms. However, adopting an ABL approach offers the opportunity to better integrate wind farm design tools and meteorological models. The challenge is how to build the bridge between atmospheric and wind engineering model communities and how to establish a comprehensive evaluation process that identifies relevant physical phenomena for wind energy applications with modeling and experimental requirements. A framework for model verification, validation, and uncertainty quantification is established to guide this process by a systematic evaluation of the modeling system at increasing levels of complexity. In terms of atmospheric physics, 'building the bridge' means developing models for the so-called 'terra incognita,' a term used to designate the turbulent scales that transition from mesoscale to microscale. This range of scales within atmospheric research deals with the transition from parameterized to resolved turbulence and the improvement of surface boundary-layer parameterizations. The coupling of meteorological and wind engineering flow models and the definition of a formal model evaluation methodology, is a strong area of research for the next generation of wind conditions assessment and wind farm and wind turbine design tools. Some fundamental challenges are identified in order to guide future research in this area.

  6. Usability Evaluation of Variability Modeling by means of Common Variability Language

    Directory of Open Access Journals (Sweden)

    Jorge Echeverria


    Full Text Available Common Variability Language (CVL is a recent proposal for OMG's upcoming Variability Modeling standard. CVL models variability in terms of Model Fragments.  Usability is a widely-recognized quality criterion essential to warranty the successful use of tools that put these ideas in practice. Facing the need of evaluating the usability of CVL modeling tools, this paper presents a Usability Evaluation of CVL applied to a Modeling Tool for firmware code of Induction Hobs. This evaluation addresses the configuration, scoping and visualization facets. The evaluation involved the end users of the tool whom are engineers of our Induction Hob industrial partner. Effectiveness and efficiency results indicate that model configuration in terms of model fragment substitutions is intuitive enough but both scoping and visualization require improved tool support. Results also enabled us to identify a list of usability problems which may contribute to alleviate scoping and visualization issues in CVL.

  7. Knowledge management: Postgraduate Alternative Evaluation Model (MAPA in Brazil

    Directory of Open Access Journals (Sweden)

    Deisy Cristina Corrêa Igarashi


    Full Text Available The Brazilian stricto sensu postgraduate programs that include master and / or doctorate courses are evaluated by Coordination for the Improvement of Higher Education Personnel (CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior. The evaluation method used by CAPES is recognized in national and international context. However, several elements of the evaluation method can be improved. For example: to consider programs diversity, heterogeneity and specificities; to reduce subjectivity and to explain how indicators are grouped into different dimensions to generate a final result, which is scoring level reached by a program. This study aims to analyze the evaluation process by CAPES, presenting questions, difficulties and objections raised by researchers. From the analysis, the study proposes an alternative evaluation model for postgraduate (MAPA - Modelo de Avaliação para Pós graduação Alternativo which incorporates fuzzy logic in result analysis to minimize limitations identified. The MAPA was applied in three postgraduate programs, allowing: (1 better understanding of procedures used for the evaluation, (2 identifying elements that need regulation, (3 characterization of indicators that generate local evaluation, (4 support in medium and long term planning.

  8. Diagnosis code assignment: models and evaluation metrics. (United States)

    Perotte, Adler; Pivovarov, Rimma; Natarajan, Karthik; Weiskopf, Nicole; Wood, Frank; Elhadad, Noémie


    The volume of healthcare data is growing rapidly with the adoption of health information technology. We focus on automated ICD9 code assignment from discharge summary content and methods for evaluating such assignments. We study ICD9 diagnosis codes and discharge summaries from the publicly available Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC II) repository. We experiment with two coding approaches: one that treats each ICD9 code independently of each other (flat classifier), and one that leverages the hierarchical nature of ICD9 codes into its modeling (hierarchy-based classifier). We propose novel evaluation metrics, which reflect the distances among gold-standard and predicted codes and their locations in the ICD9 tree. Experimental setup, code for modeling, and evaluation scripts are made available to the research community. The hierarchy-based classifier outperforms the flat classifier with F-measures of 39.5% and 27.6%, respectively, when trained on 20,533 documents and tested on 2282 documents. While recall is improved at the expense of precision, our novel evaluation metrics show a more refined assessment: for instance, the hierarchy-based classifier identifies the correct sub-tree of gold-standard codes more often than the flat classifier. Error analysis reveals that gold-standard codes are not perfect, and as such the recall and precision are likely underestimated. Hierarchy-based classification yields better ICD9 coding than flat classification for MIMIC patients. Automated ICD9 coding is an example of a task for which data and tools can be shared and for which the research community can work together to build on shared models and advance the state of the art.

  9. Dynamic Hybrid Model for Short-Term Electricity Price Forecasting


    Marin Cerjan; Marin Matijaš; Marko Delimar


    Accurate forecasting tools are essential in the operation of electric power systems, especially in deregulated electricity markets. Electricity price forecasting is necessary for all market participants to optimize their portfolios. In this paper we propose a hybrid method approach for short-term hourly electricity price forecasting. The paper combines statistical techniques for pre-processing of data and a multi-layer (MLP) neural network for forecasting electricity price and price spike det...

  10. Long-term Morphological Modeling at Coastal Inlets (United States)


    wave and flow models is also determined based on a sensitivity analysis. All simulations were run on nonuniform Cartesian grids. The grid...Community Development. Lin, L., Demirbilek, Z., Mase , H., Zheng, J., and Yamada, F. (2008). “CMS- wave: A nearshore spectral wave processes model for

  11. Long-Term Calculations with Large Air Pollution Models

    DEFF Research Database (Denmark)

    Ambelas Skjøth, C.; Bastrup-Birk, A.; Brandt, J.


    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998......Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998...

  12. An economic model of long-term use of celecoxib in patients with osteoarthritis

    Directory of Open Access Journals (Sweden)

    Rublee Dale


    Full Text Available Abstract Background Previous evaluations of the cost-effectiveness of the cyclooxygenase-2 selective inhibitor celecoxib (Celebrex, Pfizer Inc, USA have produced conflicting results. The recent controversy over the cardiovascular (CV risks of rofecoxib and other coxibs has renewed interest in the economic profile of celecoxib, the only coxib now available in the United States. The objective of our study was to evaluate the long-term cost-effectiveness of celecoxib compared with nonselective nonsteroidal anti-inflammatory drugs (nsNSAIDs in a population of 60-year-old osteoarthritis (OA patients with average risks of upper gastrointestinal (UGI complications who require chronic daily NSAID therapy. Methods We used decision analysis based on data from the literature to evaluate cost-effectiveness from a modified societal perspective over patients' lifetimes, with outcomes expressed as incremental costs per quality-adjusted life-year (QALY gained. Sensitivity tests were performed to evaluate the impacts of advancing age, CV thromboembolic event risk, different analytic horizons and alternate treatment strategies after UGI adverse events. Results Our main findings were: 1 the base model incremental cost-effectiveness ratio (ICER for celecoxib versus nsNSAIDs was $31,097 per QALY; 2 the ICER per QALY was $19,309 for a model in which UGI ulcer and ulcer complication event risks increased with advancing age; 3 the ICER per QALY was $17,120 in sensitivity analyses combining serious CV thromboembolic event (myocardial infarction, stroke, CV death risks with base model assumptions. Conclusion Our model suggests that chronic celecoxib is cost-effective versus nsNSAIDs in a population of 60-year-old OA patients with average risks of UGI events.

  13. Evaluation of Algebraic Reynolds Stress Model Assumptions Using Experimental Data (United States)

    Jyoti, B.; Ewing, D.; Matovic, D.


    The accuracy of Rodi's ASM assumption is examined by evaluating the terms in Reynolds stress transport equation and their modelled counterparts. The basic model assumption: Dτ_ij/Dt + partial T_ijl/partial xl = (τ_ij/k )(Dk/Dt + partial Tl /partial xl ) (Rodi( Rodi W., ZAMM.), 56, pp. 219-221, 1976.), can also be broken into two stronger assumptions: Da_ij/Dt = 0 and (2) partial T_ijl/partial xl = (τ_ij/k )(partial Tl /partial xl ) (e.g. Taulbee( Taulbee D. B., Phys. of Fluids), 4(11), pp. 2555-2561, 1992.). Fu et al( Fu S., Huang P.G., Launder B.E. & Leschziner M.A., J. Fluid Eng.), 110(2), pp. 216-221., 1988 examined the accuracy of Rodi's assumption using the results of RSM calculation of axisymmetric jets. Since the RSM results did not accurately predict the experimental results either, it may be useful to examine the basic ASM model assumptions using experimental data. The database of Hussein, Capp and George( Hussein H., Capp S. & George W., J.F.M.), 258, pp.31-75., 1994. is sufficiently detailed to evaluate the terms of Reynolds stress transport equations individually, thus allowing both Rodi's and the stronger assumptions to be tested. For this flow assumption (1) is well satisfied for all the components (including \\overlineuv); however, assumption (2) does not seem as well satisfied.

  14. A Multiscale Model Evaluates Screening for Neoplasia in Barrett's Esophagus.

    Directory of Open Access Journals (Sweden)

    Kit Curtius


    Full Text Available Barrett's esophagus (BE patients are routinely screened for high grade dysplasia (HGD and esophageal adenocarcinoma (EAC through endoscopic screening, during which multiple esophageal tissue samples are removed for histological analysis. We propose a computational method called the multistage clonal expansion for EAC (MSCE-EAC screening model that is used for screening BE patients in silico to evaluate the effects of biopsy sampling, diagnostic sensitivity, and treatment on disease burden. Our framework seamlessly integrates relevant cell-level processes during EAC development with a spatial screening process to provide a clinically relevant model for detecting dysplastic and malignant clones within the crypt-structured BE tissue. With this computational approach, we retain spatio-temporal information about small, unobserved tissue lesions in BE that may remain undetected during biopsy-based screening but could be detected with high-resolution imaging. This allows evaluation of the efficacy and sensitivity of current screening protocols to detect neoplasia (dysplasia and early preclinical EAC in the esophageal lining. We demonstrate the clinical utility of this model by predicting three important clinical outcomes: (1 the probability that small cancers are missed during biopsy-based screening, (2 the potential gains in neoplasia detection probabilities if screening occurred via high-resolution tomographic imaging, and (3 the efficacy of ablative treatments that result in the curative depletion of metaplastic and neoplastic cell populations in BE in terms of the long-term impact on reducing EAC incidence.

  15. Modelling Tradescantia fluminensis to assess long term survival

    Directory of Open Access Journals (Sweden)

    Alex James


    Full Text Available We present a simple Poisson process model for the growth of Tradescantia fluminensis, an invasive plant species that inhibits the regeneration of native forest remnants in New Zealand. The model was parameterised with data derived from field experiments in New Zealand and then verified with independent data. The model gave good predictions which showed that its underlying assumptions are sound. However, this simple model had less predictive power for outputs based on variance suggesting that some assumptions were lacking. Therefore, we extended the model to include higher variability between plants thereby improving its predictions. This high variance model suggests that control measures that promote node death at the base of the plant or restrict the main stem growth rate will be more effective than those that reduce the number of branching events. The extended model forms a good basis for assessing the efficacy of various forms of control of this weed, including the recently-released leaf-feeding tradescantia leaf beetle (Neolema ogloblini.

  16. Evaluating climate models: Should we use weather or climate observations?

    Energy Technology Data Exchange (ETDEWEB)

    Oglesby, Robert J [ORNL; Erickson III, David J [ORNL


    Calling the numerical models that we use for simulations of climate change 'climate models' is a bit of a misnomer. These 'general circulation models' (GCMs, AKA global climate models) and their cousins the 'regional climate models' (RCMs) are actually physically-based weather simulators. That is, these models simulate, either globally or locally, daily weather patterns in response to some change in forcing or boundary condition. These simulated weather patterns are then aggregated into climate statistics, very much as we aggregate observations into 'real climate statistics'. Traditionally, the output of GCMs has been evaluated using climate statistics, as opposed to their ability to simulate realistic daily weather observations. At the coarse global scale this may be a reasonable approach, however, as RCM's downscale to increasingly higher resolutions, the conjunction between weather and climate becomes more problematic. We present results from a series of present-day climate simulations using the WRF ARW for domains that cover North America, much of Latin America, and South Asia. The basic domains are at a 12 km resolution, but several inner domains at 4 km have also been simulated. These include regions of complex topography in Mexico, Colombia, Peru, and Sri Lanka, as well as a region of low topography and fairly homogeneous land surface type (the U.S. Great Plains). Model evaluations are performed using standard climate analyses (e.g., reanalyses; NCDC data) but also using time series of daily station observations. Preliminary results suggest little difference in the assessment of long-term mean quantities, but the variability on seasonal and interannual timescales is better described. Furthermore, the value-added by using daily weather observations as an evaluation tool increases with the model resolution.

  17. Building long-term and high spatio-temporal resolution precipitation and air temperature reanalyses by mixing local observations and global atmospheric reanalyses: the ANATEM model

    Directory of Open Access Journals (Sweden)

    A. Kuentz


    The ANATEM model has been also evaluated for the regional scale against independent long-term time series and was able to capture regional low-frequency variability over more than a century (1883–2010.

  18. Modeling and evaluation of information systems using coloured petri network

    Directory of Open Access Journals (Sweden)

    Ehsan Zamirpour


    Full Text Available Nowadays with the growth of organizations and their affiliates, the importance of information systems has increased. Functional and non-functional requirements of information systems in an organization are supported. There are literally several techniques to support the functional requirements in terms of software methodologies, but support for the second set of requirements has received little attention. Software Performance Engineering (SPE forum tries to address this issue by presenting software methodologies to support both types of requirements. In this paper, we present a formal model for the evaluation of system performance based on a pragmatic model. Because of supporting the concurrency concepts, petri net has a higher priority than queuing system. For mapping UML to colored Petri net diagram, we use an intermediate graph. The preliminary results indicate that the proposed model may save significant amount of computations.

  19. Evaluation of the St. Lucia geothermal resource: macroeconomic models

    Energy Technology Data Exchange (ETDEWEB)

    Burris, A.E.; Trocki, L.K.; Yeamans, M.K.; Kolstad, C.D.


    A macroeconometric model describing the St. Lucian economy was developed using 1970 to 1982 economic data. Results of macroeconometric forecasts for the period 1983 through 1985 show an increase in gross domestic product (GDP) for 1983 and 1984 with a decline in 1985. The rate of population growth is expected to exceed GDP growth so that a small decline in per capita GDP will occur. We forecast that garment exports will increase, providing needed employment and foreign exchange. To obtain a longer-term but more general outlook on St. Lucia's economy, and to evaluate the benefit of geothermal energy development, we applied a nonlinear programming model. The model maximizes discounted cumulative consumption.

  20. Improved national modelling by short-term measurement campaigns (United States)

    Arheimer, Berit; Lindström, Göran; Strömqvist, Johan; Spângmyr, Henrik


    The Swedish Meteorological and Hydrological Institute (SMHI) produces hydrological predictions in Sweden both within the national early warning services and to the waterpower industry. SMHI is also responsible for delivering high-resolution data to water authorities. Most of the waterbodies of interest to the European Water Framework Directive (WFD) in Sweden do not have monitoring programmes. Thus, modelled data has to be used for expert judgments. Recently, SMHI got a new request from the government to support water authorities with relevant data to fulfill the reporting and assist implementation of environmental goals within the WFD. SMHI then started the internal "Water management programme" in cooperation with water authorities, to better harmonise and develop SMHI databases, monitoring, model systems and internet services for efficient data delivery free of charge. Since the early 70's operational flood forecasts in Sweden has been based on the HBV model, but the environmental sector has other needs which has promoted a new model concept, called HYPE. The model is applied according to the Swedish water authorities classification of waterbodies (at present 17 313 limnic systems), and will be up-dated annually following water authorities requests. The model system delivers a large amount of hydrological, chemical, and physico-chemical variables. However, national monitoring programmes for calibration and validation is only available for 300 discharge stations and 900 grab sampling sites for nutrient concentrations. The HYPE model is a semi-distributed processed-based hydrological model for small-scale and large-scale assessments of water resources and water quality. In the model, the landscape is divided into classes according to soil type, land-use and altitude. In agricultural lands, the soil is divided into three layers, each with individual computations of soil wetness and nutrient processes. The model simulates water flows, and flow and turnover of

  1. Long-term soil organic carbon changes in cereal and ley rotations: model testing (United States)

    Kynding Borgen, Signe; Dörsch, Peter; Krogstad, Tore; Azzaroli Bleken, Marina


    Reliable modeling of soil organic carbon (SOC) dynamics in agroecosystems is crucial to define mitigation strategies related to crop management on the farm scale as well as the regional scale. International climate agreements and national political decisions rely to a large extent on the National Greenhouse gas Inventory Reports that are submitted annually to the UNFCCC. However, lower tier methods are used to estimate SOC changes on cropland in most country reports. The application of mechanistic models in national greenhouse gas inventory estimation requires proper model testing against measurements in order to verify the estimated emissions. Few long-term field experiments measuring SOC stock changes have been conducted in Norway. We evaluate the performance of the Introductory Carbon Balance Model (ICBM) in simulating SOC changes over 60 years in a field experiment conducted in Ås from 1953-2013. The site is located in south-eastern Norway, on the boarder of the boreal and temperate climate zone, where the majority of the country's grain production occurs. The field trial consisted of four rotations: I) continuous cereal, II) cereal + row crops, III) 2 years of ley + 4 years of cereal, IV) 4 years of ley + 2 years of cereal, and four treatments per rotation: a) low NPK, b) high NPK, c) low NPK + FYM, and d) straw (on rotations I and II) or high NPK + FYM (on rotations III and IV). The annual external modifying factor of the decomposition rate was calculated based on daily minimum and maximum temperature, precipitation, relative humidity, wind speed, and net radiation, and adjusted for soil type and crop management according to default ICBM calibration. We present results of simulated C changes for the long term plots and explore options to improve parameter calibration. Finally, we provide suggestions for how problems regarding model verification can be handled with when applying the model on a national scale for inventory reporting.

  2. Losartan prevents heart fibrosis induced by long-term intensive exercise in an animal model.

    Directory of Open Access Journals (Sweden)

    Gemma Gay-Jordi

    Full Text Available RATIONALE: Recently it has been shown that long-term intensive exercise practice is able to induce myocardial fibrosis in an animal model. Angiotensin II is a profibrotic hormone that could be involved in the cardiac remodeling resulting from endurance exercise. OBJECTIVE: This study examined the antifibrotic effect of losartan, an angiotensin II type 1 receptor antagonist, in an animal model of heart fibrosis induced by long-term intense exercise. METHODS AND RESULTS: Male Wistar rats were randomly distributed into 4 experimental groups: Exercise, Exercise plus losartan, Sedentary and Sedentary plus losartan. Exercise groups were conditioned to run vigorously for 16 weeks. Losartan was orally administered daily before each training session (50 mg/kg/day). Time-matched sedentary rats served as controls. After euthanasia, heart hypertrophy was evaluated by histological studies; ventricular collagen deposition was quantified by histological and biochemical studies; and messenger RNA and protein expression of transforming growth factor-β1, fibronectin-1, matrix metalloproteinase-2, tissue inhibitor of metalloproteinase-1, procollagen-I and procollagen-III was evaluated in all 4 cardiac chambers. Daily intensive exercise caused hypertrophy in the left ventricular heart wall and caused collagen deposition in the right ventricle. Additionally, long-term intensive exercise induced a significant increase in messenger RNA expression and protein synthesis of the major fibrotic markers in both atria and in the right ventricle. Losartan treatment was able to reduce all increases in messenger RNA expression and protein levels caused by exercise, although it could not completely reverse the heart hypertrophy. CONCLUSIONS: Losartan treatment prevents the heart fibrosis induced by endurance exercise in training animals.

  3. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico


    The model quality assessment problem consists in the a priori estimation of the overall and per-residue accuracy of protein structure predictions. Over the past years, a number of methods have been developed to address this issue and CASP established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic servers. Estimates could apply to both whole models and individual amino acids. Groups involved in the tertiary structure prediction categories were also asked to assign local error estimates to each predicted residue in their own models and their results are also discussed here. The correlation between the predicted and observed correctness measures was the basis of the assessment of the results. We observe that consensus-based methods still perform significantly better than those accepting single models, similarly to what was concluded in the previous edition of the experiment. © 2009 WILEY-LISS, INC.

  4. An Integrated Multicriteria Decision-Making Approach for Evaluating Nuclear Fuel Cycle Systems for Long-term Sustainability on the Basis of an Equilibrium Model: Technique for Order of Preference by Similarity to Ideal Solution, Preference Ranking Organization Method for Enrichment Evaluation, and Multiattribute Utility Theory Combined with Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Saerom Yoon


    Full Text Available The focus on the issues surrounding spent nuclear fuel and lifetime extension of old nuclear power plants continues to grow. A transparent decision-making process to identify the most suitable nuclear fuel cycle (NFC) is considered to be the key task in the current situation. Through this study, an attempt is made to develop an equilibrium model for the NFC to calculate the material flows based on 1 TWh of electricity production, and to perform integrated multicriteria decision-making method analyses via the analytic hierarchy process technique for order of preference by similarity to ideal solution, preference ranking organization method for enrichment evaluation, and multiattribute utility theory methods. This comparative study is aimed at screening and ranking the three selected NFC options against five aspects: sustainability, environmental friendliness, economics, proliferation resistance, and technical feasibility. The selected fuel cycle options include the pressurized water reactor (PWR) once-through cycle, the PWR mixed oxide cycle, and the pyroprocessing sodium-cooled fast reactor cycle. A sensitivity analysis was performed to prove the robustness of the results and explore the influence of criteria on the obtained ranking. As a result of the comparative analysis, the pyroprocessing sodium-cooled fast reactor cycle is determined to be the most competitive option among the NFC scenarios.
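
    The TOPSIS step of the combined method can be sketched as follows; the decision matrix scores and criterion weights here are hypothetical placeholders, not the study's values:

```python
import numpy as np

# Illustrative TOPSIS ranking sketch. Rows: the three NFC options
# (once-through, MOX, pyro-SFR); columns: five criteria. All numbers
# are invented for illustration; real weights would come from AHP.
X = np.array([
    [0.6, 0.5, 0.9, 0.4, 0.9],
    [0.7, 0.6, 0.6, 0.5, 0.7],
    [0.9, 0.8, 0.5, 0.8, 0.4],
])
weights = np.array([0.25, 0.2, 0.2, 0.2, 0.15])
benefit = np.array([True, True, True, True, True])  # treat all criteria as "larger is better"

# 1. Vector-normalize each criterion column, then apply weights
V = weights * X / np.linalg.norm(X, axis=0)
# 2. Ideal and anti-ideal solutions per criterion
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
# 3. Euclidean distances and relative closeness C in [0, 1]
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)
ranking = np.argsort(-closeness)  # option indices, best first
```

The sensitivity analysis described in the abstract amounts to perturbing `weights` and checking whether `ranking` changes.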

  5. An integrated multicriteria decision-making approach for evaluating nuclear fuel cycle systems for long-term sustainability on the basis of an equilibrium model: Technique for order of preference by similarity to ideal solution, preference ranking organization method for enrichment evaluation, and multiattribute utility theory combined with analytic hierarchy process

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Sae Rom [Dept of Quantum Energy Chemical Engineering, Korea University of Science and Technology (KUST), Daejeon (Korea, Republic of); Choi, Sung Yeol [Ulsan National Institute of Science and Technology, Ulju (Korea, Republic of); Ko, Wonil [Nonproliferation System Development Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)


    The focus on the issues surrounding spent nuclear fuel and lifetime extension of old nuclear power plants continues to grow. A transparent decision-making process to identify the most suitable nuclear fuel cycle (NFC) is considered to be the key task in the current situation. Through this study, an attempt is made to develop an equilibrium model for the NFC to calculate the material flows based on 1 TWh of electricity production, and to perform integrated multicriteria decision-making method analyses via the analytic hierarchy process technique for order of preference by similarity to ideal solution, preference ranking organization method for enrichment evaluation, and multiattribute utility theory methods. This comparative study is aimed at screening and ranking the three selected NFC options against five aspects: sustainability, environmental friendliness, economics, proliferation resistance, and technical feasibility. The selected fuel cycle options include the pressurized water reactor (PWR) once-through cycle, the PWR mixed oxide cycle, and the pyroprocessing sodium-cooled fast reactor cycle. A sensitivity analysis was performed to prove the robustness of the results and explore the influence of criteria on the obtained ranking. As a result of the comparative analysis, the pyroprocessing sodium-cooled fast reactor cycle is determined to be the most competitive option among the NFC scenarios.

  6. [Systemic-psychodynamic model for family evaluation]. (United States)

    Salinas, J L; Pérez, M P; Viniegra, L; Armando Barriguete, J; Casillas, J; Valencia, A


    In this paper a family evaluation instrument called the systemic-psychodynamic family evaluation model is described, and the second stage of the validation study of this instrument, which deals with inter-observer variation, is presented. Twenty families were studied. They were always assessed by the same interviewers, designated as experts. All are family therapy specialists, and their assessment was used as the evaluation reference standard or "gold standard". The observers were psychiatrists without previous training in family therapy. For the purpose of the interview, both experts and observers were blind to the medical diagnoses of the patients. During the first stage of the validation study the observers did not have a reference guide, which resulted in a low concordance rating. For the second stage, a 177-item guide was used and a considerable increase in the concordance rating was observed. Validation studies like this one are of considerable value in increasing the reliability and further utilisation of evaluation instruments of this type.

  7. Solutions of several coupled discrete models in terms of Lamé ...

    Indian Academy of Sciences (India)

    Coupled discrete models are ubiquitous in a variety of physical contexts. We provide an extensive set of exact quasiperiodic solutions of a number of coupled discrete models in terms of Lamé polynomials of arbitrary order. The models discussed are: the coupled Salerno model, the coupled Ablowitz–Ladik model, the coupled 4 ...

  8. Evaluation of long-term water-level declines in basalt aquifers near Mosier, Oregon (United States)

    Burns, Erick R.; Morgan, David S.; Lee, Karl K.; Haynes, Jonathan V.; Conlon, Terrence D.


    are not sustainable, (2) well construction practices that have resulted in leakage from aquifers into springs and streams, and (3) reduction in aquifer recharge resulting from long-term climate variations. Historical well construction practices, specifically open, unlined, uncased boreholes that result in cross-connecting (or commingling) multiple aquifers, allow water to flow between these aquifers. Water flowing along the path of least resistance, through commingled boreholes, allows the drainage of aquifers that previously stored water more efficiently. The study area is in the eastern foothills of the Cascade Range in north central Oregon in a transitional zone between the High Cascades to the west and the Columbia Plateau to the east. The 78-square mile (mi2) area is defined by the drainages of three streams - Mosier Creek (51.8 mi2), Rock Creek (13.9 mi2), and Rowena Creek (6.9 mi2) - plus a small area that drains directly to the Columbia River. The three major components of the study are: (1) a 2-year intensive data collection period to augment previous streamflow and groundwater-level measurements, (2) precipitation-runoff modeling of the watersheds to determine the amount of recharge to the aquifer system, and (3) groundwater-flow modeling and analysis to evaluate the cause of groundwater-level declines and to evaluate possible water resource management strategies. Data collection included the following: 1. Water-level measurements were made in 37 wells. Bi-monthly or quarterly measurements were made in 30 wells, and continuous water-level monitoring instruments were installed in 7 wells. The measurements principally were made to capture the seasonal patterns in the groundwater system, and to augment the available long-term record. 2. Groundwater pumping was measured, reported, or estimated from irrigation, municipal and domestic wells. Flowmeters were installed on 74 percent of all high-capacity irrigation wells in the study area. 3. Borehole geophysical data

  9. European Cohesion Policy: A Proposed Evaluation Model

    Directory of Open Access Journals (Sweden)

    Alina Bouroşu (Costăchescu)


    Full Text Available The current approach of European Cohesion Policy (ECP) is intended to be a bridge between different fields of study, emphasizing the intersection between "the public policy cycle, theories of new institutionalism and the new public management". ECP can be viewed as a focal point between putting into practice the principles of the new governance theory, theories of economic convergence and divergence, and the governance of common goods. After a short introduction defining the concepts used, the author discusses the image of ECP created by applying three different theories, focusing on the structural funds implementation system (SFIS), and directs the discussion to the evaluation part of this policy by proposing a model of performance evaluation of the system, in order to outline key principles for creating effective management mechanisms of ECP.

  10. Evaluation of short term effects of the IROMEC robotic toy for children with developmental disabilities. (United States)

    Klein, Tanja; Gelderblom, Gert Jan; de Witte, Luc; Vanstipelen, Silvie


    Research shows reduced playfulness in children with developmental disabilities. This is a barrier for participation and for children's health and wellbeing. IROMEC is a purposely designed robot to support play in impaired children. The reported study evaluates short-term effects of the IROMEC robot toy supporting play in an occupational therapy intervention for children with developmental disabilities. Two types of play intervention (standard occupational therapy versus robot-facilitated play intervention) were compared regarding their effect on the level of playfulness, on children's general functional development and goal achievement, as well as the therapists' evaluation of the added value of a robot-facilitated play intervention. Three young children took part in this single-subject design study. Evaluation was performed through the Test of Playfulness (ToP), the IROMEC evaluation questionnaire and qualitative evaluation by the therapists. Results confirmed that the IROMEC robot partly met the needs of the children and therapists, and a positive impact on ToP results was found for two children. This suggests robotic toys can support children with developmental disabilities in enriching play. Long-term effect evaluation should verify these positive indications resulting from use of this innovative social robot for children with developmental disabilities, but it also became clear that further development of the robot is required. © 2011 IEEE

  11. Persistent short-term memory defects following sleep deprivation in a drosophila model of Parkinson disease. (United States)

    Seugnet, Laurent; Galvin, James E; Suzuki, Yasuko; Gottschalk, Laura; Shaw, Paul J


    Parkinson disease (PD) is the second most common neurodegenerative disorder in the United States. It is associated with motor deficits, sleep disturbances, and cognitive impairment. The pathology associated with PD and the effects of sleep deprivation impinge, in part, upon common molecular pathways, suggesting that sleep loss may be particularly deleterious to the degenerating brain. Thus we investigated the long-term consequences of sleep deprivation on short-term memory using a Drosophila model of Parkinson disease. Transgenic strains of Drosophila melanogaster. Using the GAL4-UAS system, human alpha-synuclein was expressed throughout the nervous system of adult flies. Alpha-synuclein expressing flies (alpha S flies) and the corresponding genetic background controls were sleep deprived for 12 h at age 16 days and allowed to recover undisturbed for at least 3 days. Short-term memory was evaluated using aversive phototaxis suppression. Dopaminergic systems were assessed using mRNA profiling and immunohistochemistry. MEASUREMENTS AND RESULTS: When sleep deprived at an intermediate stage of the pathology, alpha S flies showed persistent short-term memory deficits that lasted ≥ 3 days. Cognitive deficits were not observed in younger alpha S flies nor in genetic background controls. Long-term impairments were not associated with accelerated loss of dopaminergic neurons. However, mRNA expression of the dopamine receptors dDA1 and DAMB was significantly increased in sleep deprived alpha S flies. Blocking D1-like receptors during sleep deprivation prevented persistent short-term memory deficits. Importantly, feeding flies the polyphenolic compound curcumin blocked long-term learning deficits. These data emphasize the importance of sleep in a degenerating/reorganizing brain and show that pathological processes induced by sleep deprivation can be dissected at the molecular and cellular level using Drosophila genetics.

  12. A generalized one-factor term structure model and pricing of interest rate derivative securities

    NARCIS (Netherlands)

    Jiang, George J.


    The purpose of this paper is to propose a nonparametric interest rate term structure model and investigate its implications for term structure dynamics and the prices of interest rate derivative securities. The nonparametric spot interest rate process is estimated from the observed short-term interest

  13. Evaluating Remotely Sensed Phenological Metrics in a Dynamic Ecosystem Model

    Directory of Open Access Journals (Sweden)

    Hong Xu


    Full Text Available Vegetation phenology plays an important role in regulating processes of terrestrial ecosystems. Dynamic ecosystem models (DEMs) require a representation of phenology to simulate the exchange of matter and energy between the land and atmosphere. Location-specific parameterization with phenological observations can potentially improve the performance of phenological models embedded in DEMs. As ground-based phenological observations are limited, phenology derived from remote sensing can be used as an alternative to parameterize phenological models. It is important to evaluate to what extent remotely sensed phenological metrics capture the phenology observed on the ground. We evaluated six methods based on two vegetation indices (VIs), i.e., the Normalized Difference Vegetation Index and the Enhanced Vegetation Index, for retrieving the phenology of temperate forest in the Agro-IBIS model. First, we compared the remotely sensed phenological metrics with observations at Harvard Forest and found that most of the methods have large biases regardless of the VI used. Only two methods for leaf onset and one method for leaf offset showed moderate performance. When remotely sensed phenological metrics were used to parameterize phenological models, the bias is maintained, and errors propagate to predictions of gross primary productivity and net ecosystem production. Our results show that Agro-IBIS has different sensitivities to leaf onset and offset in terms of carbon assimilation, suggesting it might be better to examine the respective impacts of leaf onset and offset rather than the overall impact of growing season length.
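
    One common family of VI-based retrieval methods detects leaf onset as the date the smoothed VI first crosses a fraction of its seasonal amplitude. This sketch illustrates that idea only; it is not necessarily one of the six methods evaluated in the study, and the time series is synthetic:

```python
# Threshold-crossing retrieval of leaf onset from a synthetic 16-day VI series.
def leaf_onset(doy, vi, frac=0.5):
    """Return day-of-year when VI first exceeds min + frac*(max - min)."""
    threshold = min(vi) + frac * (max(vi) - min(vi))
    for d, v in zip(doy, vi):
        if v >= threshold:
            return d
    return None

doy = list(range(1, 366, 16))  # 23 composite dates, 16-day interval
vi = [0.2] * 8 + [0.4, 0.6, 0.75, 0.8, 0.8, 0.75, 0.6, 0.4] + [0.2] * 7
onset = leaf_onset(doy, vi)
```

The choice of `frac` (and of the VI itself) is exactly the kind of methodological decision that produces the biases against ground observations discussed above.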

  14. Long-term Failure Prediction based on an ARP Model of Global Risk Network (United States)

    Lin, Xin; Moussawi, Alaa; Szymanski, Boleslaw; Korniss, Gyorgy

    Risks that threaten modern societies form an intricately interconnected network. Hence, it is important to understand how risk materializations in distinct domains influence each other. In the paper, we study the global risks network defined by World Economic Forum experts in the form of Stochastic Block Model. We model risks as Alternating Renewal Processes with variable intensities driven by hidden values of exogenous and endogenous failure probabilities. Based on the expert assessments and historical status of each risk, we use Maximum Likelihood Evaluation to find the optimal model parameters and demonstrate that the model considering network effects significantly outperforms the others. In the talk, we discuss how the model can be used to provide quantitative means for measuring interdependencies and materialization of risks in the network. We also present recent results of long-term predictions in the form of predicated distributions of materializations over various time periods. Finally we show how the simulation of ARP's enables us to probe limits of the predictability of the system parameters from historical data and ability to recover hidden variable. Supported in part by DTRA, ARL NS-CTA.
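
    The alternating renewal process at the core of the model can be sketched as follows; the intensities and random seed below are illustrative, not values fitted to the World Economic Forum data:

```python
import random

# Minimal sketch of a single risk as an alternating renewal process (ARP):
# the risk alternates between "normal" (0) and "materialized" (1) states
# with exponentially distributed dwell times.
random.seed(1)

def simulate_arp(rate_fail, rate_recover, horizon):
    """Simulate the ARP over `horizon` time units.

    rate_fail    : intensity of normal -> materialized transitions
    rate_recover : intensity of materialized -> normal transitions
    Returns total time spent in the materialized state.
    """
    t, state, time_failed = 0.0, 0, 0.0
    while t < horizon:
        rate = rate_fail if state == 0 else rate_recover
        dwell = min(random.expovariate(rate), horizon - t)
        if state == 1:
            time_failed += dwell
        t += dwell
        state = 1 - state
    return time_failed

# Long-run materialized fraction should approach rate_fail/(rate_fail+rate_recover)
frac = simulate_arp(0.1, 0.9, 100000) / 100000
```

In the full model the failure intensity of each risk is not constant but is driven by the states of neighboring risks in the network; maximizing the likelihood of the observed on/off histories over those coupled intensities is what the abstract calls Maximum Likelihood Evaluation.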

  15. Risk Modeling Approaches in Terms of Volatility Banking Transactions

    Directory of Open Access Journals (Sweden)

    Angelica Cucşa (Stratulat)


    Full Text Available The inseparability of risk and banking activity has been demonstrated ever since banking systems emerged, and the topic is equally important to current practice and to the future development of the banking sector. Banking sector development takes place within the constraints imposed by the nature and number of existing risks and of those that may arise, which limit banking activity. We intend to develop approaches to analysing risk through mathematical models, including a model for the Romanian capital market based on 10 actively traded picks that will test investor reaction under controlled and uncontrolled conditions of risk aggregated with harmonised factors.

  16. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.


    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.
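
    One simple way to picture how per-technology results might combine into an integrated probability of detection is to assume independent sensing channels; this is an illustrative simplification, not the actual IVSEM formulation, and the probabilities are invented:

```python
from math import prod

# Hypothetical fusion of per-technology detection probabilities into a
# system-level probability of detection, assuming independent channels.
def system_pd(per_tech_pd):
    """Probability that at least one technology detects the event."""
    return 1.0 - prod(1.0 - p for p in per_tech_pd)

# seismic, infrasound, radionuclide, hydroacoustic (illustrative values)
p_detect = system_pd([0.8, 0.4, 0.6, 0.1])
```

Even under this naive independence assumption, the integrated system outperforms its best single technology, which is the "synergy among the technologies" the abstract refers to.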

  17. An evaluation framework for participatory modelling (United States)

    Krueger, T.; Inman, A.; Chilvers, J.


    Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of beneficial outcomes of participation as suggested by the arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models. And we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics). 
We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in

  18. Term structure of sovereign spreads: a contingent claim model

    Directory of Open Access Journals (Sweden)

    Katia Rocha


    Full Text Available This paper proposes a simple structural model to estimate the term structure and the implied default probability of a selected group of emerging countries, which account for 54% of the JPMorgan EMBIG index on average for the period 2000-2005. The real exchange rate dynamic, modeled as a pure diffusion process, is assumed to trigger default. The calibrated model generates sovereign spread curves consistent with market data. The results suggest that the market is systematically overpricing spreads for Brazil by 100 basis points on average, whereas for Mexico, Russia and Turkey the model is able to reproduce the market behavior. Este trabalho propõe um modelo estrutural para estimar a estrutura a termo e a probabilidade implícita de default de países emergentes que representam, em média, 54% do índice EMBIG do JPMorgan no período de 2000-2005. A taxa de câmbio real, modelada como um processo de difusão simples, é considerada como indicativa de default. O modelo calibrado gera a estrutura a termo dos spreads consistente com dados de mercado, indicando que o mercado sistematicamente sobre-estima os spreads para o Brasil em 100 pontos base na média, enquanto para México, Rússia e Turquia reproduz o comportamento do mercado.
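
    The structural idea, that default is triggered the first time the real exchange rate diffuses past a barrier, can be sketched with a Monte Carlo first-passage estimate. All parameters here are illustrative placeholders, not the paper's calibration:

```python
import math
import random

# Stylized first-passage default probability under a lognormal diffusion
# for the real exchange rate. Default occurs the first time the rate
# depreciates to the barrier. Parameters are illustrative, not calibrated.
random.seed(7)

def default_prob(x0=1.0, barrier=0.6, mu=0.0, sigma=0.15,
                 years=5, steps=60, paths=20000):
    dt = years / steps
    hits = 0
    for _ in range(paths):
        x = x0
        for _ in range(steps):
            x *= math.exp((mu - 0.5 * sigma ** 2) * dt
                          + sigma * math.sqrt(dt) * random.gauss(0, 1))
            if x <= barrier:  # default triggered on first barrier crossing
                hits += 1
                break
    return hits / paths

p_default = default_prob()
```

A sovereign spread curve then follows from default probabilities at increasing horizons combined with an assumed recovery rate, which is how a calibrated model of this type can be compared against market spreads.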

  19. Modelled long term trends of surface ozone over South Africa

    CSIR Research Space (South Africa)

    Naidoo, M


    Full Text Available pollutants over the region. Previous monitoring campaigns have described local trends of surface and profile ozone (e.g. Thompson et al, 2007); however results are spatially limited and temporally sparse. The modelling of surface ozone within a large...

  20. Evaluating and ranking threats to the long-term persistence of polar bears (United States)

    Atwood, Todd C.; Marcot, Bruce G.; Douglas, David C.; Amstrup, Steven C.; Rode, Karyn D.; Durner, George M.; Bromaghin, Jeffrey F.


    The polar bear (Ursus maritimus) was listed as a globally threatened species under the U.S. Endangered Species Act (ESA) in 2008, mostly due to the significant threat to their future population viability from rapidly declining Arctic sea ice. A core mandate of the ESA is the development of a recovery plan that identifies steps to maintain viable populations of a listed species. A substantive evaluation of the relative influence of putative threats to population persistence is helpful to recovery planning. Because management actions must often be taken in the face of substantial information gaps, a formalized evaluation hypothesizing potential stressors and their relationships with population persistence can improve identification of relevant conservation actions. To this end, we updated a Bayesian network model previously used to forecast the future status of polar bears worldwide. We used new information on actual and predicted sea ice loss and polar bear responses to evaluate the relative influence of plausible threats and their mitigation through management actions on the persistence of polar bears in four ecoregions. We found that polar bear outcomes worsened over time through the end of the century under both stabilized and unabated greenhouse gas (GHG) emission pathways. Under the unabated pathway (i.e., RCP 8.5), the time it took for polar bear populations in two of four ecoregions to reach a dominant probability of greatly decreased was hastened by about 25 years. Under the stabilized GHG emission pathway (i.e., RCP 4.5), where GHG emissions peak around the year 2040, the polar bear population in the Archipelago Ecoregion of High Arctic Canada never reached a dominant probability of greatly decreased, reinforcing earlier suggestions of this ecoregion’s potential to serve as a long-term refugium. The most influential drivers of adverse polar bear outcomes were declines to overall sea ice conditions and to the marine prey base. Improved sea ice conditions
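
    The Bayesian network combines conditional probabilities across many stressor nodes. A toy two-node fragment (structure and numbers invented purely for illustration, far simpler than the actual model) shows the basic marginalization step:

```python
# Toy Bayesian-network fragment: probability of an adverse population
# outcome marginalized over sea ice state. All numbers are invented.
p_ice_declined = 0.7                         # P(sea ice greatly declined)
p_adverse_given = {True: 0.8, False: 0.2}    # P(adverse outcome | ice state)

p_adverse = sum(
    p_adverse_given[state] * (p_ice_declined if state else 1 - p_ice_declined)
    for state in (True, False)
)
```

In the full model, influence of a driver such as sea ice is assessed by comparing outcome distributions like this one under alternative input scenarios (e.g., RCP 4.5 versus RCP 8.5).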

  1. Modeling long-term effects of hairy vetch cultivation on cotton production in Northwest Louisiana. (United States)

    Ku, Hyun-Hwoi; Jeong, Changyoon; Colyer, Patrick


    The use of hairy vetch as a winter cover crop for cotton production in Northwest Louisiana has contributed to sustaining cotton production as well as improving soil quality. To test the efficacy of hairy vetch (HV) cultivation as a natural N supplement for cotton production, a long-term field experiment lasting 27 years was evaluated using the Denitrification-Decomposition (DNDC) model. Different N fertilization practices, including 0 kg N ha-1 (PL_1), HV alone (PL_2), 44.8 kg N ha-1 (PL_3), and 67.3 kg N ha-1 (PL_4), were compared to evaluate the nitrogen (N) response of cotton yield. Measured crop yield from each treatment was used to calibrate and validate the model. The DNDC model was employed to test the effects of N application scenarios on cotton yields and of HV incorporation on N balance under irrigated and non-irrigated conditions. In model calibration, statistical indices for the model performance on cotton seed yield showed that PL_1 had a normalized root mean square error (NRMSE) of 24.5%, a model efficiency (ME) of 0.51, and a correlation coefficient (r) of 0.87 (P<0.01). The DNDC model was validated with PL_2, PL_3, and PL_4, which had NRMSE values of 18.6%, 16.4% and 15.8%, ME values of 0.19, 0.47 and 0.52, and r values of 0.75, 0.83 and 0.85 (P<0.05), respectively. Estimates of soil organic carbon (SOC) for the HV treatment showed a doubling of SOC content over the 27-year experiment, while the 44.8 kg N ha-1 and 67.3 kg N ha-1 treatments both showed a similar SOC increase of about 25% compared to the control. Based on the scenario analysis, sustainable cotton yields do not require N fertilizer application under HV cultivated fields, and no yield differences were observed between irrigated and non-irrigated conditions. Copyright © 2017. Published by Elsevier B.V.
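
    The calibration statistics reported above (NRMSE as a percentage of the observed mean, Nash-Sutcliffe model efficiency, and Pearson correlation) are standard and can be sketched directly; the yield numbers below are hypothetical, not the study's data:

```python
import math

# Goodness-of-fit statistics of the kind used to calibrate/validate DNDC.
def fit_stats(obs, sim):
    """Return (NRMSE %, Nash-Sutcliffe ME, Pearson r) for paired series."""
    n = len(obs)
    mean_o = sum(obs) / n
    mean_s = sum(sim) / n
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    nrmse = 100.0 * math.sqrt(sse / n) / mean_o
    me = 1.0 - sse / sum((o - mean_o) ** 2 for o in obs)
    cov = sum((o - mean_o) * (s - mean_s) for o, s in zip(obs, sim))
    sd_o = math.sqrt(sum((o - mean_o) ** 2 for o in obs))
    sd_s = math.sqrt(sum((s - mean_s) ** 2 for s in sim))
    r = cov / (sd_o * sd_s)
    return nrmse, me, r

obs = [1200, 1350, 1100, 1500, 1280]   # hypothetical measured yields (kg/ha)
sim = [1150, 1400, 1180, 1420, 1300]   # hypothetical simulated yields
nrmse, me, r = fit_stats(obs, sim)
```

ME can be negative (model worse than predicting the observed mean), which is why an ME of 0.19 for PL_2 still indicates some, if modest, predictive skill.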

  2. Evaluation of Long Term Behaviour of Polymers for Offshore Oil and Gas Applications

    Directory of Open Access Journals (Sweden)

    Le Gac P.-Y.


    Full Text Available Polymers and composites are very attractive for underwater applications, but it is essential to evaluate their long-term behaviour in sea water if the structural integrity of offshore structures is to be guaranteed. Accelerated test procedures are frequently required, and this paper presents three examples showing how the durability of polymers, in the form of fibres, matrix resins in fibre reinforced composites for structural elements, and thermal insulation coatings of flowlines, has been evaluated for offshore use. The influence of the ageing medium, temperature, and hydrostatic pressure is discussed first; then an example of the application of ageing test results to predict the long-term behaviour of the thermal insulation coating of a flowline is presented.

  3. Long-Term Evaluation of SSL Field Performance in Select Interior Projects

    Energy Technology Data Exchange (ETDEWEB)

    Perrin, Tess E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Davis, Robert G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wilkerson, Andrea M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    This GATEWAY project evaluated four field installations to better understand the long-term performance of a number of LED products, which can hopefully stimulate improvements in designing, manufacturing, specifying, procuring, and installing LED products. Field studies provide the opportunity to discover and investigate issues that cannot be simulated or uncovered in a laboratory, but the installed performance over time of commercially available LED products has not been well documented. Improving long-term performance can provide both direct energy savings by reducing the need to over-light to account for light loss and indirect energy savings through better market penetration due to SSL’s competitive advantages over less-efficient light source technologies. The projects evaluated for this report illustrate that SSL use is often motivated by advantages other than energy savings, including maintenance savings, easier integration with control systems, and improved lighting quality.

  4. Progress on source term evaluation of accidental events in the experimental fusion installation ITER

    Energy Technology Data Exchange (ETDEWEB)

    Virot, F.; Barrachin, M.; Vola, D.


    Highlights: • Progress of the IRSN R&D activities related to the safety assessment of the ITER installation. • Simulation of an accidental scenario with the ASTEC code: loss of coolant in port cell and in vacuum vessel. • Location and chemical speciation of beryllium dusts and tritium. - Abstract: The French “Institut de Radioprotection et de Sûreté Nucléaire” (IRSN), in support of the French nuclear safety authority, performs the safety analyses of the ITER experimental installation. We present the progress in the R&D activities related to a better evaluation of the source term in the event of an accident in this installation. These improvements are illustrated by an evaluation of the source term of a LOCA transient with the dedicated ASTEC code.

  5. Evaluation of selected near-term energy-conservation options for the Midwest

    Energy Technology Data Exchange (ETDEWEB)

    Evans, A.R.; Colsher, C.S.; Hamilton, R.W.; Buehring, W.A.


    This report evaluates the potential for implementation of near-term energy-conservation practices for the residential, commercial, agricultural, industrial, transportation, and utility sectors of the economy in twelve states: Illinois, Indiana, Iowa, Kansas, Michigan, Minnesota, Missouri, Nebraska, North Dakota, Ohio, South Dakota, and Wisconsin. The information used to evaluate the magnitude of achievable energy savings includes regional energy use, the regulatory/legislative climate relating to energy conservation, technical characteristics of the measures, and their feasibility of implementation. This work is intended to provide baseline information for an ongoing regional assessment of energy and environmental impacts in the Midwest. 80 references.

  6. Environmental and other evaluations of alternatives for long-term management of stored INEL transuranic waste

    Energy Technology Data Exchange (ETDEWEB)


    This study identifies, develops, and evaluates, in a preliminary manner, alternatives for long-term management of TRU waste stored at the Radioactive Waste Management Complex (RWMC) at the INEL. The evaluations concern waste currently at the RWMC and waste expected to be received by the beginning of the year 1985. The effects of waste that might be received after that date are addressed in an appendix. The technology required for managing the waste, the environmental effects, the risks to the public, the radiological and nonradiological hazards to workers, and the estimated costs are discussed.

  7. Medium-term evaluation of total knee arthroplasty without patellar replacement

    Directory of Open Access Journals (Sweden)

    José Wanderley Vasconcelos


    Full Text Available OBJECTIVE: To evaluate, at mid-term, patients who underwent total knee arthroplasty without patellar resurfacing. METHODS: A retrospective cross-sectional study was conducted of patients who underwent total knee arthroplasty without patellar resurfacing. In all patients, clinical examination was performed according to the protocol of the Knee Society Scoring System, which assessed pain, range of motion, stability, contraction, knee alignment and function, together with radiological evaluation. RESULTS: A total of 36 patients were evaluated. Of these, 7 were operated only on the left knee, 12 only on the right knee and 17 bilaterally, totalling 53 knees. Ages ranged from 26 to 84 years. Of the 53 knees evaluated, 33 (62.26%) had no pain. Maximum flexion averaged 104.7°. No knee had difficulty in active extension. As to alignment along the anatomical axis, twelve knees (22.64%) showed deviation between 0° and 4° varus. Thirty-nine knees (75.49%) showed gait without restriction, and the femorotibial angle ranged between 3° varus and 13° valgus, with an average of 5° valgus. The patellar index ranged from 0.2 to 1.1. CONCLUSION: Total knee arthroplasty without patellar resurfacing provides good results at mid-term evaluation.

  8. Evaluating the Sustainability of Community-Based Long-Term Care Programmes: A Hybrid Multi-Criteria Decision Making Approach

    Directory of Open Access Journals (Sweden)

    Baoan Song


    Full Text Available Sustainability is a crucial factor in Long-Term Care (LTC) programmes: whether a programme has the capability of sustaining a quality service over the long term. To evaluate the sustainability of community-based LTC programmes, a novel hybrid framework is demonstrated with a mixed Multi-Criteria Decision Making (MCDM) technique. Based on an extensive literature review and the fuzzy Delphi method, four pillars of initial criteria and twelve sub-criteria were determined. A weighted hierarchy was then constructed with the Analytic Hierarchy Process (AHP) to constitute the evaluation index system. To demonstrate the framework, a case study of four community-based LTC programmes in Michigan is presented, applying the fuzzy technique for order preference by similarity to an ideal solution (TOPSIS) method. The results indicate that programme P2 has the best potential for sustainability, and that the sub-criteria associated with economy outweigh the other sub-criteria. The sensitivity analysis verifies that the ranking remains stable regardless of fluctuation in the sub-criteria weights, indicating that the evaluation results and the proposed model are accurate and effective. This study develops a comprehensive and effective framework for evaluating community-based LTC programmes from the sustainability perspective.
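
    The closeness-coefficient ranking at the heart of the TOPSIS step above can be sketched as follows. This is a minimal crisp (non-fuzzy) sketch with invented programme scores, criteria and AHP-style weights; the paper's actual fuzzy TOPSIS variant and data are not reproduced here.

```python
import math

def topsis(matrix, weights, benefit):
    # Normalize each column by its Euclidean norm, then apply criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(len(weights))]
    v = [[w * row[j] / n for j, (w, n) in enumerate(zip(weights, norms))] for row in matrix]
    # Ideal (best) and anti-ideal (worst) points per criterion.
    ideal = [max(col) if b else min(col) for b, col in zip(benefit, zip(*v))]
    anti = [min(col) if b else max(col) for b, col in zip(benefit, zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)   # distance to the ideal solution
        d_neg = math.dist(row, anti)    # distance to the anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient in [0, 1]
    return scores

# Hypothetical decision matrix: four programmes x three criteria
# (economy, service quality, staffing cost); the cost criterion is benefit=False.
matrix = [[0.8, 0.6, 0.3], [0.9, 0.7, 0.2], [0.5, 0.8, 0.4], [0.6, 0.5, 0.5]]
weights = [0.5, 0.3, 0.2]          # AHP-style weights, economy dominant
benefit = [True, True, False]
scores = topsis(matrix, weights, benefit)
best = max(range(4), key=lambda i: scores[i])
print(f"closeness scores: {[round(s, 3) for s in scores]}, best: P{best + 1}")
```

    With these illustrative numbers P2 ranks highest; a higher closeness coefficient means an alternative is simultaneously near the ideal and far from the anti-ideal point.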

  9. A multistate model and an algorithm for measuring long-term adherence to medication

    DEFF Research Database (Denmark)

    Jensen, Majken Linnemann; Jørgensen, Marit Eika; Hansen, Ebba Holme


    To develop a multistate model and an algorithm for calculating long-term adherence to medication among patients with a chronic disease....
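
    The record gives no algorithmic detail, so the sketch below is not the authors' multistate model; it shows one common building block for long-term adherence calculations of this kind, the proportion of days covered derived from refill records, with an invented refill history.

```python
def proportion_days_covered(fills, horizon_days):
    """Mark each follow-up day as covered or not from (fill_day, days_supply) pairs."""
    covered = [False] * horizon_days
    for fill_day, supply in fills:
        for d in range(fill_day, min(fill_day + supply, horizon_days)):
            covered[d] = True
    return sum(covered) / horizon_days

# Hypothetical refill history: day of dispensing and days of supply dispensed.
fills = [(0, 30), (35, 30), (80, 30)]
pdc = proportion_days_covered(fills, 120)
print(f"proportion of days covered: {pdc:.2f}")  # → 0.75
```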

  10. Evaluating the quality of scenarios of short-term wind power generation

    DEFF Research Database (Denmark)

    Pinson, Pierre; Girard, R.


    their use in decision-making. So far however, their verification is almost always focused on their marginal distributions for each individual lead time only, thus overlooking their temporal interdependence structure. The shortcomings of such an approach are discussed. Multivariate verification tools, as well...... as diagnostic approaches based on event-based verification are then presented. Their application to the evaluation of various sets of scenarios of short-term wind power generation demonstrates them as valuable discrimination tools....
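
    One standard multivariate verification tool of the kind referred to above is the energy score, which evaluates an ensemble of scenarios jointly across lead times rather than per marginal distribution. The sketch below uses invented scenario data and illustrates the general tool, not the paper's exact evaluation.

```python
import math

def energy_score(scenarios, observed):
    """Energy score of an ensemble of multivariate scenarios vs one observed trajectory.
    Lower is better; it is sensitive to the temporal interdependence structure,
    unlike per-lead-time marginal checks."""
    m = len(scenarios)
    term1 = sum(math.dist(s, observed) for s in scenarios) / m
    term2 = sum(math.dist(a, b) for a in scenarios for b in scenarios) / (2 * m * m)
    return term1 - term2

# Hypothetical 4-lead-time wind power scenarios (normalized output) and the outcome.
scenarios = [[0.2, 0.3, 0.5, 0.6], [0.25, 0.35, 0.45, 0.55], [0.1, 0.2, 0.4, 0.5]]
observed = [0.22, 0.33, 0.48, 0.58]
print(f"energy score: {energy_score(scenarios, observed):.4f}")
```

    A degenerate ensemble equal to the observation scores exactly zero, which makes the score a useful discrimination tool between competing scenario-generation methods.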

  11. Long-term medical costs and life expectancy of acute myeloid leukemia: a probabilistic decision model. (United States)

    Wang, Han-I; Aas, Eline; Howell, Debra; Roman, Eve; Patmore, Russell; Jack, Andrew; Smith, Alexandra


    Acute myeloid leukemia (AML) can be diagnosed at any age, and treatment, which can be given with supportive and/or curative intent, is considered expensive compared with that for other cancers. Despite this, no long-term predictive models have been developed for AML, mainly because of the complexities associated with this disease. The objective of the current study was to develop a model (based on a UK cohort) to predict cost and life expectancy at a population level. The model developed in this study combined a decision tree with several Markov models to reflect the complexity of the prognostic factors and treatments of AML. The model was simulated with a cycle length of 1 month for a time period of 5 years and further simulated until age 100 years or death. Results were compared for two age groups and five different initial treatment intents and responses. Transition probabilities, life expectancies, and costs were derived from a UK population-based specialist registry, the Haematological Malignancy Research Network. Overall, expected 5-year medical costs and life expectancy ranged from £8,170 to £81,636 and 3.03 to 34.74 months, respectively. The economic and health outcomes varied with initial treatment intent, age at diagnosis, trial participation, and study time horizon. The model was validated by using face, internal, and external validation methods. The results show that the model captured more than 90% of the empirical costs, and it demonstrated good fit with the empirical overall survival. Costs and life expectancy of AML varied with patient characteristics and initial treatment intent. The robust AML model developed in this study could be used to evaluate new diagnostic tools/treatments, as well as enable policy makers to make informed decisions. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
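
    The monthly-cycle Markov component, accumulating expected cost and life expectancy over a 5-year horizon, can be sketched as below. The states, transition probabilities and unit costs are invented for illustration; they are not the fitted values from the UK registry model, which also layers a decision tree over several such Markov models.

```python
# States: 0 = remission, 1 = relapse/refractory, 2 = dead (absorbing).
# Hypothetical monthly transition matrix (rows sum to 1) and monthly costs (GBP).
P = [
    [0.93, 0.05, 0.02],
    [0.00, 0.85, 0.15],
    [0.00, 0.00, 1.00],
]
monthly_cost = [1500.0, 4000.0, 0.0]

state = [1.0, 0.0, 0.0]    # cohort starts in remission after induction
total_cost = 0.0
life_months = 0.0
for _ in range(60):        # 5-year horizon, 1-month cycle length
    life_months += state[0] + state[1]            # expected months alive this cycle
    total_cost += sum(s * c for s, c in zip(state, monthly_cost))
    state = [sum(state[i] * P[i][j] for i in range(3)) for j in range(3)]

print(f"expected 5-year cost: £{total_cost:,.0f}, life expectancy: {life_months:.1f} months")
```

    Running separate chains per treatment intent and age group, then weighting them by decision-tree branch probabilities, reproduces the overall structure described in the abstract.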

  12. Challenges in integrating short-term behaviour in a mixed-fishery Management Strategies Evaluation frame: a case study of the North Sea flatfish fishery

    DEFF Research Database (Denmark)

    Andersen, Bo Sølgaard; Vermard, Youen; Ulrich, Clara


    This study presents a fleet-based bioeconomic simulation model of the international mixed flatfish fishery in the North Sea. The model uses a Management Strategies Evaluation framework including a discrete choice model accounting for short-term temporal changes in effort allocation across fisheries...... increase in the economic performance of the individual fleets. This showed the existence of a window of sensitivity of the model to the behaviour assumptions. The study highlights the challenge of implementing an effort allocation model in a general framework of Management Strategies Evaluation for mixed....... A simplified random utility model was used based on the expected revenue (or economic attractiveness) and two tradition parameters related to short- and long-term historical fishing patterns. All three parameters were significant. Even though reactions and adaptations vary between fleets, the estimated...
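
    A random utility model of the kind described, with an expected-revenue attribute and two tradition attributes, reduces to multinomial logit choice probabilities for effort allocation. The attribute values and coefficients below are invented for illustration, not the fitted ones.

```python
import math

def choice_probabilities(attributes, betas):
    """Multinomial logit share of effort allocated to each fishery (metier).
    Utility = b_rev * expected_revenue + b_short * short-term tradition
            + b_long * long-term tradition."""
    utilities = [sum(b * x for b, x in zip(betas, alt)) for alt in attributes]
    m = max(utilities)                       # stabilize exp() against overflow
    expu = [math.exp(u - m) for u in utilities]
    z = sum(expu)
    return [e / z for e in expu]

# Three fisheries: (expected revenue index, short-term tradition, long-term tradition)
attributes = [(1.2, 0.6, 0.4), (0.9, 0.3, 0.5), (0.7, 0.1, 0.1)]
betas = (1.0, 0.8, 0.5)                      # illustrative positive coefficients
probs = choice_probabilities(attributes, betas)
print([round(p, 3) for p in probs])
```

    Within a Management Strategies Evaluation loop, these probabilities would redistribute each fleet's effort every time step as expected revenues change.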

  13. A Probabilistic Palimpsest Model of Visual Short-term Memory (United States)

    Matthey, Loic; Bays, Paul M.; Dayan, Peter


    Working memory plays a key role in cognition, and yet its mechanisms remain much debated. Human performance on memory tasks is severely limited; however, the two major classes of theory explaining the limits leave open questions about key issues such as how multiple simultaneously-represented items can be distinguished. We propose a palimpsest model, with the occurrent activity of a single population of neurons coding for several multi-featured items. Using a probabilistic approach to storage and recall, we show how this model can account for many qualitative aspects of existing experimental data. In our account, the underlying nature of a memory item depends entirely on the characteristics of the population representation, and we provide analytical and numerical insights into critical issues such as multiplicity and binding. We consider representations in which information about individual feature values is partially separate from the information about binding that creates single items out of multiple features. An appropriate balance between these two types of information is required to capture fully the different types of error seen in human experimental data. Our model provides the first principled account of misbinding errors. We also suggest a specific set of stimuli designed to elucidate the representations that subjects actually employ. PMID:25611204

  14. A review of methods used in long-term cost-effectiveness models of diabetes mellitus treatment. (United States)

    Tarride, Jean-Eric; Hopkins, Robert; Blackhouse, Gord; Bowen, James M; Bischof, Matthias; Von Keyserlingk, Camilla; O'Reilly, Daria; Xie, Feng; Goeree, Ron


    Diabetes mellitus is a major healthcare concern from both a treatment and a funding perspective. Although decision makers frequently rely on models to evaluate the long-term costs and consequences associated with diabetes interventions, no recent article has reviewed the methods used in long-term cost-effectiveness models of diabetes treatment. The following databases were searched up to April 2008 to identify published economic models evaluating treatments for diabetes mellitus: OVID MEDLINE, EMBASE and the Thomson's Biosis Previews, NHS EED via Wiley's Cochrane Library, and Wiley's HEED database. Identified articles were reviewed and grouped according to unique models. When a model was applied in different settings (e.g. country) or compared different treatment alternatives, only the original publication describing the model was included. In some cases, subsequent articles were included if they provided methodological advances from the original model. The following data were captured for each study: (i) study characteristics; (ii) model structure; (iii) long-term complications, data sources, methods reporting and model validity; (iv) utilities, data sources and methods reporting; (v) costs, data sources and methods reporting; (vi) model data requirements; and (vii) economic results including methods to deal with uncertainty. A total of 17 studies were identified, 12 of which allowed for the conduct of a cost-effectiveness analysis and a cost-utility analysis. Although most models were Markov-based microsimulations, models differed with respect to the number of diabetes-related complications included. The majority of the studies used a lifetime time horizon and a payer perspective. The DCCT for type 1 diabetes and the UKPDS for type 2 diabetes were the trial data sources most commonly cited for the efficacy data, although several non-randomized data sources were used. While the methods used to derive the efficacy data were commonly reported, less information was

  15. The systemic kainic acid rat model of temporal lobe epilepsy: Long-term EEG monitoring. (United States)

    Van Nieuwenhuyse, B; Raedt, R; Sprengers, M; Dauwe, I; Gadeyne, S; Carrette, E; Delbeke, J; Wadman, W J; Boon, P; Vonck, K


    Animal models reproducing the characteristics of human epilepsy are essential for elucidating the pathophysiological mechanisms. In epilepsy research there is ongoing debate on whether the epileptogenic process is a continuous process rather than a step function. The aim of this study was to assess the progression of epileptogenesis over the long term and to evaluate possible correlations of SE duration and severity with disease progression in the kainic acid model. Rats received repeated KA injections (5 mg/kg) until a self-sustained SE was elicited. Continuous depth EEG recording started before KA injection and continued for 30 weeks. Mean seizure rate progression could be expressed as a sigmoid function and increased from 1 ± 0.2 seizures per day during the second week after SE to 24.4 ± 6.4 seizures per day during week 30. Seizure rate progressed to a plateau phase 122 ± 9 days after SE. However, the individual seizure rate during this plateau phase varied between 14.5 and 48.6 seizures per day. A circadian rhythm in seizure occurrence was observed in all rats. Histological characterization of the dentate gyrus in the KA-treated rats confirmed the presence of astrogliosis and aberrant mossy fiber sprouting. This long-term EEG monitoring study confirms that epileptogenesis is a continuous process rather than a step function. Copyright © 2015 Elsevier B.V. All rights reserved.
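
    The sigmoid seizure-rate progression reported above can be written as a logistic function anchored at the reported plateau rate (24.4 seizures/day) and time to plateau midpoint (~122 days post-SE); the steepness parameter k below is an illustrative assumption, not a fitted value.

```python
import math

def seizure_rate(t_days, r_max=24.4, t_mid=122.0, k=0.03):
    """Logistic (sigmoid) progression of mean daily seizure rate after status
    epilepticus: low early rate, inflection at t_mid, saturation at r_max."""
    return r_max / (1.0 + math.exp(-k * (t_days - t_mid)))

for day in (14, 122, 210):
    print(f"day {day:3d}: {seizure_rate(day):5.1f} seizures/day")
```

    With k = 0.03 the curve passes near the reported ~1 seizure/day at week two and approaches the plateau by week 30, consistent with a continuous rather than step-like epileptogenic process.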

  16. Evaluation of oxidative status in short-term exercises of adolescent athletes

    Directory of Open Access Journals (Sweden)

    K Karacabey


    Full Text Available The aim of the study was to evaluate the effects of short-term exercise on total antioxidant status (TAS), lipid hydroperoxide (LOOHs), total oxidative status (TOS) and oxidative stress index (OSI) in adolescent athletes. A total of 62 adolescents participated in the study. Athletes trained regularly for 2 hours, 3 days a week. All subjects followed a circuit exercise program. Blood samples were collected just before and immediately after the exercise program. Antioxidant status was evaluated by measuring the TAS level in the plasma. Oxidative status was evaluated by measuring the total peroxide level. The percentage ratio of TAS to total peroxide level was accepted as the OSI. Plasma triglyceride, total cholesterol, LDL, HDL and VLDL were measured on an automated chemical analyzer using commercially available kits. There was a significant increase in TOS (p<0.05) and OSI (p<0.01) levels and a significant decrease in TAS levels (p<0.01) compared to the resting state. There were no significant changes in LOOHs levels before and after the short-term exercise. After short-term exercise, the balance between oxidative stress and antioxidant status moves towards oxidative stress as a result of increasing oxidants and decreasing antioxidants.

  17. Radiological evaluation of long term complications of oral rehabilitations of thin ridges with titanium blade implants

    Directory of Open Access Journals (Sweden)

    P. Diotallevi


    Full Text Available Aim: The aim of this study was to assess the sensitivity of orthopantomography (OPT) in the diagnosis of long-term complications in oral rehabilitations with blade implants. Materials and methods: A total of 235 blade implants in 189 patients, inserted between 1988 and 2003, were retrospectively analyzed. The records consisted of a first OPT taken between January and December 2010 and a second one taken 12 months later. The evaluation of implant health considered: integrity of the blade, normal radiological representation of the bone around the implant, and a dense, cortical appearance of the bone around the implant collar. The evaluation of radiological complications considered: implant fracture, bone resorption around the implant, and recession of the bone around the implant collar. Results: The sensitivity of the panoramic evaluation was equal to 100%. The complications detected were 5 cases of peri-implantitis, 9 cases of pericervical bone recession and 3 cases of fracture of the implant body. In the cases of pericervical bone resorption, the radiological check-up 12 months after the first one showed progression of the disease in 6 out of 9 cases, with irreversible implant failure. In subjects with a radiological pattern of implant health there were no complications at the subsequent check-up after 12 months. In the subjects with complications the specificity was equal to 100%. Conclusion: The radiographic evaluation by means of OPT has shown high sensitivity in the diagnosis of long-term complications of oral rehabilitations with blade implants and allows prompt therapeutic intervention. Radiological complications appeared mostly in the long-term check-ups and mainly consisted of recession of the bone around the neck or around the entire implant. More rarely, implant fractures occurred which, in the case of blades, sometimes were not associated with any clinical symptoms: therefore, postsurgical evaluation should not be separated from

  18. Nursing competence: psychometric evaluation using Rasch modelling. (United States)

    Müller, Marianne


    To test the psychometric properties and evaluate the German version of the Nurse Competence Scale. Nursing competence is an important factor for high-quality healthcare. However, only a few instruments are available that attempt to assess nurse competence, and there is limited knowledge about the psychometric quality of any of these instruments. A cross-sectional survey of 679 nurses was used. Analysis of the psychometric properties of the 73-item Nurse Competence Scale was undertaken using confirmatory factor analyses and Rasch modelling, with data generated in a study in 2007. The 7-factor model of the Nurse Competence Scale could not be confirmed. However, six scales consisting of 54 items demonstrated adequate fit to the Rasch model. The six subscales could also be combined into an overall competence scale. There are concerns about the psychometric properties of the Nurse Competence Scale. The reduced set of items removes redundancy among items, is free from item bias and constitutes six unidimensional scales. © 2012 Blackwell Publishing Ltd.
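
    The dichotomous Rasch model underlying this kind of analysis expresses the probability of endorsing or passing an item as a logistic function of the difference between person ability and item difficulty on a common logit scale. The values below are invented for illustration.

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: probability that a person with competence theta
    passes an item of difficulty b (both on the same logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical: a nurse one logit above an item's difficulty
p = rasch_probability(theta=1.0, b=0.0)
print(f"{p:.3f}")  # → 0.731
```

    Item fit to this model is what "adequate fit to the Rasch model" refers to: observed response patterns should match these predicted probabilities, and misfitting or biased items are the candidates for removal.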

  19. Solutions of several coupled discrete models in terms of Lamé ...

    Indian Academy of Sciences (India)

    We provide an extensive set of exact quasiperiodic solutions of a number of coupled discrete models in terms of Lamé polynomials of arbitrary order. The models discussed are: (i) coupled Salerno model, (ii) coupled Ablowitz–Ladik model, ...

  20. Effects of Causal Attributions for Success on First-Term College Performance: A Covariance Structure Model. (United States)

    Platt, Craig W.


    A structural model of the consequences of success attributions--derived from B. Weiner's attribution model--was tested using 208 first-term college students. Although the hypothesized model was rejected based on a chi-square goodness-of-fit test, a specification search yielded a model that fit the data and was consistent with Weiner's theory.…

  1. Development and Evaluation of Amino Acid Molecular Models

    Directory of Open Access Journals (Sweden)

    Aparecido R. Silva


    Full Text Available The comprehension of the structure and function of proteins has a tight relationship with the development of structural biology. However, biochemistry students usually find it difficult to visualize the structures when they use only the schematic drawings of didactic books. The representation of three-dimensional structures of some biomolecules with ludic models, built with representative units, has given students and teachers a successful experience in better visualizing and correlating the structures to the real molecules. The present work shows the developed models and the process to produce the representative units of the main amino acids at industrial scale. The design and applicability of the representative units were discussed with many teachers, and some suggestions were implemented in the models. The preliminary evaluation and the perspective of utilization by researchers show that the work is in the right direction. At the actual stage, the models are defined, prototypes have been made and will be presented at this meeting. The moulds for the units are at the final stage of construction and trial in specialized tool facilities. The last term will consist of an effective evaluation of the didactic tool for the teaching/learning process in Structural Molecular Biology. The evaluation protocol is being elaborated, containing simple and objective questions similar to those used in research on science teaching.

  2. Comparison of two generation-recombination terms in the Poisson-Nernst-Planck model

    Energy Technology Data Exchange (ETDEWEB)

    Lelidis, I. [Solid State Section, Department of Physics, University of Athens, Panepistimiopolis, Zografos, Athens 157 84 (Greece); Department of Applied Science and Technology, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Torino (Italy); Universite de Picardie Jules Verne, Laboratoire de Physique des Systemes Complexes, 33 rue Saint-Leu 80039, Amiens (France); Barbero, G. [Department of Applied Science and Technology, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Torino (Italy); Sfarna, A. [Solid State Section, Department of Physics, University of Athens, Panepistimiopolis, Zografos, Athens 157 84 (Greece)


    Two phenomenological forms proposed to take into account the generation-recombination phenomenon of ions are investigated. The first form models the phenomenon as a chemical reaction, containing two coefficients describing the dissociation of neutral particles into ions and the recombination of ions to give neutral particles. The second form is based on the assumption that, in thermodynamic equilibrium, a well-defined density of ions is stable. Any deviation from the equilibrium density gives rise to a source term proportional to the deviation, whose phenomenological coefficient plays the role of a lifetime. The analysis is performed by evaluating the electrical response of an electrolytic cell to an external stimulus for both forms. For simplicity we assume that the electrodes are blocking, that there is only one group of negative and positive ions, and that the negative ions are immobile. For the second form, two cases are considered: (i) the generation-recombination phenomenon is due to an intrinsic mechanism, and (ii) the production of ions is triggered by an external source of energy, as in a solar cell. We show that the predictions of the two models differ at the impedance as well as at the admittance level. In particular, the first model predicts the existence of two plateaux for the real part of the impedance, whereas the second one predicts just one. It follows that impedance spectroscopy measurements could give information on which model is valid for the generation-recombination of ions.
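
    Under notation assumed here for illustration (with $n_0$ the density of neutral particles, $n_p$ and $n_m$ the densities of positive and negative ions, $N_{\mathrm{eq}}$ the equilibrium ion density and $n$ the actual ion density), the two source terms described above take the following generic forms; the paper's exact symbols may differ.

```latex
% Form 1: mass-action (chemical-reaction) source term, with a dissociation
% coefficient k_d acting on the neutral particles and a recombination
% coefficient k_r acting on the ion pair:
S_{1} = k_{d}\, n_{0} - k_{r}\, n_{p}\, n_{m}
% Form 2: linear relaxation towards the equilibrium density, with the
% phenomenological coefficient written as a lifetime \tau:
S_{2} = \frac{N_{\mathrm{eq}} - n}{\tau}
```

    Linearizing $S_1$ around equilibrium shows why the two forms agree for small perturbations but diverge, as the abstract notes, in the full impedance response.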

  3. A probabilistic quantitative risk assessment model for the long-term work zone crashes. (United States)

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo


    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk, combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree and consequence estimation models. There are seven intermediate events--age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S)--in the event tree. Since the estimated probability of some intermediate events may carry large uncertainty, it can be characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on Southeast Michigan work zone crash data is carried out. The numerical results show that there would be a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed were slowed down by 20%. In addition, there would be a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT were reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in casualty risk mitigation. 2010 Elsevier Ltd. All rights reserved.
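
    The frequency-times-consequence structure of such an event tree can be sketched for a single intermediate event (severity, S). The base crash frequency, branch probabilities and casualty multipliers below are invented for illustration, not the Southeast Michigan estimates; the full model chains all seven intermediate events the same way.

```python
# Hypothetical event-tree fragment for one long-term work zone: the crash
# frequency is split across the branches of the severity event, and each
# branch's frequency is multiplied by its expected casualty count.
crashes_per_year = 12.0
severity_branches = {
    "fatality": 0.01,
    "injury": 0.24,
    "property damage only": 0.75,
}
casualties_per_crash = {"fatality": 1.2, "injury": 1.5, "property damage only": 0.0}

for outcome, p in severity_branches.items():
    frequency = crashes_per_year * p                       # events per year
    societal = frequency * casualties_per_crash[outcome]   # expected casualties/year
    print(f"{outcome:>22}: {frequency:5.2f}/yr, {societal:5.2f} casualties/yr")
```

    Replacing the fixed branch probabilities with random variables, as the paper does for uncertain intermediate events, turns these point frequencies into distributions of individual and societal risk.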

  4. Evaluation of turbulence models for turbomachinery unsteady three-dimensional flows simulation; Evaluation de modeles de turbulence pour la simulation d'ecoulements tridimensionnels instationnaires en turbomachines

    Energy Technology Data Exchange (ETDEWEB)

    Dano, C.


    The objective of this thesis is to evaluate k-ε, k-l and k-ω low-Reynolds-number two-equation turbulence models for turbomachinery flow simulation. A quadratic nonlinear k-l model is also implemented in this study. We analyse the capacity of two-equation turbulence models to predict turbomachinery flows and wakes. We are interested more particularly in the unsteady three-dimensional configuration with rotor-stator interactions. A Gaussian distribution reproduces the upstream wake. This analysis is carried out in terms of prediction quality but also in terms of numerical behaviour. Turbine and compressor configurations are tested. (author)

  5. Modelling seagrass growth and development to evaluate transplanting strategies for restoration. (United States)

    Renton, Michael; Airey, Michael; Cambridge, Marion L; Kendrick, Gary A


    Seagrasses are important marine plants that are under threat globally. Restoration by transplanting vegetative fragments or seedlings into areas where seagrasses have been lost is possible, but long-term trial data are limited. The goal of this study is to use available short-term data to predict the long-term outcomes of transplanting seagrass. A functional-structural plant model of seagrass growth that integrates data collected from short-term trials and experiments is presented. The model was parameterized for the species Posidonia australis; a limited validation of the model against independent data and a sensitivity analysis were conducted, and the model was used for a preliminary evaluation of different transplanting strategies. The limited validation was successful, and reasonable long-term outcomes could be predicted based only on short-term data. This approach to modelling seagrass growth and development enables long-term predictions of the outcomes of different transplanting strategies, even when empirical long-term data are difficult or impossible to collect. More validation is required to improve confidence in the model's predictions, and the inclusion of more mechanistic detail will extend the model's usefulness. Marine restoration represents a novel application of functional-structural plant modelling.

  6. A review of methodologies applied in Australian practice to evaluate long-term coastal adaptation options

    Directory of Open Access Journals (Sweden)

    Timothy David Ramm


    Full Text Available Rising sea levels have the potential to alter coastal flooding regimes around the world, and local governments are beginning to consider how to manage uncertain coastal change. In doing so, there is increasing recognition that such change is deeply uncertain and unable to be reliably described with probabilities or a small number of scenarios. Characteristics of methodologies applied in Australian practice to evaluate long-term coastal adaptation options are reviewed and benchmarked against two state-of-the-art international methods suited to conditions of uncertainty (Robust Decision Making and Dynamic Adaptive Policy Pathways). Seven of the ten Australian case studies assumed that uncertain parameters, such as sea level rise, could be described deterministically or stochastically when identifying risk and evaluating adaptation options across multi-decadal periods. This basis is not considered sophisticated enough for long-term decision-making, implying that Australian practice needs to increase the use of scenarios to explore a much larger uncertainty space when assessing the performance of adaptation options. Two Australian case studies mapped flexible adaptation pathways to manage uncertainty, and there remains an opportunity to incorporate quantitative methodologies to support the identification of risk thresholds. The contextual framing of risk, including the approach taken to identify risk (top-down or bottom-up) and the treatment of uncertain parameters, was found to be a fundamental characteristic that influenced the methodology selected to evaluate adaptation options. The small sample of case studies available suggests that long-term coastal adaptation in Australia is in its infancy, and there is a timely opportunity to guide local government towards robust methodologies for developing long-term coastal adaptation plans.

  7. Optimising a hydrostatic ocean model for long-term climate runs of glacier-fjord dynamics (United States)

    Siegfried, Merten; Burchard, Hans


Glacier-fjord systems in Greenland contribute substantially to the total ice discharge via submarine melting and calving. Especially in South East Greenland, where warm saline water from the Atlantic enters deep narrow fjords at the bottom, melt and calving rates of marine-terminating glaciers are high. The subglacial discharge of fresh and cold water at the glacier's grounding line drives a buoyant plume, which rises turbulently. Due to entrainment of ambient warm Atlantic water, the higher thermal forcing drastically increases submarine melting and induces a circulation in the fjord. Besides this buoyant plume, pycnocline displacements and stratification profiles for temperature and salinity at the fjord's mouth, as well as bathymetry, winds and tides, are relevant to fjord dynamics. We investigate the long-term response of generic glacier-fjord pairs to climate change, varying bathymetry and external forcings. The ultimate goal is a fjord model coupled to a comprehensive ice-sheet model of intermediate complexity, running for up to a thousand years. To this end, we examine the fjord dynamics using the hydrostatic General Estuarine Transport Model (GETM). For the sake of efficiency, (I) across-fjord effects are neglected by modelling only the along-fjord direction, and (II) a vertically adaptive grid is employed, requiring fewer vertical layers. A modified plume model including a melt-rate parameterisation has been implemented to handle non-hydrostatic effects at the glacier front caused by the highly buoyant fresh subglacial discharge. Further, the shifting of the grounding line changes the bathymetry at the glacier front and thereby affects the submarine melt rate. We resolve this with a newly implemented horizontally adaptive grid following the glacier's movement. We present implementation details of our modified plume model and compare the output of our optimised hydrostatic model with non-hydrostatic model results near the glacier front.

  8. Contribution to evaluation of power generator insulation aging due to long term service

    Energy Technology Data Exchange (ETDEWEB)

    Coletti, G.; Guastavino, F. (Genoa Univ. (Italy). Dip. di Ingegneria Elettrica)


An evaluation of the insulating-system status of two large fossil-fuel power plant generators with a service life longer than 30 years is presented. Both insulations were based on mica splittings: one used a thermosetting resin, while the other used a thermoplastic compound. A purposely devised testing program was carried out both on semicoils (and on smaller samples) taken from the windings which had been in long-term service and on identical reference semicoils kept in storage for the same period. The testing program involved visual, acoustic and microscopic examinations; partial discharge, loss tangent and depolarization current measurements; a dielectric proof test; and thermo-physical analyses. The results facilitated qualitative assessments of the observed ageing mechanisms and thus contributed to a wider diagnostic evaluation of the winding insulation status. In parallel, it was also possible to evaluate the ability of the individual testing procedures to discriminate between service-aged and non-aged insulation systems.

  9. RTMOD: Real-Time MODel evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Graziani, G; Galmarini, S. [Joint Research centre, Ispra (Italy); Mikkelsen, T. [Risoe National Lab., Wind Energy and Atmospheric Physics Dept. (Denmark)


The 1998-1999 RTMOD project developed a system based on automated statistical evaluation for the intercomparison of real-time forecasts produced by long-range atmospheric dispersion models for national nuclear emergency predictions of cross-boundary consequences. The background of RTMOD was the 1994 ETEX project, which involved about 50 models run at institutes around the world to simulate two real tracer releases covering a large part of the European territory. In the preliminary phase of ETEX, three dry runs (i.e. real-time simulations of fictitious releases) were carried out. At that time, the World Wide Web was not available to all the exercise participants, and plume predictions were therefore submitted to JRC-Ispra by fax and regular mail for subsequent processing. The rapid development of the World Wide Web in the second half of the nineties, together with the experience gained during the ETEX exercises, suggested the development of this project. RTMOD featured a web-based user-friendly interface for data submission and an interactive program module for display, intercomparison and analysis of the forecasts. RTMOD focussed on model intercomparison of concentration predictions at the nodes of a regular grid with 0.5 degrees of resolution in both latitude and longitude, the domain extending from 5W to 40E and 40N to 65N. Hypothetical releases were notified to the 28 model forecasters around the world via the web with one day's advance warning. They then accessed the RTMOD web page for detailed information on the actual release, uploaded their predictions to the RTMOD server as soon as possible, and could start their intercomparison analysis with other modelers shortly afterwards. When additional forecast data arrived, existing statistical results were recalculated to include the influence of all available predictions. The new web-based RTMOD concept has proven useful as a practical decision-making tool for real-time

  10. Biological Modeling As A Method for Data Evaluation and ... (United States)

Biological Models, evaluating consistency of data and integrating diverse data, examples of pharmacokinetics and response and pharmacodynamics

  11. Standard practice for digital detector array performance evaluation and long-term stability

    CERN Document Server

    American Society for Testing and Materials. Philadelphia


    1.1 This practice describes the evaluation of DDA systems for industrial radiology. It is intended to ensure that the evaluation of image quality, as far as this is influenced by the DDA system, meets the needs of users, and their customers, and enables process control and long term stability of the DDA system. 1.2 This practice specifies the fundamental parameters of Digital Detector Array (DDA) systems to be measured to determine baseline performance, and to track the long term stability of the DDA system. 1.3 The DDA system performance tests specified in this practice shall be completed upon acceptance of the system from the manufacturer and at intervals specified in this practice to monitor long term stability of the system. The intent of these tests is to monitor the system performance for degradation and to identify when an action needs to be taken when the system degrades by a certain level. 1.4 The use of the gages provided in this standard is mandatory for each test. In the event these tests or ga...

  12. Treatment modalities and evaluation models for periodontitis (United States)

    Tariq, Mohammad; Iqbal, Zeenat; Ali, Javed; Baboota, Sanjula; Talegaonkar, Sushama; Ahmad, Zulfiqar; Sahni, Jasjeet K


Periodontitis is the most common localized dental inflammatory disease, associated with several pathological conditions such as inflammation of the gums (gingivitis), degeneration of the periodontal ligament and dental cementum, and alveolar bone loss. In this perspective, the various preventive and treatment modalities, including oral hygiene, gingival irrigation, mechanical instrumentation, full-mouth disinfection, host modulation and antimicrobial therapy, which are used either as adjunctive treatments or as stand-alone therapies in the non-surgical management of periodontal infections, are discussed. Intra-pocket sustained-release systems have emerged as a novel paradigm for future research. In this article, special consideration is given to different locally delivered antimicrobial and anti-inflammatory medications which are either commercially available or currently under consideration for Food and Drug Administration (FDA) approval. The various in vitro dissolution models and microbiological strains investigated to mimic the infected and inflamed periodontal cavity and to predict the in vivo performance of treatment modalities are also discussed. Animal models that have been employed to explore the pathology at different stages of periodontitis and to evaluate its treatment modalities are highlighted as well. PMID:23373002

  13. Rainwater harvesting: model-based design evaluation. (United States)

    Ward, S; Memon, F A; Butler, D


The rate of uptake of rainwater harvesting (RWH) in the UK has been slow to date, but is expected to gain momentum in the near future. The designs of two different new-build rainwater harvesting systems, based on simple methods, are evaluated using three different design methods, including a continuous simulation modelling approach. The RWH systems are shown to fulfil 36% and 46% of WC demand. Financial analyses reveal that RWH systems within large commercial buildings may be more financially viable than smaller domestic systems. Design methods based on simple approaches are found to generate tank sizes substantially larger than those from continuous simulation. Comparison of the actual tank sizes with those calculated using continuous simulation established that the tanks installed are oversized for their associated demand level and catchment size. Oversizing tanks can lead to excessive system capital costs, which currently hinders the uptake of systems. Furthermore, it is demonstrated that the catchment area size is often overlooked when designing UK-based RWH systems. In light of these findings, a transition from the use of simple tools to continuous simulation models is recommended.
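A continuous-simulation tank-sizing check of the kind recommended above can be sketched as a daily yield-after-spillage mass balance. This is a minimal illustration, not the paper's actual model; the function name, the runoff coefficient, and the "1 mm on 1 m² = 1 L" conversion are assumptions for the sketch.

```python
def simulate_rwh(rainfall_mm, roof_area_m2, tank_l, demand_l, runoff_coeff=0.85):
    """Daily yield-after-spillage mass balance for a rainwater tank.

    Returns the fraction of (WC) demand met over the rainfall series.
    """
    storage = 0.0
    supplied = 0.0
    for rain in rainfall_mm:
        inflow = rain * roof_area_m2 * runoff_coeff  # 1 mm on 1 m2 = 1 L
        storage = min(storage + inflow, tank_l)      # spill any excess first
        use = min(storage, demand_l)                 # then draw the day's demand
        storage -= use
        supplied += use
    return supplied / (demand_l * len(rainfall_mm))
```

Running the balance over a long rainfall record for a range of tank sizes exposes the diminishing returns of oversizing that the paper reports.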


    Directory of Open Access Journals (Sweden)

    Zdenko Lončarić


Evaluation of the quality of manures, composts and growing media should include enough properties to enable optimal use from both productivity and environmental points of view. The aim of this paper is to describe the basic structure of an organic fertilizer (and growing media) evaluation model, to present the model with an example comparing different manures, and to illustrate the use of a plant growth experiment for calculating the impact of pH and EC of growing media on lettuce growth. The basic structure of the model includes selection of quality indicators, interpretation of indicator values, and integration of the interpreted values into new indexes. The first step includes data input and selection of available data as basic or additional indicators, depending on possible use as fertilizer or growing media. The second part of the model uses the inputs to calculate derived quality indicators. The third step integrates the values into three new indexes: a fertilizer index, a growing media index, and an environmental index. All three indexes are calculated on the basis of three different groups of indicators: basic value indicators, additional value indicators and limiting factors. The possible range of index values is 0-10, where 0-3 means low, 3-7 medium and 7-10 high quality. Comparing fresh and composted manures, higher fertilizer and environmental indexes were determined for composted manures; the highest fertilizer index was determined for composted pig manure (9.6), whereas the lowest was for fresh cattle manure (3.2). Composted manures had a high environmental index (6.0-10) for conventional agriculture, but some had no value (environmental index = 0) for organic agriculture because of too-high zinc, copper or cadmium concentrations. Growing media indexes were determined according to their impact on lettuce growth. Growing media with different pH and EC had highly significant impacts on the height, dry matter mass and leaf area of lettuce seedlings. The highest lettuce
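The three-step integration described above (interpret indicators, weight them, apply limiting factors, classify the 0-10 result) might be sketched as follows. The weights and the multiplicative treatment of limiting factors are illustrative assumptions, not the published model's coefficients.

```python
def quality_index(basic, additional, limiting):
    """Combine interpreted indicator scores (each 0-10) into one index.

    basic / additional: lists of 0-10 scores; limiting: multipliers in [0, 1]
    (a limiting factor of 0 zeroes the index, as for excess heavy metals).
    The 0.7 / 0.3 weights are illustrative, not the model's published values.
    """
    score = 0.7 * sum(basic) / len(basic) + 0.3 * sum(additional) / len(additional)
    for factor in limiting:
        score *= factor
    return score

def quality_class(index):
    """Map a 0-10 index onto the paper's low / medium / high bands."""
    if index < 3:
        return "low"
    if index < 7:
        return "medium"
    return "high"
```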

  15. Evaluation of clinical information modeling tools. (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak


Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives developing tools for clinical information modeling that were identified, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria, related to displaying semantic relationships between concepts and communication with terminology servers, had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  16. A short-term model of COPD identifies a role for mast cell tryptase (United States)

    Beckett, Emma L.; Stevens, Richard L.; Jarnicki, Andrew G.; Kim, Richard Y.; Hanish, Irwan; Hansbro, Nicole G.; Deane, Andrew; Keely, Simon; Horvat, Jay C.; Yang, Ming; Oliver, Brian G.; van Rooijen, Nico; Inman, Mark D.; Adachi, Roberto; Soberman, Roy J.; Hamadi, Sahar; Wark, Peter A.; Foster, Paul S.; Hansbro, Philip M.


    Background Cigarette smoke-induced chronic obstructive pulmonary disease (COPD) is a life-threatening inflammatory disorder of the lung. The development of effective therapies for COPD has been hampered by the lack of an animal model that mimics the human disease in a short time-frame. Objectives To create an early onset mouse model of cigarette smoke-induced COPD that develops the hallmark features of the human condition in a short time-frame. To use this model to better understand pathogenesis and the roles of macrophages and mast cells (MCs) in COPD. Methods Tightly controlled amounts of cigarette smoke were delivered to the airways of mice, and the development of the pathological features of COPD was assessed. The roles of macrophages and MC tryptase in pathogenesis were evaluated using depletion and in vitro studies and MC protease-6 deficient mice. Results After just 8 weeks of smoke exposure, wild-type mice developed chronic inflammation, mucus hypersecretion, airway remodeling, emphysema, and reduced lung function. These characteristic features of COPD were glucocorticoid-resistant and did not spontaneously resolve. Systemic effects on skeletal muscle and the heart, and increased susceptibility to respiratory infections also were observed. Macrophages and tryptase-expressing MCs were required for the development of COPD. Recombinant MC tryptase induced pro-inflammatory responses from cultured macrophages. Conclusion A short-term mouse model of cigarette smoke-induced COPD was developed in which the characteristic features of the disease were induced more rapidly than existing models. The model can be used to better understand COPD pathogenesis, and we show a requirement for macrophages and tryptase-expressing MCs. PMID:23380220

  17. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species (United States)

    Adams, Matthew P.; Collier, Catherine J.; Uthicke, Sven; Ow, Yan X.; Langlois, Lucas; O'Brien, Katherine R.


    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.
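As an illustration of a model with directly interpretable parameters, a Gaussian thermal-response curve exposes Pmax and Topt explicitly. The grid-search fit below is a sketch under assumed search ranges and a deliberately simple functional form, not one of the twelve published models.

```python
import math

def fit_thermal_optimum(temps, rates):
    """Grid-search least-squares fit of P(T) = Pmax * exp(-(T - Topt)^2 / (2 w^2)).

    Pmax and Topt are the biologically meaningful parameters the abstract
    argues models should expose; search ranges are illustrative assumptions.
    """
    pmax = max(rates)  # take the observed maximum as Pmax
    best = None
    for topt in [t / 10 for t in range(150, 451)]:   # candidate optima, 15-45 degC
        for w in [v / 10 for v in range(10, 151)]:   # candidate curve widths
            sse = sum((r - pmax * math.exp(-(t - topt) ** 2 / (2 * w * w))) ** 2
                      for t, r in zip(temps, rates))
            if best is None or sse < best[0]:
                best = (sse, topt, w)
    return {"Pmax": pmax, "Topt": best[1], "width": best[2]}
```

Because Topt falls directly out of the fit, it can be compared across species or fed into other model types, which is exactly the transferability the authors ask of model parameters.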

  18. Evaluation of the performance of DIAS ionospheric forecasting models

    Directory of Open Access Journals (Sweden)

    Tsagouri Ioanna


Nowcasting and forecasting ionospheric products and services for the European region have been regularly provided since August 2006 through the European Digital upper Atmosphere Server (DIAS). Currently, DIAS ionospheric forecasts are based on the online implementation of two models: (i) the solar-wind-driven autoregression model for ionospheric short-term forecast (SWIF), which combines historical and real-time ionospheric observations with solar-wind parameters obtained in real time at the L1 point from the NASA ACE spacecraft, and (ii) the geomagnetically correlated autoregression model (GCAM), a time-series forecasting method driven by a synthetic geomagnetic index. In this paper we investigate the operational ability and the accuracy of both DIAS models, carrying out a metrics-based evaluation of their performance under all possible conditions. The analysis was based on a systematic comparison of the models' predictions with actual observations obtained over almost one solar cycle (1998-2007) at four European ionospheric locations (Athens, Chilton, Juliusruh and Rome), and on a comparison of the models' performance against two simple prediction strategies, median- and persistence-based predictions, during storm conditions. The results verify the operational validity of both models and quantify their prediction accuracy under all possible conditions, in support of operational applications as well as comparative studies assessing or expanding current ionospheric forecasting capabilities.
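Benchmarking a model against persistence- or median-based predictions, as done here, amounts to computing a skill score. A minimal RMSE-based version could look like the following; the exact metric used in the paper is not specified here, so this is an assumed formulation.

```python
def rmse(pred, obs):
    """Root-mean-square error between a forecast and observations."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

def skill_vs_persistence(model_pred, obs, persistence_pred):
    """Relative RMSE improvement of the model over a persistence forecast.

    Positive values mean the model beats 'tomorrow equals today';
    zero or negative values mean it adds no value over the naive baseline.
    """
    return 1.0 - rmse(model_pred, obs) / rmse(persistence_pred, obs)
```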

  19. The Evaluation of Arab Political Leaders' Speeches during the Arab Spring in Terms of Semantics

    Directory of Open Access Journals (Sweden)

    Ayşe İSPİR


The texts of political speeches are purposeful linguistic products which aim at fundamental targets such as influence and guidance. In this study, Arab political leaders' public-directed statements during the Arab Spring, which was sparked by society and in which society played an active role, have been evaluated in terms of semantics. In this context, the texts of the speeches were examined semantically at the word, sentence and text levels. The sample of the research consists of three speeches by the Arab Spring's ousted leaders: Zeyne-l Âbidîn, Hosni Mubarak and Muammar Gaddafi. As a result of the research, it has been concluded that the speeches examined contain semantic features which aim at realizing political discourse targets such as exerting influence and guidance. Furthermore, it has been observed that non-text elements such as the leaders' political situation and their images affected the speeches.

  20. Modeling of drying kiwi slices and its sensory evaluation. (United States)

    Mahjoorian, Abbas; Mokhtarian, Mohsen; Fayyaz, Nasrin; Rahmati, Fatemeh; Sayyadi, Shabnam; Ariaii, Peiman


In this study, monolayer drying of kiwi slices was simulated in a laboratory-scale hot-air dryer. The drying process was carried out at three temperatures: 50, 60, and 70°C. After drying, the experimental data were first fitted to 11 well-known drying models. The results indicated that the two-term model performed better than the other models in describing the moisture ratio (average R² of .998). This study also used an artificial neural network (ANN) to predict the moisture ratio (y) of dried kiwi slices from the drying time and temperature inputs (x1, x2). Two activation functions widely used in engineering calculations, logsig and tanh, were applied. The results revealed that the logsig activation function with 13 neurons in the first and second hidden layers was the best configuration for predicting the moisture ratio; this network predicted the moisture ratio with an R² of .997. Furthermore, the acceptability of the kiwi slices was assessed by sensory evaluation, considering qualities such as color, aroma, flavor, appearance, and chewability (tissue brittleness).
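The two-term thin-layer drying model selected here has the closed form MR(t) = a·exp(-k0·t) + b·exp(-k1·t). A sketch of evaluating it and scoring a fit with R² follows; the parameter values in the usage example are made up for illustration, not the paper's fitted coefficients.

```python
import math

def two_term(t, a, k0, b, k1):
    """Two-term thin-layer drying model: MR(t) = a*exp(-k0*t) + b*exp(-k1*t)."""
    return a * math.exp(-k0 * t) + b * math.exp(-k1 * t)

def r_squared(times, observed, params):
    """Coefficient of determination for a given parameter set."""
    predicted = [two_term(t, *params) for t in times]
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - ss_res / ss_tot
```

In practice the four parameters would be estimated per temperature with a nonlinear least-squares routine, and the model with the highest average R² retained, as the authors did.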

  1. A Hybrid Short-Term Power Load Forecasting Model Based on the Singular Spectrum Analysis and Autoregressive Model

    Directory of Open Access Journals (Sweden)

    Hongze Li


Short-term power load forecasting is one of the most important issues in the economic and reliable operation of an electric power system. Taking into account the randomness, tendency, and periodicity of short-term power load, a new method (the SSA-AR model), which combines univariate singular spectrum analysis with an autoregressive model, is proposed. Firstly, singular spectrum analysis (SSA) is employed to decompose and reconstruct the original power load series. Secondly, an autoregressive (AR) model is used to forecast from the reconstructed power load series. The data employed are the hourly power load series of the Mid-Atlantic region in the PJM electricity market. Empirical results show that, compared with the single autoregressive model (AR), the SSA-based linear recurrent method (SSA-LRF), and a backpropagation neural network (BPNN) model, the proposed SSA-AR method performs better in terms of short-term power load forecasting.
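The SSA-AR idea (embed, SVD-truncate, diagonally average, then fit and iterate an AR model) can be sketched as follows. The window length, rank, and least-squares AR fit are illustrative choices, not the paper's settings.

```python
import numpy as np

def ssa_reconstruct(series, window, rank):
    """Reconstruct a series from its leading SSA components.

    Embed -> SVD -> keep `rank` components -> diagonal averaging (Hankelization).
    """
    n = len(series)
    k = n - window + 1
    X = np.column_stack([series[i:i + window] for i in range(k)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]                   # rank-r approximation
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):                                             # diagonal averaging
        rec[j:j + window] += X_low[:, j]
        counts[j:j + window] += 1
    return rec / counts

def ar_forecast(series, order, steps):
    """Least-squares AR(order) fit, then iterative multi-step forecasting."""
    y = np.asarray(series, dtype=float)
    # column i holds y[t-1-i] for each target y[t], t = order .. len(y)-1
    A = np.column_stack([y[order - i - 1:len(y) - i - 1] for i in range(order)])
    coef, *_ = np.linalg.lstsq(A, y[order:], rcond=None)
    hist = list(y)
    for _ in range(steps):
        hist.append(float(np.dot(coef, hist[-1:-order - 1:-1])))
    return hist[-steps:]
```

Forecasting the SSA-reconstructed series rather than the raw load, i.e. `ar_forecast(ssa_reconstruct(load, window, rank), order, steps)`, is the combination the abstract describes.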

  2. Evaluation of Vitamin and Trace Element Requirements after Sleeve Gastrectomy at Long Term. (United States)

    Pellitero, Silvia; Martínez, Eva; Puig, Rocío; Leis, Alba; Zavala, Roxanna; Granada, María Luisa; Pastor, Cruz; Moreno, Pau; Tarascó, Jordi; Puig-Domingo, Manel


Nutritional deficiencies are common after bariatric surgery, but long-term data after sleeve gastrectomy (SG) are scarce. We performed a prospective nutritional status evaluation before and at 2 and 5 years after SG in morbidly obese patients receiving multivitamin and mineral supplementation at a Spanish university hospital. One hundred seventy-six patients (49.3 ± 9.1 years and 46.7 ± 7.4 kg/m²) were evaluated; 51 of them were followed for 5 years. Anthropometric, supplementation-compliance, and micronutrient evaluations were performed. Baseline concentrations were below normal values for 25(OH) vitamin D (73%), folic acid (16.5%), cobalamin (6.9%), pyridoxine (12%), thiamine (3.4%), and copper (0.5%). Anemia was found in 23%. In 49% of the subjects, at least one micronutrient deficiency was found at 2 years after SG. Vitamin D deficiency persisted at 2 and 5 years in more than 30% of patients. The frequencies of deficiencies for folic acid and vitamins B12, B6, and B1 decreased significantly after 2 years, with normalization at 5 years. Copper deficiency increased between 1 and 2 years and persisted at 5 years after SG. Vitamin supplementation compliance decreased progressively from the first year after surgery (94.8%, to 81% at 2 years and 53% at 5 years). Vitamin D deficiency is the most prevalent long-term nutritional deficiency after SG. About half of the patients show some micronutrient deficiency in the medium-to-long term, despite supplementation. Proactive follow-up is required to ensure personalized and adequate supplementation in all surgically treated obese patients, including those in whom SG has been performed.

  3. Long-term leaching from MSWI air-pollution-control residues: Leaching characterization and modeling

    DEFF Research Database (Denmark)

    Hyks, Jiri; Astrup, Thomas; Christensen, Thomas Højlund


Long-term leaching of Ca, Fe, Mg, K, Na, S, Al, As, Ba, Cd, Co, Cr, Cu, Hg, Mn, Ni, Pb, Zn, Mo, Sb, Si, Sn, Sr, Ti, V, P, Cl, and dissolved organic carbon from two different municipal solid waste incineration (MSWI) air-pollution-control residues was monitored during 24 months of column percolation experiments; liquid-to-solid (L/S) ratios of 200-250 L/kg, corresponding to more than 10,000 years in a conventional landfill, were reached. Less than 2% of the initially present As, Cu, Pb, Zn, Cr, and Sb leached during the course of the experiments. Concentrations of Cd, Fe, Mg, Hg, Mn, Ni, Co, Sn, Ti, and P were generally below 1 μg/L; overall, less than 1% of their mass leached. Column leaching data were further used in a two-step geochemical modeling in PHREEQC in order to (i) identify solubility-controlling minerals and (ii) evaluate their interactions in a water-percolated column

  4. Metabolic modelling to support long term strategic decisions on water supply systems (United States)

    Ciriello, Valentina; Felisa, Giada; Lauriola, Ilaria; Pomanti, Flavio; Di Federico, Vittorio


Water resources are essential for economic development and for the sustenance of civil, agricultural and industrial activities. Nevertheless, the availability of water resources is not uniformly distributed in space and time. Moreover, increasing water demand, mainly due to population growth and the expansion of agricultural crops, may cause increasing water-stress conditions if combined with the effects of climate change. Under these circumstances, it is necessary to improve the resilience of water supply systems both in terms of infrastructure and environmental compliance. Metabolic modelling approaches represent a flexible tool able to support long-term decision making based on sustainability criteria. These approaches mimic the water supply network through a set of material and energy fluxes that interact and influence each other. By analyzing these fluxes, a suite of key performance indicators is evaluated in order to identify which kinds of intervention may be applied to increase the sustainability of the system. Here, we adopt these concepts to analyze the water supply network of Reggio Emilia (Italy), which is supported by water withdrawals from both surface water and groundwater bodies. We analyze different scenarios, including a possible reduction of withdrawals from one of the sources as a consequence of decreased water availability under present and future conditions. On this basis, we identify preventive strategies for dynamic management of the water supply system.

  5. Long-term evaluation of oral gavage with periodontopathogens or ligature induction of experimental periodontal disease in mice. (United States)

    de Molon, Rafael Scaf; Mascarenhas, Vinicius Ibiapina; de Avila, Erica Dorigatti; Finoti, Livia Sertori; Toffoli, Gustavo Boze; Spolidorio, Denise Madalena Palomari; Scarel-Caminaga, Raquel Mantuaneli; Tetradis, Sotirios; Cirelli, Joni Augusto


To evaluate, over long-term periods, the destruction of periodontal tissues and the bacterial colonization induced by oral gavage with periodontopathogens or by ligature in experimental periodontal disease models. Forty-eight C57BL/6J mice were divided into four groups: group C, negative control; group L, ligature; group G-Pg, oral gavage with Porphyromonas gingivalis; and group G-PgFn, oral gavage with Porphyromonas gingivalis associated with Fusobacterium nucleatum. Mice were infected by oral gavage five times at 2-day intervals. After 45 and 60 days, animals were sacrificed and the immune-inflammatory response in the periodontal tissue was assessed by stereometric analysis. Alveolar bone loss was evaluated by live microcomputed tomography and histometric analysis. qPCR was used to confirm bacterial colonization in all groups. Data were analyzed using the Kruskal-Wallis, Wilcoxon, and ANOVA tests at a 5% significance level. The ligature model induced inflammation and bone resorption, characterized by an increased number of inflammatory cells and a decreased number of fibroblasts, followed by advanced alveolar bone loss at 45 and 60 days (p periodontal disease induction, independent of tissue alterations. These mouse models of periodontitis validate, complement, and enhance published PD models that utilize ligature or oral gavage, and support the importance of successful colonization of a susceptible host, bacterial invasion into vulnerable tissue, and host-bacterial interactions that lead to tissue destruction. The ligature model was an effective approach to induce inflammation and bone loss similar to human periodontitis, but the oral gavage models were not efficient in inducing periodontal inflammation and tissue destruction under the conditions studied.

  6. The Communication Model and the Nature of Change in Terms of Deforestation in China since 1949 (United States)

    Tian, Dexin; Chao, Chin-Chung


    This article explores the communication model and nature of change in terms of deforestation in China since 1949. Through Lasswell's communication model and the theory of change and via historical analysis and extended literature review, we have discovered: First, Mao's government adopted an effective one-way top-down communication model with…

  7. A facility for long term evaluation and quality assurance of LHCb Vertex Detector modules

    CERN Document Server

    Marinho, F; Dimattia, R; Doherty, F; Dumps, R; Gersabeck, M; Melone, J; Parkes, C; Saavedra, A; Tobin, M


This note describes the facility developed for long-term evaluation and quality assurance of the LHCb Vertex Detector modules, known as the 'Glasgow Burn-in System'. The facility was developed to ensure that the modules conform to stringent quality levels, and was able to uncover weaknesses introduced during the manufacturing and assembly of the components or during the transport of the modules to CERN. The system consisted of a high-resolution microscope for visual inspections and a burn-in system to operate cooled modules in vacuum. The main components of the burn-in system were a vacuum system, a cooling system and a DAQ system.

  8. Results of short-term corrosion evaluation tests at Raft River

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R.L.


    Four categories of short-term materials evaluation tests were conducted in geothermal fluid from Raft River Geothermal Experiment, Well No. 1, to obtain corrosion data relevant to the design of the Raft River Thermal Loop Facility. Test programs are described and the testing philosophies are discussed. All materials and configurations which were tested are identified and details of posttest visual examinations are presented. The materials are then assigned to appropriate performance categories on the basis of test behavior, and the possible service limitations are appraised.

  9. Development, description and validation of a Tritium Environmental Release Model (TERM). (United States)

    Jeffers, Rebecca S; Parker, Geoffrey T


    Tritium is a radioisotope of hydrogen that exists naturally in the environment and may also be released through anthropogenic activities. It bonds readily with hydrogen and oxygen atoms to form tritiated water, which then cycles through the hydrosphere. This paper seeks to model the migration of tritiated species throughout the environment - including atmospheric, river and coastal systems - more comprehensively and more consistently across release scenarios than is currently in the literature. A review of the features and underlying conceptual models of some existing tritium release models was conducted, and an underlying aggregated conceptual process model defined, which is presented. The new model, dubbed 'Tritium Environmental Release Model' (TERM), was then tested against multiple validation sets from literature, including experimental data and reference tests for tritium models. TERM has been shown to be capable of providing reasonable results which are broadly comparable with atmospheric HTO release models from the literature, spanning both continuous and discrete release conditions. TERM also performed well when compared with atmospheric data. TERM is believed to be a useful tool for examining discrete and continuous atmospheric releases or combinations thereof. TERM also includes further capabilities (e.g. river and coastal release scenarios) that may be applicable to certain scenarios that atmospheric models alone may not handle well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Dietetic Internship: Evaluation of an Integrated Model. (United States)

    Lordly, Daphne J.; Travers, Kim D.


    The purpose of this study was to utilize graduate and employer perceptions of outcomes of the Mount Saint Vincent University (MSVU) Co-operative Education (Co-op) Dietetics program to determine if an integrated model was an acceptable alternate method of dietetic education. Acceptable alternate was defined as: "facilitating achievement of entry level competence for dietetic practice". A self-administered, validated and piloted questionnaire was utilized to collect qualitative and quantitative information concerning employability, professional preparedness and program outcomes. Surveys were mailed to all program graduates (1989-1993) (n=24) and their first employers (n=19). Response rates were 96% and 89% respectively. Close-ended questions were analyzed quantitatively by determining frequency distributions. Data were also subjected to Chi-square to identify dependent factors. Qualitative responses to open-ended questions were analyzed by thematic content analysis. Results revealed all graduates were employed by six months after graduation. Competency development, a component of professional preparedness, was rated as average or above average by the majority of graduates and employers. Analysis of open-ended responses indicated that the introduction of experience while students were establishing theoretical foundations was perceived as beneficial. An integration of qualitative findings led to the development of a model depicting how professional competency development, readiness for practice, a realistic approach to dietetic practice and a high standard of practice were developed within an evolving personal and contextual framework. Socialization and mentoring opportunities, evaluation processes and the integration of theory and practice influenced professional development. In conclusion, both employer and graduate responses indicated overall program satisfaction suggesting that the Co-op program is an acceptable alternate method of dietetic education.

  11. Estimating Multivariate Exponential-Affine Term Structure Models from Coupon Bond Prices using Nonlinear Filtering

    DEFF Research Database (Denmark)

    Baadsgaard, Mikkel; Nielsen, Jan Nygaard; Madsen, Henrik


    , the central tendency and stochastic volatility. Emphasis is placed on the particular class of exponential-affine term structure models that permits solving the bond pricing PDE in terms of a system of ODEs. It is assumed that coupon bond prices are contaminated by additive white noise, where the stochastic...

  12. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial (United States)

    The model performance evaluation consists of metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude-and-sequence errors.
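
    The three error families named above can be sketched as simple goodness-of-fit functions. The specific choices below (relative bias for magnitude, one minus Pearson correlation for sequence, RMSE for both combined) are illustrative assumptions, not the metrics actually implemented in MPESA:

```python
import numpy as np

def magnitude_error(obs, sim):
    """Magnitude-only error: relative bias of the means (insensitive to ordering)."""
    return (np.mean(sim) - np.mean(obs)) / np.mean(obs)

def sequence_error(obs, sim):
    """Sequence-only error: 1 - Pearson correlation (insensitive to bias and scale)."""
    return 1.0 - np.corrcoef(obs, sim)[0, 1]

def combined_error(obs, sim):
    """Combined magnitude-and-sequence error: root-mean-square error."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((sim - obs) ** 2))
```

    A constant offset, for example, scores badly on the magnitude and combined measures but perfectly on the sequence measure, which is the kind of diagnostic separation the tutorial describes.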

  13. Evaluation of Some Mathematical Models for Estimating Evaluation ...

    African Journals Online (AJOL)

    The critical results of the application of these models showed that, under a variety of atmospheric conditions, the Penman-Brutsaert model gave the best description of the measured fluxes. The Priestley-Taylor model with α = 1.26 performed best under unstable atmospheric conditions, especially where radiation is the primary mechanism ...

  14. Supersymmetric models on magnetized orbifolds with flux-induced Fayet-Iliopoulos terms (United States)

    Abe, Hiroyuki; Kobayashi, Tatsuo; Sumita, Keigo; Tatsuta, Yoshiyuki


    We study supersymmetric (SUSY) models derived from the ten-dimensional SUSY Yang-Mills theory compactified on magnetized orbifolds, with nonvanishing Fayet-Iliopoulos (FI) terms induced by magnetic fluxes in extra dimensions. Allowing the presence of FI-terms relaxes a constraint on flux configurations in SUSY model building based on magnetized backgrounds. In this case, charged fields develop vacuum expectation values to cancel the FI-terms in the D-flat directions of fluxed gauge symmetries, which break the gauge symmetries and lead to a SUSY vacuum. Based on this idea, we propose a new class of SUSY magnetized orbifold models with three generations of quarks and leptons. In particular, we construct a model where the right-handed sneutrinos develop vacuum expectation values which restore the supersymmetry but yield lepton-number-violating terms below the compactification scale, and show their phenomenological consequences.

  15. Code-switched English pronunciation modeling for Swahili spoken term detection

    CSIR Research Space (South Africa)

    Kleynhans, N


    We investigate modeling strategies for English code-switched words as found in a Swahili spoken term detection system. Code switching, where speakers switch language in a conversation, occurs frequently in multilingual environments, and typically...

  16. Using Probabilistic Risk Assessment to Model Medication System Failures in Long-Term Care Facilities

    National Research Council Canada - National Science Library

    Comden, Sharon C; Marx, David; Murphy-Carley, Margaret; Hale, Misti


    Objective: State agencies and Oregon's long-term care providers cosponsored this developmental study to explore the creation of two statewide medication system risk models using sociotechnical probabilistic risk assessment (ST-PRA...

  17. Comfrey (Symphytum officinale L.) and Experimental Hepatic Carcinogenesis: A Short-Term Carcinogenesis Model Study

    Directory of Open Access Journals (Sweden)

    Maria Fernanda Pereira Lavieri Gomes


    Comfrey, or Symphytum officinale L. (Boraginaceae), is a very popular plant used for therapeutic purposes. Since the 1980s, its effects have been studied in long-term carcinogenesis studies, in which Comfrey extract is administered at high doses during several months and the neoplastic hepatic lesions are evaluated. However, the literature on this topic is very poor considering the studies performed under short-term carcinogenesis protocols, such as the ‘resistant hepatocyte model’ (RHM). In these studies, it is possible to easily observe the phenomena related to the early phases of tumor development, since pre-neoplastic lesions (PNLs) arise in about 1–2 months of chemical induction. Herein, the effects of chronic oral treatment of rats with 10% Comfrey ethanolic extract were evaluated in an RHM. Wistar rats were sequentially treated with N-nitrosodiethylamine (ip) and 2-acetylaminofluorene (po), and submitted to hepatectomy to induce carcinogenesis promotion. Macroscopic/microscopic quantitative analysis of PNLs was performed. Non-parametric statistical tests (Mann–Whitney and χ2) were used, and the level of significance was set at P ≤ 0.05. Comfrey treatment reduced the number of pre-neoplastic macroscopic lesions up to 1 mm (P ≤ 0.05), the percentage of oval cells (P = 0.0001) and mitotic figures (P = 0.007), as well as the number of Proliferating Cell Nuclear Antigen (PCNA)-positive cells (P = 0.0001) and acidophilic pre-neoplastic nodules (P = 0.05). On the other hand, the percentage of cells presenting megalocytosis (P = 0.0001) and vacuolar degeneration (P = 0.0001) was increased. Scores of fibrosis, glycogen stores and the number of nucleolus organizing regions were not altered. The study indicated that oral treatment of rats with 10% Comfrey alcoholic extract reduced cell proliferation in this model.

  18. Metabolic Profiling of Human Long-Term Liver Models and Hepatic Clearance Predictions from In Vitro Data Using Nonlinear Mixed-Effects Modeling. (United States)

    Kratochwil, Nicole A; Meille, Christophe; Fowler, Stephen; Klammers, Florian; Ekiciler, Aynur; Molitor, Birgit; Simon, Sandrine; Walter, Isabelle; McGinnis, Claudia; Walther, Johanna; Leonard, Brian; Triyatni, Miriam; Javanbakht, Hassan; Funk, Christoph; Schuler, Franz; Lavé, Thierry; Parrott, Neil J


    Early prediction of human clearance is often challenging, in particular for the growing number of low-clearance compounds. Long-term in vitro models have been developed which enable sophisticated hepatic drug disposition studies and improved clearance predictions. Here, the cell line HepG2, iPSC-derived hepatocytes (iCell®), the hepatic stem cell line HepaRG™, and human hepatocyte co-cultures (HμREL™ and HepatoPac®) were compared to primary hepatocyte suspension cultures with respect to their key metabolic activities. Similar metabolic activities were found for the long-term models HepaRG™, HμREL™, and HepatoPac® and the short-term suspension cultures when averaged across all 11 enzyme markers, although differences were seen in the activities of CYP2D6 and non-CYP enzymes. For iCell® and HepG2, the metabolic activity was more than tenfold lower. The micropatterned HepatoPac® model was further evaluated with respect to clearance prediction. To assess the in vitro parameters, pharmacokinetic modeling was applied. The determination of intrinsic clearance by nonlinear mixed-effects modeling in a long-term model significantly increased the confidence in the parameter estimation and extended the sensitive range towards 3% of liver blood flow, i.e., >10-fold lower as compared to suspension cultures. For in vitro to in vivo extrapolation, the well-stirred model was used. The micropatterned model gave rise to clearance prediction in man within a twofold error for the majority of low-clearance compounds. Further research is needed to understand whether transporter activity and drug metabolism by non-CYP enzymes, such as UGTs, SULTs, AO, and FMO, is comparable to the in vivo situation in these long-term culture models.
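
    The well-stirred model used here for in vitro to in vivo extrapolation has a standard closed form, CLh = Qh · fu · CLint / (Qh + fu · CLint). A minimal sketch, with units in mL/min/kg and a typical human hepatic blood flow of about 20.7 mL/min/kg taken as an illustrative default (not a value from this study):

```python
def well_stirred_clearance(cl_int, fu, q_h=20.7):
    """Well-stirred liver model: CLh = Qh * fu * CLint / (Qh + fu * CLint).

    cl_int: intrinsic clearance (mL/min/kg), fu: fraction unbound in blood,
    q_h: hepatic blood flow (mL/min/kg; 20.7 is an illustrative human value).
    """
    return q_h * fu * cl_int / (q_h + fu * cl_int)

# Flow-limited regime: very high intrinsic clearance approaches hepatic blood flow.
high = well_stirred_clearance(1e6, 1.0)
# Metabolism-limited regime: low intrinsic clearance approaches fu * CLint.
low = well_stirred_clearance(0.1, 0.5)
```

    In the low-clearance regime targeted by the long-term culture models, CLh ≈ fu · CLint, so extending the sensitive range of the in vitro intrinsic-clearance estimate translates directly into a better hepatic clearance prediction.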

  19. Model-based Type B uncertainty evaluations of measurement towards more objective evaluation strategies

    NARCIS (Netherlands)

    Boumans, M.


    This article proposes a more objective Type B evaluation. This can be achieved when Type B uncertainty evaluations are model-based. This implies, however, grey-box modelling and validation instead of the white-box modelling and validation that are appropriate for Type A evaluation.

  20. Semantic Modeling for Exposomics with Exploratory Evaluation in Clinical Context

    Directory of Open Access Journals (Sweden)

    Jung-wei Fan


    The exposome is a critical dimension in the precision medicine paradigm. Effective representation of exposomics knowledge is instrumental to melding nongenetic factors into data analytics for clinical research. There is still limited work in (1) modeling exposome entities and relations with proper integration to mainstream ontologies and (2) systematically studying their presence in clinical context. Through selected ontological relations, we developed a template-driven approach to identifying exposome concepts from the Unified Medical Language System (UMLS). The derived concepts were evaluated in terms of literature coverage and the ability to assist in annotating clinical text. The generated semantic model represents rich domain knowledge about exposure events (454 pairs of relations between exposure and outcome). Additionally, a list of 5667 disorder concepts with microbial etiology was created for inferred pathogen exposures. The model consistently covered about 90% of PubMed literature on exposure-induced iatrogenic diseases over 10 years (2001–2010). The model contributed to the efficiency of exposome annotation in clinical text by filtering out 78% of irrelevant machine annotations. Analysis of 50 annotated discharge summaries helped advance our understanding of the exposome information in clinical text. This pilot study demonstrated the feasibility of semiautomatically developing a useful semantic resource for exposomics.

  1. Evaluating standard airborne sound insulation measures in terms of annoyance, loudness, and audibility ratings. (United States)

    Park, H K; Bradley, J S


    This paper reports the results of an evaluation of the merits of standard airborne sound insulation measures with respect to subjective ratings of the annoyance and loudness of transmitted sounds. Subjects listened to speech and music sounds modified to represent transmission through 20 different walls with sound transmission class (STC) ratings from 34 to 58. A number of variations in the standard measures were also considered. These included variations in the 8-dB rule for the maximum allowed deficiency in the STC measure as well as variations in the standard 32-dB total allowed deficiency. Several spectrum adaptation terms were considered in combination with weighted sound reduction index (R(w)) values as well as modifications to the range of included frequencies in the standard rating contour. An STC measure without an 8-dB rule and an R(w) rating with a new spectrum adaptation term were better predictors of annoyance and loudness ratings of speech sounds. R(w) ratings with one of two modified C(tr) spectrum adaptation terms were better predictors of annoyance and loudness ratings of transmitted music sounds. Although some measures were much better predictors of responses to one type of sound than were the standard STC and R(w) values, no measure was remarkably improved for predicting annoyance and loudness ratings of both music and speech sounds.
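
    The STC fitting rules being varied here (a cap on any single band deficiency, here 8 dB, and a cap on the total deficiency, here 32 dB) can be sketched as a contour fit. The 16-band reference contour below (one-third-octave bands, 125-4000 Hz) follows the commonly published ASTM E413 shape and should be treated as an assumption in this sketch:

```python
# STC reference contour values relative to the 500 Hz band (assumed ASTM E413 shape),
# for the 16 one-third-octave bands from 125 Hz to 4000 Hz.
STC_CONTOUR = [-16, -13, -10, -7, -4, -1, 0, 1, 2, 3, 4, 4, 4, 4, 4, 4]

def stc_rating(tl, max_single=8, max_total=32):
    """Fit the reference contour to 16 measured transmission losses `tl` (dB).

    The rating is the 500 Hz value of the highest contour for which no band
    deficiency exceeds `max_single` and the summed deficiency stays within
    `max_total`. Setting max_single very large removes the 8-dB rule, one of
    the variations studied in the paper.
    """
    rating = None
    for r in range(150):  # scan candidate ratings upward
        deficiencies = [max(0, (r + c) - t) for c, t in zip(STC_CONTOUR, tl)]
        if max(deficiencies) <= max_single and sum(deficiencies) <= max_total:
            rating = r
        else:
            break  # deficiencies grow monotonically with r, so higher r also fails
    return rating
```

    A wall with a flat 50 dB transmission loss rates STC 50 under these rules; dropping the 8-dB rule only changes the result for walls with a deep single-band dip, which is exactly the modification the study evaluates against listener responses.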

  2. Clinical evaluation of a visible light-cured indirect composite for long-term provisionalization. (United States)

    Ewoldsen, Nels; Sundar, Veeraraghavan; Bennett, William; Kanya, Kevin; Magyar, Karl


    To clinically evaluate a visible light-cured (VLC) resin composite system for long-term provisional and esthetic diagnostic restorations, fabricated using indirect techniques. One hundred and nine teeth were restored in 31 patients. Preoperative impressions were used to create VLC resin composite restorations (Radica) using indirect techniques. Restorations were relined as necessary and placed using various provisional cements at a follow-up appointment, subsequent to preparation of the teeth. Both fabricating laboratory technicians and placing dentists rated the restorations for acceptability in esthetics, marginal fit, occlusion, and functionality at various stages of provisionalization. All restorations (100%) were rated acceptable for esthetics prior to relining. After relining, a majority (93-100%) of restorations were rated acceptable in esthetic and functional criteria. At the placement of the permanent restoration, a majority (96-100%) of restorations were rated acceptable in esthetic and functional criteria. Terms of service ranged from two to seventy-six days. In combination with in vitro results, the clinical performance of the Radica VLC system for provisionalization and esthetic diagnostic restorations was judged to be acceptable. The system offers esthetics superior to conventional provisional restorations, and should be a valuable option for practitioners considering longer-term provisionalization in complex cases.

  3. [Short term evaluation of a new treatment method for primary snoring: radiofrequency energy]. (United States)

    Attal, P; Popot, B; Le Pajolec, C; Alfandarry, D; Maruani, N; Ageel, M; Escourrou, P; Bobin, S


    The usual treatments for primary snoring are uvulopalatopharyngoplasty (UPPP) and laser-assisted uvulopalatoplasty; their morbidity is well known, particularly the considerable postoperative pain. We performed a prospective study on a population of 23 patients with primary snoring (respiratory disturbance index = 6 +/- 1) to 1) evaluate, by means of questionnaires, the short-term (2-month) efficacy of the application of radiofrequency energy at the level of the soft palate, and 2) assess the morbidity associated with this treatment in this population. In the group of patients receiving treatment at three sites in a maximum of three sessions, the short-term satisfaction rate was 75%. The postoperative pain was of minor importance and the use of major analgesics was necessary in only 2 cases. We conclude that treatment with radiofrequency energy seems to be effective for primary snoring in the short term. The morbidity, especially the pain, seems to be distinctly lower than with UPPP or laser treatment. Further prospective studies on a larger number of patients and with a longer follow-up are necessary to confirm these results.

  4. Evaluation of the long-term performance of six alternative disposal methods for LLRW

    Energy Technology Data Exchange (ETDEWEB)

    Kossik, R.; Sharp, G. [Golder Associates, Inc., Redmond, WA (United States); Chau, T. [Rogers & Associates Engineering Corp., Salt Lake City, UT (United States)


    The State of New York has carried out a comparison of six alternative disposal methods for low-level radioactive waste (LLRW). An important part of these evaluations involved quantitatively analyzing the long-term (10,000 yr) performance of the methods with respect to dose to humans, radionuclide concentrations in the environment, and cumulative release from the facility. Four near-surface methods (covered above-grade vault, uncovered above-grade vault, below-grade vault, augered holes) and two mine methods (vertical shaft mine and drift mine) were evaluated. Each method was analyzed for several generic site conditions applicable for the state. The evaluations were carried out using RIP (Repository Integration Program), an integrated, total system performance assessment computer code which has been applied to radioactive waste disposal facilities both in the U.S. (Yucca Mountain, WIPP) and worldwide. The evaluations indicate that mines in intact low-permeability rock and near-surface facilities with engineered covers generally have a high potential to perform well (within regulatory limits). Uncovered above-grade vaults and mines in highly fractured crystalline rock, however, have a high potential to perform poorly, exceeding regulatory limits.

  5. Improving short-term grade block models: alternative for correcting soft data

    Directory of Open Access Journals (Sweden)

    Cristina da Paixão Araújo

    Short-term mining planning typically relies on samples obtained from channels or less-accurate sampling methods. The results may include larger sampling errors than those derived from diamond drill hole core samples. The aim of this paper is to evaluate the impact of the sampling error on grade estimation and propose a method of correcting the imprecision and bias in the soft data. In addition, this paper evaluates the benefits of using soft data in mining planning. These concepts are illustrated via a gold mine case study, where two different data types are presented. The study used Au grades collected via diamond drilling (hard data) and channels (soft data). Four methodologies were considered for estimation of the Au grades of each block to be mined: ordinary kriging with hard and soft data pooled without considering differences in data quality; ordinary kriging with only hard data; standardized ordinary kriging with pooled hard and soft data; and standardized ordinary cokriging. The results show that even biased samples collected using poor sampling protocols improve the estimates more than a limited number of precise and unbiased samples. A well-designed estimation method corrects the biases embedded in the samples, mitigating their propagation to the block model.
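
    Ordinary kriging, the core estimator behind the four methodologies compared, solves a small linear system whose weights are constrained to sum to one. A minimal sketch with an assumed isotropic exponential covariance model (the range parameter is illustrative, not from the case study):

```python
import numpy as np

def ordinary_kriging(coords, values, target, cov=lambda h: np.exp(-h / 50.0)):
    """Ordinary kriging estimate at `target` from samples at `coords`.

    cov: covariance as a function of separation distance (assumed exponential
    model with an illustrative 50 m range; no nugget).
    """
    coords = np.asarray(coords, float)
    n = len(coords)
    # Kriging system: data-to-data covariances, bordered by a row/column of
    # ones and a zero corner for the Lagrange multiplier enforcing sum(w) = 1.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = cov(d)
    K[n, n] = 0.0
    # Right-hand side: data-to-target covariances, plus the unbiasedness constraint.
    rhs = np.ones(n + 1)
    rhs[:n] = cov(np.linalg.norm(coords - np.asarray(target, float), axis=1))
    w = np.linalg.solve(K, rhs)[:n]
    return float(w @ np.asarray(values, float))
```

    With no nugget the estimator is an exact interpolator at the data locations; the paper's standardized variants rescale the hard and soft data before building such systems, and cokriging extends the same system with cross-covariances between the two data types.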

  6. An Evaluation Methodology for Longitudinal Studies of Short-Term Cancer Research Training Programs. (United States)

    Padilla, Luz A; Venkatesh, Raam; Daniel, Casey L; Desmond, Renee A; Brooks, C Michael; Waterbor, John W


    The need to familiarize medical students and graduate health professional students with research training opportunities that cultivate the appeal of research careers is vital to the future of research. Comprehensive evaluation of a cancer research training program can be achieved through longitudinal tracking of program alumni to assess the program's impact on each participant's career path and professional achievements. With advances in technology and smarter means of communication, effective ways to track alumni have changed. In order to collect data on the career outcomes and achievements of nearly 500 short-term cancer research training program alumni from 1999-2013, we sought to contact each alumnus to request completion of a survey instrument online, or by means of a telephone interview. The effectiveness of each contact method that we used was quantified according to ease of use and time required. The most reliable source of contact information for tracking alumni from the early years of the program was previous tracking results, and for alumni from the later years, the most important source of contact information was university alumni records that provided email addresses and telephone numbers. Personal contacts with former preceptors were sometimes helpful, as were generic search engines and people search engines. Social networking was of little value for most searches. Using information from two or more sources in combination was most effective in tracking alumni. These results provide insights and tools for other research training programs that wish to track their alumni for long-term program evaluation.

  7. Evaluating the relationship between breakfast pattern and short-term memory in junior high school girls. (United States)

    Ahmadi, A; Sohrabi, Z; Eftekhari, M H


    The aim of this study was to evaluate the relationship between breakfast pattern and short-term memory in guidance-school students. Memory improves for subjects who have eaten breakfast. It appears that breakfast consumption influences cognition via several mechanisms. What children eat for breakfast before going to school is very important. A total of 150 junior high school girls were taken from a subject pool in four schools in Shiraz (capital of the Fars Province in Iran). They filled out socio-economic questionnaires as well as food-frequency questionnaires for breakfast, provided two- to three-day breakfast records in two different seasons, and their short-term memory was evaluated by the Wechsler test; socio-economic conditions and dietary intakes were analyzed. The results of the study showed that there was no correlation between parents' jobs, students' mean age, or their school grades and their memory scores. Dietary analysis demonstrated a negative correlation between local soup consumption at breakfast and memory scores. Food record analysis showed no correlation between fat, cholesterol, protein, vitamin B6, B12, calorie, and iodine intake at breakfast and memory scores, but there was a positive correlation between carbohydrate, iron, and vitamin B3 intake at breakfast and memory scores; similarly, there was a positive correlation between B12 intake at breakfast and students' average school grades during the year.

  8. Evaluating the long-term effectiveness of the Maternity Emergency Care course in remote Australia. (United States)

    Belton, Suzanne; Campbell, Marcel; Foxley, Sally; Hamerton, Bev; Gladman, Justin; McGrath, Sally; Piller, Neil; Saunders, Nathan; Vaughan, Fran


    The Council for Remote Area Nurses of Australia delivers the Maternity Emergency Care (MEC) course, the only short course on maternity emergencies offered to remote area nurses and Aboriginal Health Workers without midwifery qualifications. The aim of the course is to improve the maternity emergency skills and knowledge of health service providers who do not have midwifery qualifications. There has been no long-term evaluation of the course since its inception. To review the longer-term effectiveness of the MEC course, which was developed in consultation with the Australian College of Midwives (ACM) and rural and remote practitioners in 2003. Fifty-seven clinicians who completed the MEC course since 2003 responded to a survey. Seven remote area health managers and two course facilitators were interviewed. This study provides an evaluation of the experiences of non-midwives who manage maternity emergencies in the rural and remote setting: their perception of the skills, knowledge and confidence acquired through participation in the MEC program. The MEC course is valued by both remote health managers and practitioners. The learning activities, skills and knowledge gained are reported to be very beneficial and used by remote health practitioners. © 2009 Published by Elsevier Ltd. All rights reserved.

  9. visCOS: An R-package to evaluate model performance of hydrological models (United States)

    Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten


    The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks. (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be it strengths or deficiencies. Visualizations are usually devised for specific projects and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R-package for evaluating model performance by visualizing and exploring different aspects of hydrological time-series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages, such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be implemented in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean the data, process, visualize, explore and finally save the results in a consistent way. Together with an interactive zoom function of the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods like hydrological years or seasonal sections. Many hydrologists use long-term water-balances as a
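
    The Nash-Sutcliffe and Kling-Gupta efficiencies that visCOS evaluates and visualizes have standard definitions. A minimal sketch, assuming the 2009 KGE form with equal component weights:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is perfect; 0 means no better than mean(obs)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta Efficiency (2009 form): combines correlation (r),
    variability ratio (alpha) and bias ratio (beta); 1 is perfect."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```

    Evaluating such functions over sliding or seasonal sub-periods, as the package does for hydrological years, turns a single summary score into the kind of diagnostic time series the authors argue objective functions alone cannot provide.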

  10. The Analytical Repository Source-Term (AREST) model: Description and documentation

    Energy Technology Data Exchange (ETDEWEB)

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.


    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs.
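
    The structure described, combining containment-failure times from a WPC-style model with post-failure release rates from a WPR-style model inside a probabilistic framework, can be illustrated with a toy Monte Carlo. The distribution, mean lifetime, and release rate below are hypothetical placeholders, not AREST parameters:

```python
import random

def mean_total_release(n_trials=10000, horizon=10000.0, release_rate=1e-6, seed=1):
    """Toy Monte Carlo over waste packages: sample a containment failure time
    (WPC analogue, hypothetical exponential lifetime of 5000 yr), then accrue a
    constant fractional release rate (WPR analogue) for the years between
    failure and the end of the assessment horizon."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        failure_time = rng.expovariate(1.0 / 5000.0)  # years until containment fails
        exposed = max(0.0, horizon - failure_time)    # years of post-failure release
        total += release_rate * exposed               # fraction of inventory released
    return total / n_trials
```

    The real code replaces both placeholder sub-models with the mechanistic corrosion and radionuclide-release process models described above, but the Monte Carlo composition of failure times with release histories is the same organizing idea.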

  11. Evaluation on Long-term Cooling of CANDU after Sump Blockage using MARS-KS

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Seon Oh; Cho, Yong Jin [KINS, Daejeon (Korea, Republic of); Kim, Sung Joong [Hanyang University, Seoul (Korea, Republic of)


    In a real incident, part of the fibrous insulation debris stripped by a steam jet was transported to the pool and clogged the intake strainers of the drywell spray system, revealing a weakness in the defense-in-depth concept which under other circumstances could have led to the ECCS failing to provide coolant to the core. Since this Barseback-2 incident in 1992, many international activities have been carried out to identify essential parameters and physical phenomena and to promote consensus on the technical issues important for safety and on possible paths for their resolution. In an operating nuclear power plant, if an unplanned reactor trip or a power reduction occurs, operators are required to maintain the reactor in a stable state according to the emergency operating procedure (EOP) and to take diagnostic and appropriate mitigation actions if necessary. According to the EOP of Wolsong unit 1 (the first Korean PHWR NPP) under LOCA, intact or broken loops are diagnosed using the available plant information, such as the pressure and temperature of outlet headers. For the intact loop, effective long-term cooling is envisioned through the operation of the shutdown cooling system as implemented in the EOP. In this work, the adequacy of long-term cooling during the recirculation phase of a LOCA was evaluated under the postulated condition of a reduced flow path at the recirculation sump due to the inflow of a substantial amount of debris released by the high-energy break flow. For the intact loop, although the incipience of boiling in the fuel channel was evaluated to occur, effective long-term cooling can be achieved through the shutdown cooling system as guided in the EOP.

  12. Interest Rates with Long Memory: A Generalized Affine Term-Structure Model

    DEFF Research Database (Denmark)

    Osterrieder, Daniela

    We propose a model for the term structure of interest rates that is a generalization of the discrete-time, Gaussian, affine yield-curve model. Compared to standard affine models, our model allows for general linear dynamics in the vector of state variables. In an application to real yields of U...... by a level, a slope, and a curvature factor that arise naturally from the co-fractional modeling framework. We show that implied yields match the level and the variability of yields well over time. Studying the out-of-sample forecasting accuracy of our model, we find that our model results in good yield...
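    The co-fractional framework referenced above builds on the fractional difference operator (1 - L)**d, which is what gives interest rates their long memory. As an illustrative sketch of the mechanics only, not of Osterrieder's actual model, the filter weights follow a simple recursion, and d = 1 recovers ordinary first differences:

```python
def frac_diff_weights(d, n):
    """Weights of the binomial expansion of (1 - L)**d, truncated at lag n."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (k - 1 - d) / k)  # standard recursion for the binomial coefficients
    return w

def frac_diff(series, d):
    """Apply the truncated fractional-difference filter to a series."""
    w = frac_diff_weights(d, len(series) - 1)
    return [sum(w[k] * series[t - k] for k in range(t + 1)) for t in range(len(series))]
```

    For 0 < d < 1 the weights decay hyperbolically rather than cutting off, which produces the slowly decaying autocorrelations that standard short-memory affine models cannot capture.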

  13. Modelling bidirectional modulations in synaptic plasticity: A biochemical pathway model to understand the emergence of long term potentiation (LTP) and long term depression (LTD). (United States)

    He, Yao; Kulasiri, Don; Samarasinghe, Sandhya


    Synaptic plasticity induces bidirectional modulations of the postsynaptic response following a synaptic transmission. The long-term forms of synaptic plasticity, termed long-term potentiation (LTP) and long-term depression (LTD), are critical for the antithetic functions of the memory system, memory formation and removal, respectively. A common upstream Ca(2+) signal triggers both LTP and LTD, and the critical proteins and factors coordinating LTP/LTD induction are not well understood. We develop an integrated model, based on sub-models of the synaptic proteins indispensable to the emergence of synaptic plasticity, to validate and understand their potential roles in its expression. The model explains the Ca(2+)/calmodulin (CaM) complex-dependent coordination of LTP/LTD expression by the interactions among the indispensable proteins, using experimentally estimated kinetic parameters. Analysis of the integrated model provides insights into the effective timescales of the key proteins, and we conclude that the CaM pool size is critical for the coordination between LTP and LTD expression. Copyright © 2016 Elsevier Ltd. All rights reserved.
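    The coordination described here, one Ca(2+)/CaM signal gating both LTP and LTD, is often summarized by the calcium-control hypothesis: moderate Ca(2+) elevations favour depression, large ones favour potentiation. The sketch below is a deliberately simplified heuristic with illustrative thresholds and an assumed Hill-type CaM binding curve; it is not the paper's kinetic pathway model:

```python
def cam_bound_fraction(ca, kd=1.0, hill=4):
    """Hill-type fraction of calmodulin with all four Ca2+ sites occupied.
    The dissociation constant kd (in uM) is illustrative, not a fitted value."""
    return ca**hill / (kd**hill + ca**hill)

def plasticity_direction(ca, theta_d=0.35, theta_p=0.55):
    """Calcium-control heuristic: sub-threshold Ca2+ does nothing, moderate
    Ca2+ favours LTD, and high Ca2+ favours LTP. Thresholds are illustrative."""
    if ca < theta_d:
        return "none"
    if ca < theta_p:
        return "LTD"
    return "LTP"
```

    The full model replaces these fixed thresholds with the kinetics of the downstream proteins, but the thresholded picture is a useful mental model of the bidirectional switch.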

  14. Longitudinal evaluation, acceptability and long-term retention of knowledge on a horizontally integrated organic and functional systems course. (United States)

    Palha, Joana Almeida; Almeida, Armando; Correia-Pinto, Jorge; Costa, Manuel João; Ferreira, Maria Amélia; Sousa, Nuno


    Undergraduate medical education is moving from traditional disciplinary basic science courses into more integrated curricula. Integration models based on organ systems originated in the 1950s, but few longitudinal studies have evaluated their effectiveness. This article outlines the development and implementation of the Organic and Functional Systems (OFS) courses at the University of Minho in Portugal, using evidence collected over 10 years. It describes the organization of content, student academic performance and acceptability of the courses, the evaluation of preparedness for future courses and the retention of knowledge on basic sciences. Students consistently rated the OFS courses highly. Physician tutors in subsequent clinical attachments considered that students were appropriately prepared. Performance in the International Foundations of Medicine examination of a self-selected sample of students revealed similar performances in basic science items after the last OFS course and 4 years later, at the moment of graduation. In conclusion, the organizational and pedagogical approaches of the OFS courses achieve high acceptability by students and result in positive outcomes in terms of preparedness for subsequent training and long-term retention of basic science knowledge.

  15. Educational Program Evaluation Model, From the Perspective of the New Theories

    Directory of Open Access Journals (Sweden)

    Soleiman Ahmady


    Introduction: This study focuses on the common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of keywords, MeSH terms and other free-text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline via the PubMed interface, ERIC (Education Resources Information Center) and the main journals of medical education for current evaluation models and theories. We included all study designs. We found 810 articles related to our topic; 63 with full text available were finally included. We compared documents and used expert consensus to select the best model. Results: We found that complexity theory, using the logic model, suggests compatible evaluation proposal formats, especially for new medical education programs. The common components of a logic model are situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.

  16. Postsynaptic signal transduction models for long-term potentiation and depression

    Directory of Open Access Journals (Sweden)

    Tiina Manninen


    More than a hundred biochemical species, activated by neurotransmitters binding to transmembrane receptors, are important in long-term potentiation and depression. To investigate which species and interactions are critical for synaptic plasticity, many computational postsynaptic signal transduction models have been developed. The models range from simple models with a single reversible reaction to detailed models with several hundred kinetic reactions. In this study, more than a hundred models are reviewed, and their features are compared and contrasted so that similarities and differences are more readily apparent. The models are classified according to the type of synaptic plasticity that is modeled (long-term potentiation or long-term depression) and whether they include diffusion or electrophysiological phenomena. Other characteristics that discriminate the models include the phase of synaptic plasticity modeled (induction, expression, or maintenance) and the simulation method used (deterministic or stochastic). We find that models are becoming increasingly sophisticated, by including stochastic properties, integrating with electrophysiological properties of entire neurons, or incorporating diffusion of signaling molecules. Simpler models continue to be developed because they are computationally efficient and allow theoretical analysis. The more complex models permit investigation of mechanisms underlying specific properties and experimental verification of model predictions. Nonetheless, it is difficult to fully comprehend the evolution of these models because (1) several models are not described in detail in the publications, (2) only a few models are provided in existing model databases, and (3) comparison to previous models is lacking. We conclude that the value of these models for understanding molecular mechanisms of synaptic plasticity is increasing and will be enhanced further with more complete descriptions and sharing of the

  17. Global daily reference evapotranspiration modeling and evaluation (United States)

    Senay, G.B.; Verdin, J.P.; Lietzow, R.; Melesse, Assefa M.


    Accurate and reliable evapotranspiration (ET) datasets are crucial in regional water and energy balance studies. Due to the complex instrumentation requirements, actual ET values are generally estimated from reference ET values by adjustment factors using coefficients for water stress and vegetation conditions, commonly referred to as crop coefficients. Until recently, the modeling of reference ET has been solely based on important weather variables collected from weather stations that are generally located in selected agro-climatic locations. Since 2001, the National Oceanic and Atmospheric Administration’s Global Data Assimilation System (GDAS) has been producing six-hourly climate parameter datasets that are used to calculate daily reference ET for the whole globe at 1-degree spatial resolution. The U.S. Geological Survey Center for Earth Resources Observation and Science has been producing daily reference ET (ETo) since 2001, and it has been used on a variety of operational hydrological models for drought and streamflow monitoring all over the world. With the increasing availability of local station-based reference ET estimates, we evaluated the GDAS-based reference ET estimates using data from the California Irrigation Management Information System (CIMIS). Daily CIMIS reference ET estimates from 85 stations were compared with GDAS-based reference ET at different spatial and temporal scales using five-year daily data from 2002 through 2006. Despite the large difference in spatial scale (point vs. ∼100 km grid cell) between the two datasets, the correlations between station-based ET and GDAS-ET were very high, exceeding 0.97 on a daily basis to more than 0.99 on time scales of more than 10 days. Both the temporal and spatial correspondences in trend/pattern and magnitudes between the two datasets were satisfactory, suggesting the reliability of using GDAS parameter-based reference ET for regional water and energy balance studies in many parts of the world
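    The rise in correlation with aggregation period reported above (0.97 daily, above 0.99 beyond 10 days) is what one expects when station and grid estimates share a seasonal signal but carry independent daily noise. A small synthetic sketch, with invented numbers rather than CIMIS or GDAS data, reproduces the effect:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def aggregate(series, window):
    """Sum a daily series into consecutive non-overlapping windows."""
    return [sum(series[i:i + window]) for i in range(0, len(series) - window + 1, window)]

random.seed(1)
# Seasonal reference-ET-like signal (mm/day) plus independent daily noise.
station = [4.0 + 2.0 * math.sin(2 * math.pi * t / 365) + random.gauss(0, 0.6) for t in range(365)]
grid = [s + random.gauss(0, 0.6) for s in station]  # grid cell tracks the station imperfectly
r_daily = pearson(station, grid)
r_10day = pearson(aggregate(station, 10), aggregate(grid, 10))
```

    Aggregation averages out the independent daily noise while the shared seasonal cycle survives, so the 10-day correlation exceeds the daily one, mirroring the CIMIS-GDAS comparison.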

  18. 76 FR 68769 - Bridging the Idea Development Evaluation Assessment and Long-Term Initiative and Total Product... (United States)


    ... No. FDA-2011-N-0002] Bridging the Idea Development Evaluation Assessment and Long-Term Initiative and... various stakeholders to further refine and advance the Idea Development Evaluation Assessment and Long... and evaluation for surgical devices and procedures. Date and Time: The meeting will be held on...

  19. Inventory and source term evaluation of Russian nuclear power plants for marine applications

    Energy Technology Data Exchange (ETDEWEB)

    Reistad, O. [Norwegian Radiation Protection Authority (Norway); Oelgaard, P.L. [Risoe National Lab. (Denmark)


    This report discusses inventory and source term properties with regard to operation of, and possible releases due to accidents at, Russian marine reactor systems. The first part of the report discusses relevant accidents on the basis of both Russian and western sources. The overview shows that certain vessels were much more accident-prone than others; in addition, there has been a noteworthy reduction in accidents over the last two decades. However, during recent years new types of incidents, such as collisions, have occurred more frequently. The second part of the study considers in detail the most important factors for the source term: reactor operational characteristics and the radionuclide inventory. While Russian icebreakers have been operated on a similar basis to commercial power plants, the submarines have different power cyclograms, which result in considerably lower values for fission product inventory. Theoretical values for radionuclide inventory are compared with results computed using the modelling tool HELIOS. Regarding the inventory of transuranic elements, the results of the calculations are discussed in detail for selected vessels. Criticality accidents, loss-of-cooling accidents and sinking accidents are considered, based on actual experience with these types of accident and on theoretical considerations, and source terms for these accidents are discussed in the last chapter. (au)

  20. Presenting an Evaluation Model for the Cancer Registry Software. (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh


    As cancer incidence is increasingly growing, cancer registries are of great importance as the core of cancer control programs, and many different software packages have been designed for this purpose. Establishing a comprehensive evaluation model is therefore essential for evaluating and comparing a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the documents and two functional software products in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises both a tool and a method of evaluation. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. The evaluation method was chosen, based on the findings, as a criteria-based evaluation. The model encompasses the various dimensions of cancer registry software and a proper method for evaluating them. The strong point of this evaluation model is the separation between general and specific criteria while preserving the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  1. Exploring the predictive power of interaction terms in a sophisticated risk equalization model using regression trees. (United States)

    van Veen, S H C M; van Kleef, R C; van de Ven, W P M M; van Vliet, R C J A


    This study explores the predictive power of interaction terms between the risk adjusters in the Dutch risk equalization (RE) model of 2014. Due to the sophistication of this RE-model and the complexity of the associations in the dataset (N = ~16.7 million), there are theoretically more than a million interaction terms. We used regression tree modelling, which has been applied rarely within the field of RE, to identify interaction terms that statistically significantly explain variation in observed expenses that is not already explained by the risk adjusters in this RE-model. The interaction terms identified were used as additional risk adjusters in the RE-model. We found evidence that interaction terms can improve the prediction of expenses overall and for specific groups in the population. However, the prediction of expenses for some other selective groups may deteriorate. Thus, interactions can reduce financial incentives for risk selection for some groups but may increase them for others. Furthermore, because regression trees are not robust, additional criteria are needed to decide which interaction terms should be used in practice. These criteria could be the right incentive structure for risk selection and efficiency or the opinion of medical experts. Copyright © 2017 John Wiley & Sons, Ltd.
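    The core mechanic, using a regression tree to find splits that explain residual variation left unexplained by the risk adjusters, can be sketched with an exhaustive single-split search. The data below are synthetic stand-ins (a toy age x chronic-condition interaction), not the Dutch RE dataset:

```python
def sse(xs):
    """Sum of squared deviations from the mean."""
    if not xs:
        return 0.0
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def best_split(records, residuals, features):
    """CART-style greedy search: the binary split on one feature that most
    reduces the residual sum of squares."""
    base = sse(residuals)
    best = (None, None, 0.0)
    for f in features:
        for v in sorted({r[f] for r in records})[:-1]:
            left = [res for r, res in zip(records, residuals) if r[f] <= v]
            right = [res for r, res in zip(records, residuals) if r[f] > v]
            gain = base - sse(left) - sse(right)
            if gain > best[2]:
                best = (f, v, gain)
    return best

# Synthetic residuals from an additive model that misses an age*chronic interaction:
# the (age=1, chronic=1) cell is systematically under-predicted.
records = [{"age": a, "chronic": c} for a in (0, 1) for c in (0, 1) for _ in range(25)]
residuals = [45.0 if r["age"] == 1 and r["chronic"] == 1 else -15.0 for r in records]
feature, threshold, gain = best_split(records, residuals, ["age", "chronic"])
```

    Recursing on each half would isolate the age=1, chronic=1 cell; the surviving split combinations then suggest candidate interaction terms to add as risk adjusters, which is the role the tree plays in the study.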

  2. A simple rainfall-runoff model for the single and long term hydrological performance of green roofs

    DEFF Research Database (Denmark)

    Locatelli, Luca; Mark, Ole; Mikkelsen, Peter Steen

    Green roofs are being widely implemented for storm water control and runoff reduction. There is a need to incorporate green roofs into urban drainage models in order to evaluate their impact. These models must have low computational costs and fine time resolution. This paper aims to develop...... a model of green roof hydrological performance. A simple conceptual model for the long-term and single-event hydrological performance of green roofs is shown to be capable of reproducing observed runoff measurements. The model has surface and subsurface storage components representing the overall retention...... capacity of the green roof. The runoff from the system is described by the non-linear reservoir method and the storage capacity of the green roof is continuously re-established by evapotranspiration. Runoff data from a green roof in Denmark are collected and used for parameter calibration....
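    A minimal version of such a conceptual model, with a finite storage filled by rain, restored by evapotranspiration, and drained through a non-linear reservoir (q = k*h**n), can be sketched as follows. All parameter values are illustrative, not the calibrated Danish ones:

```python
def green_roof_runoff(rain, et, s_max=20.0, k=0.15, n=1.5):
    """Conceptual green-roof model: storage h (mm) is filled by rainfall p,
    depleted by evapotranspiration e, and drained by a non-linear reservoir.
    Parameters are illustrative, not calibrated values."""
    h = 0.0
    runoff = []
    for p, e in zip(rain, et):
        h = min(s_max, max(0.0, h + p - e))  # water balance, capped at storage capacity
        q = min(k * h ** n, h)               # non-linear reservoir outflow, capped by storage
        h -= q
        runoff.append(q)
    return runoff

# A 5-day storm (10 mm/day) with no evapotranspiration: some water is retained, the rest runs off.
q_storm = green_roof_runoff([10.0] * 5, [0.0] * 5)
```

    Between events, nonzero evapotranspiration drives h back toward zero, re-establishing the retention capacity, which is the mechanism the abstract describes.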

  3. Osteopathic evaluation of somatic dysfunction and craniosacral strain pattern among preterm and term newborns. (United States)

    Pizzolorusso, Gianfranco; Cerritelli, Francesco; D'Orazio, Marianna; Cozzolino, Vincenzo; Turi, Patrizia; Renzetti, Cinzia; Barlafante, Gina; D'Incecco, Carmine


    Palpatory skills are a central part of osteopathic manipulative treatment and palpatory diagnosis. The aim of osteopathic structural examination is to locate somatic dysfunction and cranial strain pattern, which are the hallmarks that form the basis for treatment decisions and strategy. In the osteopathic literature, there is a lack of studies evaluating preterm or term newborns during hospitalization. To determine the prevalence of somatic dysfunction and cranial strain pattern in a population of preterm and term newborns who were treated in a neonatal intensive care unit (NICU). During a period of 6 months--November 2009 through April 2010--the authors performed a retrospective review of data on consecutive preterm and term newborns who were admitted to the NICU of the Spirito Santo Public Hospital. Osteopathic evaluation was performed once on each newborn, and somatic dysfunction and cranial strain pattern were identified. Descriptive analysis and test of association based on the χ(2) test were performed. One hundred fifty-five preterm and term newborns met the study's eligibility criteria. The highest rate of somatic dysfunction was found in the pelvic area of 63 newborns (40.7%). The sacroiliac joints were compressed unilaterally or bilaterally in 82 newborns (52.9%); the lumbosacral junction was restricted in 61 newborns (39.4%), and intraosseous lesions of the sacral bone were diagnosed in 57 newborns (36.8%). The spine accounted for somatic dysfunction in 38 newborns (24.5%), with the middle thoracic and lower thoracic areas restricted in 29 (18.7%) and 21 (16.8%) newborns, respectively. Sphenobasilar synchondrosis compression and lateral-vertical strain were diagnosed in 57 newborns (36.8%), with the sagittal and the coronal sutures found restricted in 35 (22.6%) and 30 (19.4%) newborns, respectively. The occipital bone presented the highest rate of intraosseous lesions, with the left condyle compressed in 48 newborns (31%), the right condyle in 46

  4. An Evaluation Model of Digital Educational Resources

    Directory of Open Access Journals (Sweden)

    Abderrahim El Mhouti


    Today, the use of digital educational resources in teaching and learning is expanding considerably. Such expansion calls on educators and computer scientists to reflect more on the design of such products. This reflection exposes a number of criteria and recommendations that can guide and direct the design of any teaching tool, be it campus-based or online (e-learning). Our work is at the heart of this issue. In this article, we examine the academic, pedagogical, didactic and technical criteria used to evaluate the quality of digital educational resources. Our approach consists in addressing the specific and relevant factors of each evaluation criterion. We then explain the detailed structure of the evaluation instrument used: the “evaluation grid”. Finally, we present the evaluation outcomes based on the grid thus conceived and establish an analytical evaluation of the state of the art of digital educational resources.

  5. A proposed monitoring and evaluation curriculum based on a model that institutionalises monitoring and evaluation

    Directory of Open Access Journals (Sweden)

    Kambidima Wotela


    Background: African politicians, bureaucrats and technocrats have thrown their weight in support of monitoring and evaluation (M&E). This weight has compelled training institutions to add M&E to their offerings. Most often, at the end of these training programmes, attendees know what they have learnt but seem not to internalise it and, worse, they hardly ever put their newly acquired knowledge into practice. This allegation has led to what we term ‘monitoring and evaluation training hopping’, where participants move from one training to another hoping that they will eventually fully comprehend the skill and apply it to their work. This rarely happens, and as such participants often blame themselves, yet the problem lies with the training institutions, which teach the middle-third tier (how to monitor and evaluate) as well as the bottom-third tier (data and information management). However, the top-third tier that links M&E to ‘the what’ and ‘the how’ as well as ‘the why’ in the development intervention and public policy landscape is missing. Objectives: To propose an M&E curriculum that institutionalises M&E within the implementation and management of development interventions. Method: We use systems thinking to derive the key themes of our discussion and then apply summative thematic content analysis to interrogate M&E and related literature. Firstly, we present and describe a model that situates M&E within development and public policy. This model ‘idealises or realises’ an institutionalised M&E by systematically linking the contextual as well as key terms prominent in established descriptions of M&E. Secondly, we briefly describe M&E from a systems thinking approach by pointing out its components, processes, established facts, as well as issues and debates. Lastly, we use this model and the systems thinking description of M&E to propose an institutionalised M&E curriculum. Results: Our results show that for an explicit

  6. Program evaluation models and related theories: AMEE guide no. 67. (United States)

    Frye, Ann W; Hemmer, Paul A


    This Guide reviews theories of science that have influenced the development of common educational evaluation models. Educators can be more confident when choosing an appropriate evaluation model if they first consider the model's theoretical basis against their program's complexity and their own evaluation needs. Reductionism, system theory, and (most recently) complexity theory have inspired the development of models commonly applied in evaluation studies today. This Guide describes experimental and quasi-experimental models, Kirkpatrick's four-level model, the Logic Model, and the CIPP (Context/Input/Process/Product) model in the context of the theories that influenced their development and that limit or support their ability to do what educators need. The goal of this Guide is for educators to become more competent and confident in being able to design educational program evaluations that support intentional program improvement while adequately documenting or describing the changes and outcomes-intended and unintended-associated with their programs.

  7. Development of oil supply and demand planning model for mid- and long-term

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Hyun [Korea Energy Economics Institute, Euiwang (Korea)


    Despite the liberalization of the oil market, a systematic model is required for the reasonable supply and demand planning of oil, which still has an important influence on industry and the national economy. Required are a demand model that derives forecasts for each sector and product, and a supply model that examines the optimum rate of operation, the production mix of products, stocks, exports and imports, and the size of equipment investment needed to meet a given demand. As the first phase of developing the supply and demand model, this study reviewed existing oil and energy models at home and abroad and derived recommendations for establishing a Korean oil supply and demand model. Based on these, a principle for establishing the model and a rough framework were set up. Ahead of the mid- and long-term forecasts, a short-term forecasting model was established, and short-term forecasts for the first quarter of 1999 and for the year 1999 were presented on a trial basis. Given the size and characteristics of a supply model, a plan for an ideal model was first explained, and then a plan for creating the model step by step was presented as a realistic scheme. (author). 16 refs., 9 figs., 19 tabs.

  8. Evaluation of Short-Term Bioassays to Predict Functional Impairment. Selected Short-Term Cardiovascular Toxicity Tests. (United States)



  9. Evaluation of black carbon estimations in global aerosol models

    NARCIS (Netherlands)

    Koch, D.; Schulz, M.; McNaughton, C.; Spackman, J.R.; Balkanski, Y.; Bauer, S.; Krol, M.C.


    We evaluate black carbon (BC) model predictions from the AeroCom model intercomparison project by considering the diversity among year 2000 model simulations and comparing model predictions with available measurements. These model-measurement intercomparisons include BC surface and aircraft

  10. Creating long-term weather data from thin air for crop simulation modeling


    van Wart, Justin; Grassini, Patricio; Yang, Haishun; Claessens, Lieven; Jarvis, Andrew; Cassman, Kenneth G.


    Simulating crop yield and yield variability requires long-term, high-quality daily weather data, including solar radiation, maximum (Tmax) and minimum temperature (Tmin), and precipitation. In many regions, however, daily weather data of sufficient quality and duration are not available. To overcome this limitation, we evaluated a new method to create long-term weather series based on a few years of observed daily temperature data (hereafter called propagated data). The propagated data are co...

  11. Towards a benchmark simulation model for plant-wide control strategy performance evaluation of WWTPs

    DEFF Research Database (Denmark)

    Jeppsson, Ulf; Rosen, Christian; Alex, Jens


    worldwide, demonstrates the interest in such a tool within the research community. In this paper, an extension of the benchmark simulation model no 1 (BSM1) is proposed. This extension aims at facilitating control strategy development and performance evaluation at a plant-wide level and, consequently...... the changes, the evaluation period has been extended to one year. A prolonged evaluation period allows for long-term control strategies to be assessed and enables the use of control handles that cannot be evaluated in a realistic fashion in the one-week BSM1 evaluation period. In the paper, the extended plant...

  12. Statistical models of shape optimisation and evaluation

    CERN Document Server

    Davies, Rhodri; Taylor, Chris


    Deformable shape models have wide application in computer vision and biomedical image analysis. This book addresses a key issue in shape modelling: establishment of a meaningful correspondence between a set of shapes. Full implementation details are provided.

  13. The Relevance of the CIPP Evaluation Model for Educational Accountability. (United States)

    Stufflebeam, Daniel L.

    The CIPP Evaluation Model was originally developed to provide timely information in a systematic way for decision making, which is a proactive application of evaluation. This article examines whether the CIPP model also serves the retroactive purpose of providing information for accountability. Specifically, can the CIPP Model adequately assist…

  14. Rhode Island Model Evaluation & Support System: Teacher. Edition III (United States)

    Rhode Island Department of Education, 2015


    Rhode Island educators believe that implementing a fair, accurate, and meaningful educator evaluation and support system will help improve teaching and learning. The primary purpose of the Rhode Island Model Teacher Evaluation and Support System (Rhode Island Model) is to help all teachers improve. Through the Model, the goal is to help create a…

  15. Evaluation Model of the Entrepreneurial Character in EU Countries

    Directory of Open Access Journals (Sweden)

    Sebastian Madalin Munteanu


    The evidence of entrepreneurship development as a factor of sustainable growth at national and regional level frequently draws the interest of theorists and practitioners to identifying and outlining the best conditions and essential economic prerequisites for supporting entrepreneurial initiatives over the long term. In this context, the objective of the present research is to analyse and measure the entrepreneurial character of the European Union member countries in an integrated manner, by developing an innovative model for proposing specific action lines and objectively evaluating entrepreneurship development in the investigated states. Our model is based on a synthesis variable of the national entrepreneurial character, developed by sequential application of principal component analysis, while the initial variables come from secondary sources with good conceptual representativeness. Based on the objective relevance of the three model components (cultural, economic and administrative, and entrepreneurial education), the achieved results confirm the importance of a favourable cultural, economic and administrative background for entrepreneurship development, and they reiterate the inefficiency of isolated entrepreneurial education unless supported by a good entrepreneurial culture or adequate economic and administrative infrastructure. The case of Romania, in relation to the European Union member countries, is presented in detail.

  16. Radionuclide scintigraphy in the evaluation of gastroesophageal reflux in symptomatic and asymptomatic pre-term infants

    Energy Technology Data Exchange (ETDEWEB)

    Morigeri, C.; Mukhopadhyay, K.; Narang, A. [Postgraduate Institute of Medical Education and Research (PGIMER), Division of Neonatology, Department of Paediatrics, Chandigarh (India); Bhattacharya, A.; Mittal, B.R. [Postgraduate Institute of Medical Education and Research (PGIMER), Department of Nuclear Medicine, Chandigarh (India)


    Gastroesophageal reflux (GER) is very common in pre-term infants. The diagnosis based on symptoms alone is always questionable. The incidence of GER in symptomatic babies varies from 22% to 85%, but literature regarding the incidence of reflux in asymptomatic pre-term infants is lacking. We used radionuclide scintigraphy to evaluate the incidence of GER in symptomatic as well as asymptomatic pre-term neonates and to assess whether symptoms have any relation with positive scintigraphy. We studied 106 pre-term infants (52 symptomatic, 54 asymptomatic) of less than 34 weeks of gestation, who fulfilled the eligibility criteria. Babies were considered symptomatic in the presence of vomiting, regurgitation, apnea, de-saturations, unexplained bradycardia and recurrent lung collapses. Radionuclide scintigraphy was conducted at a post-conceptional age of 32-34 weeks, when the infants had been clinically stable for 72 h. Feeding was avoided for 2 h preceding the study. 99mTc sulphur colloid was administered in a dose of 1.85 MBq (0.05 mCi) in 1 ml, followed by milk (full feed) through an orogastric tube, prior to imaging under a gamma camera. Reflux was graded as low or high, and reflux episodes during the study were counted. The incidence of GER in the symptomatic group was 71.2% and in asymptomatic babies 61.1% (p=0.275). High-grade reflux was more common (71.4%) than low-grade (28.6%) in both groups (p=0.449). The mean number of reflux episodes in 20 min was 4.4±2.4 in symptomatic babies and 4.9±2.2 in asymptomatic babies (p=0.321). Babies with positive scintigraphy were similar in birth weight, gestation, time to achieve full feeds, and weight and age at discharge to those with negative scintigraphy. GER is common in pre-term infants of less than 34 weeks gestation. The incidence of positive scintigraphy and grade of reflux is not significantly different in symptomatic vs. asymptomatic babies. Though radionuclide scintigraphy is a simple, quick and non-invasive investigation in

  17. Statistical Models for Tornado Climatology: Long and Short-Term Views. (United States)

    Elsner, James B; Jagger, Thomas H; Fricker, Tyler


    This paper estimates regional tornado risk from records of past events using statistical models. First, a spatial model is fit to the tornado counts aggregated in counties with terms that control for changes in observational practices over time. Results provide a long-term view of risk that delineates the main tornado corridors in the United States where the expected annual rate exceeds two tornadoes per 10,000 square km. A few counties in the Texas Panhandle and central Kansas have annual rates that exceed four tornadoes per 10,000 square km. Refitting the model after removing the least damaging tornadoes from the data (EF0) produces a similar map but with the greatest tornado risk shifted south and eastward. Second, a space-time model is fit to the counts aggregated in raster cells with terms that control for changes in climate factors. Results provide a short-term view of risk. The short-term view identifies a shift of tornado activity away from the Ohio Valley under El Niño conditions and away from the Southeast under positive North Atlantic oscillation conditions. The combined predictor effects on the local rates are quantified by fitting the model after leaving out the year to be predicted from the data. The models provide state-of-the-art views of tornado risk that can be used by government agencies, the insurance industry, and the general public.
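As a rough illustration of the quantity the long-term spatial model estimates, an annual rate per 10,000 square km can be computed from a county's tornado count, record length, and area. The county names and counts below are hypothetical; the actual model additionally smooths rates across space and adjusts for reporting practices.

```python
def rate_per_10k_sqkm(count, years, area_sqkm):
    """Observed tornadoes per year, normalized to a 10,000 sq km area."""
    return count / years * 10_000 / area_sqkm

# Hypothetical records: (tornado count, record length in years, county area in sq km)
counties = {
    "county_A": (120, 65, 2400.0),
    "county_B": (40, 65, 5000.0),
}
rates = {name: rate_per_10k_sqkm(*rec) for name, rec in counties.items()}
```

A county exceeding the paper's two-per-10,000-sq-km threshold would sit inside one of the delineated tornado corridors.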

  18. Estimating the Term Structure With a Semiparametric Bayesian Hierarchical Model: An Application to Corporate Bonds (United States)

    Cruz-Marcelo, Alejandro; Ensor, Katherine B.; Rosner, Gary L.


    The term structure of interest rates is used to price defaultable bonds and credit derivatives, as well as to infer the quality of bonds for risk management purposes. We introduce a model that jointly estimates term structures by means of a Bayesian hierarchical model with a prior probability model based on Dirichlet process mixtures. The modeling methodology borrows strength across term structures for purposes of estimation. The main advantage of our framework is its ability to produce reliable estimators at the company level even when there are only a few bonds per company. After describing the proposed model, we discuss an empirical application in which the term structure of 197 individual companies is estimated. The sample of 197 consists of 143 companies with only one or two bonds. In-sample and out-of-sample tests are used to quantify the improvement in accuracy that results from approximating the term structure of corporate bonds with estimators by company rather than by credit rating, the latter being a popular choice in the financial literature. A complete description of a Markov chain Monte Carlo (MCMC) scheme for the proposed model is available as Supplementary Material. PMID:21765566
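The "borrowing strength" idea can be illustrated with a generic partial-pooling (shrinkage) estimator: a company with few bonds is pulled toward the cross-company mean, while a company with many bonds keeps an estimate close to its own data. This is only a schematic of hierarchical shrinkage, not the paper's Dirichlet process mixture model, and all variable names are illustrative.

```python
def shrunk_estimates(group_means, group_sizes, grand_mean, tau2, sigma2):
    """Partial-pooling estimates for group-level means.

    tau2:   between-group variance; sigma2: within-group (observation) variance.
    Groups with few observations get a small weight w on their own mean,
    so their estimate is pulled toward the grand mean."""
    estimates = []
    for m, n in zip(group_means, group_sizes):
        w = tau2 / (tau2 + sigma2 / n)  # weight on the group's own data
        estimates.append(w * m + (1 - w) * grand_mean)
    return estimates
```

With 143 of 197 companies having only one or two bonds, it is exactly this kind of pooling that keeps the company-level estimators stable.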

  19. Short-Term Memory for Serial Order: A Recurrent Neural Network Model (United States)

    Botvinick, Matthew M.; Plaut, David C.


    Despite a century of research, the mechanisms underlying short-term or working memory for serial order remain uncertain. Recent theoretical models have converged on a particular account, based on transient associations between independent item and context representations. In the present article, the authors present an alternative model, according…

  20. Specifications of the equations for LEAP Model 22C. [Long-Term Energy Analysis Program]

    Energy Technology Data Exchange (ETDEWEB)

    Goldstein, M.; Alsmiller, R.G. Jr.; Barish, J.


    This report describes the equations implemented in the Long-Term Energy Analysis Program (LEAP) Model 22C. The equilibrium equations of each of the processes contained in the LEAP Model 22C are specified, and the interrelations of the various equations are discussed. No attempt is made to derive the equations or to discuss their validity from an economic point of view.

  1. Integrated modeling of long-term vegetation and hydrologic dynamics in Rocky Mountain watersheds (United States)

    Robert Steven Ahl


    Changes in forest structure resulting from natural disturbances, or managed treatments, can have negative and long lasting impacts on water resources. To facilitate integrated management of forest and water resources, a System for Long-Term Integrated Management Modeling (SLIMM) was developed. By combining two spatially explicit, continuous time models, vegetation...

  2. Folk music style modelling by recurrent neural networks with long short term memory units


    Sturm, Bob; Santos, João Felipe; Korshunova, Iryna


    We demonstrate two generative models created by training a recurrent neural network (RNN) with three hidden layers of long short-term memory (LSTM) units. This extends past work in numerous directions, including training deeper models with nearly 24,000 high-level transcriptions of folk tunes. We discuss our on-going work.
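The core computation of an LSTM unit can be sketched in a few lines. The scalar, single-unit version below (with a hypothetical weight dictionary `W`; real networks use weight matrices and vectors) shows the gating that lets such models hold long-range musical context.

```python
import math

def lstm_cell(x, h, c, W):
    """One step of a single scalar LSTM unit (schematic only).
    Gates: input i, forget f, output o; g is the candidate update."""
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    i = sig(W["wi"] * x + W["ui"] * h + W["bi"])
    f = sig(W["wf"] * x + W["uf"] * h + W["bf"])
    o = sig(W["wo"] * x + W["uo"] * h + W["bo"])
    g = math.tanh(W["wg"] * x + W["ug"] * h + W["bg"])
    c_new = f * c + i * g          # memory cell: gated carry of old state
    h_new = o * math.tanh(c_new)   # hidden state passed to the next layer
    return h_new, c_new
```

Stacking three layers of such units and training on tune transcriptions, as the paper does, is what turns this cell into a generative model.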

  3. Semantic Vector Space Model: Implementation and Evaluation. (United States)

    Liu, Geoffrey Z.


    Presents the Semantic Vector Space Model, a text representation and searching technique based on the combination of Vector Space Model with heuristic syntax parsing and distributed representation of semantic case structures. In this model, both documents and queries are represented as semantic matrices, and retrieval is achieved by computing…

  4. Medium term operation planning - DECOMP model; Planejamento da operacao de medio prazo: Modelo DECOMP

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Simone; Gorenstin, Boris G.; Costa, Joari P. da [Centro de Pesquisas em Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Sa Junior, Cesar L. Correa de; Castro, Frederico G.S.M. [ELETROBRAS, Rio de Janeiro, RJ (Brazil)


    The interconnected hydrothermal system operation planning is usually divided into three stages: long-, short- and medium-term planning. Each of these stages has specific objectives and uses different models, information and representations of the generation system. This work describes the optimization model named DECOMP, developed for studies of medium-term operation planning. Besides being an adequate tool for monthly operation planning studies, the model also supports economic analyses. A case study is presented. 2 figs., 1 tab., 6 refs.

  5. The forward rates for multifactor model of term structure “with square root”


    Medvedev, G. A.


    The multifactor model “with square root” is discussed in detail. For such a model, the representation of the state variable process in integral form is derived and its covariance matrix is found. Special attention is given to the tendency of the term structure of long-term forward rates to slope downwards. For multifactor models with square root the following results are derived: representations of the forward rate curve through the volatility of the state ...
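A square-root state variable of the kind underlying such models follows dr = kappa(theta - r)dt + sigma*sqrt(r)dW, and a path can be sketched with a simple Euler scheme. The parameter values below are hypothetical, and this sketch does not reproduce the paper's analytical forward-rate representations; negative values are truncated at zero inside the diffusion term (a "full truncation" device) so the square root stays defined.

```python
import math
import random

def simulate_cir(r0, kappa, theta, sigma, dt, n_steps, seed=0):
    """Euler-Maruyama path of a square-root (CIR-type) process:
    dr = kappa*(theta - r)*dt + sigma*sqrt(max(r, 0))*dW."""
    rng = random.Random(seed)
    r = r0
    path = [r]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))
        r = r + kappa * (theta - r) * dt + sigma * math.sqrt(max(r, 0.0)) * dw
        path.append(r)
    return path
```

A multifactor version would simulate several such factors and sum their contributions to the short rate.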

  6. The Soul Mates Model: A Seven-Stage Model for Couple's Long-Term Relationship Development and Flourishing (United States)

    De La Lama, Luisa Batthyany; De La Lama, Luis; Wittgenstein, Ariana


    This article presents the integrative soul mates relationship development model, which provides the helping professionals with a conceptual map for couples' relationship development from dating, to intimacy, to soul mating, and long-term flourishing. This model is informed by a holistic, a developmental, and a positive psychology conceptualization…


    Directory of Open Access Journals (Sweden)

    Ienciu Nicoleta Maria


    Full Text Available In the classical theory of economics, capital is one of the three factors of production, in addition to land and labor, and refers in particular to buildings, equipment, machinery etc. used for the production of other goods (the term physical capital is also used in the specialized literature) (Bratianu and Jianu, 2006). The present study intends to bring to the forefront the main evaluation methods for intellectual capital, as proposed, supported and criticized by researchers and practitioners. The study offers responses to the following research questions: Which are the advantages and disadvantages of the intellectual capital evaluation methods? And what are the main studies approaching the subject of intellectual capital evaluation at the international level? The collection and analysis of intellectual capital evaluation models and non-participative observation are the main instruments used to survey the main existing international evaluation frameworks. The information sources for this research consist especially of articles published in specialized journals in both the accounting and economics fields, specialized works relevant to the field, legislative documents, official documents, press releases and other documents issued by various national and international bodies. The most representative studies on the evaluation of intellectual capital are those elaborated by Mouritsen et al (2001), Manea and Gorgan (2003), Tayles (2002) and Tayles et al (2007). The presented approaches offer a general idea of the range of methods, disciplines and operational specializations existing for the evaluation of intellectual capital. Only one of them, the Balanced Scorecard, is widely used, while the rest of the methods remain too theoretical or too poorly developed to be universally accepted. 
We believe that

  8. Animal Models to Study the Role of Long-Term Hypergastrinemia in Gastric Carcinogenesis

    Directory of Open Access Journals (Sweden)

    Reidar Fossmark


    Full Text Available Patients with chronic hypergastrinemia due to chronic atrophic gastritis or gastrinomas have an increased risk of developing gastric malignancy, and it has been questioned whether patients with hypergastrinemia caused by long-term use of acid-inhibiting drugs are also at risk. Gastric carcinogenesis in humans is affected by numerous factors and progresses slowly over years. When using animal models with the possibility of intervention, a complex process can be dissected by studying the role of hypergastrinemia in carcinogenesis within a relatively short period of time. We have reviewed findings from relevant animal models in which the gastric changes of long-term hypergastrinemia have been investigated. In all species where long-term hypergastrinemia has been induced, there is an increased risk of gastric malignancy. There is evidence that hypergastrinemia is a common causative factor in carcinogenesis in the oxyntic mucosa, while other cofactors may vary in the different models.

  9. Density-dependent microbial turnover improves soil carbon model predictions of long-term litter manipulations (United States)

    Georgiou, Katerina; Abramoff, Rose; Harte, John; Riley, William; Torn, Margaret


    Climatic, atmospheric, and land-use changes all have the potential to alter soil microbial activity via abiotic effects on soil or mediated by changes in plant inputs. Recently, many promising microbial models of soil organic carbon (SOC) decomposition have been proposed to advance understanding and prediction of climate and carbon (C) feedbacks. Most of these models, however, exhibit unrealistic oscillatory behavior and SOC insensitivity to long-term changes in C inputs. Here we diagnose the sources of instability in four models that span the range of complexity of these recent microbial models, by sequentially adding complexity to a simple model to include microbial physiology, a mineral sorption isotherm, and enzyme dynamics. We propose a formulation that introduces density-dependence of microbial turnover, which acts to limit population sizes and reduce oscillations. We compare these models to results from 24 long-term C-input field manipulations, including the Detritus Input and Removal Treatment (DIRT) experiments, to show that there are clear metrics that can be used to distinguish and validate the inherent dynamics of each model structure. We find that widely used first-order models and microbial models without density-dependence cannot readily capture the range of long-term responses observed across the DIRT experiments as a direct consequence of their model structures. The proposed formulation improves predictions of long-term C-input changes, and implies greater SOC storage associated with CO2-fertilization-driven increases in C inputs over the coming century compared to common microbial models. Finally, we discuss our findings in the context of improving microbial model behavior for inclusion in Earth System Models.
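The density-dependence idea can be illustrated with a minimal two-pool sketch: raising microbial turnover to a power beta > 1 penalizes large populations, limiting biomass and damping the oscillations of the linear (beta = 1) case. All parameter values below are hypothetical and this is not one of the four models compared in the study.

```python
def step(C, B, I=1.0, eps=0.4, Vmax=0.5, Km=200.0, k=0.02, beta=2.0, dt=0.1):
    """One Euler step of a minimal microbial SOC model (hypothetical parameters).

    C: soil organic carbon pool; B: microbial biomass.
    Uptake U is Michaelis-Menten in C; turnover k*B**beta is
    density-dependent for beta > 1."""
    U = Vmax * B * C / (Km + C)      # microbial uptake of SOC
    dC = I - U + k * B ** beta       # litter inputs, uptake, necromass return
    dB = eps * U - k * B ** beta     # growth minus density-dependent turnover
    return C + dt * dC, B + dt * dB
```

Integrating this forward from an initial state approaches a steady state without the persistent oscillations that plague beta = 1 formulations, which is the behavior the DIRT comparisons favor.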

  10. Evaluation of Short-Term Bioassays to Predict Functional Impairment. Selected Short-Term Hepatic Toxicity tests. (United States)


    as ethanol, is usually enlargement rather than proliferation. Cytoplasmic poisons, like heavy doses of carbon tetrachloride, cause dissolution of... Paracetamol-Induced Hepatic Necrosis." Gut 16(10):800-807. Dobbins, W.O., E.L. Rollins, S.G. Brooks and A.J. Fallon, 1972. "A Quantitative...Model of Liver Injury Using Paracetamol Treatment of Liver Slices and Prevention of Injury by Some Antioxidants." Biochemical Pharmacology 27(4):425-430

  11. Evaluating temporal consistency of long-term global NDVI datasets for trend analysis

    DEFF Research Database (Denmark)

    Tian, Feng; Fensholt, Rasmus; Verbesselt, Jan


    As a way to understand vegetation changes, trend analysis on NDVI (normalized difference vegetation index) time series data have been widely performed at regional to global scales. However, most long-term NDVI datasets are based upon multiple sensor systems and unsuccessful corrections related...... to sensor shifts potentially introduce substantial uncertainties and artifacts in the analysis of trends. The temporal consistency of NDVI datasets should therefore be evaluated before performing trend analysis to obtain reliable results. In this study we analyze the temporal consistency of multi......'Observation de la Terre VEGETATION). Single sensor time series from MODIS (MODerate Resolution Imaging Spectroradiometer) Terra and Aqua are used as reference datasets. The global land surface is divided into six regions according to the world humidity zones and averaged NDVI time series in each region...

  12. Gasbuggy, New Mexico Long-Term Hydrologic Monitoring Program Evaluation Report

    Energy Technology Data Exchange (ETDEWEB)



    This report summarizes an evaluation of the Long-Term Hydrologic Monitoring Program (LTHMP) that has been conducted since 1972 at the Gasbuggy, New Mexico underground nuclear detonation site. The nuclear testing was conducted by the U.S. Atomic Energy Commission under the Plowshare program, which is discussed in greater detail in Appendix A. The detonation at Gasbuggy took place in 1967, 4,240 feet below ground surface, and was designed to fracture the host rock of a low-permeability natural gas-bearing formation in an effort to improve gas production. The site has historically been managed under the Nevada Offsites Project. These underground nuclear detonation sites are within the United States but outside of the Nevada Test Site where most of the experimental nuclear detonations conducted by the U.S. Government took place. Gasbuggy is managed by the U.S. Department of Energy (DOE) Office of Legacy Management (LM).

  13. Long-term evaluation of a Canadian back pain mass media campaign. (United States)

    Suman, Arnela; Bostick, Geoffrey P; Schopflocher, Donald; Russell, Anthony S; Ferrari, Robert; Battié, Michele C; Hu, Richard; Buchbinder, Rachelle; Gross, Douglas P


    This paper evaluates the long-term impact of a Canadian mass media campaign on general public beliefs about staying active when experiencing low back pain (LBP). Changes in beliefs about staying active during an episode of LBP were studied using telephone and web-based surveys. Logistic regression analysis was used to investigate changes in beliefs over time and the effect of exposure to campaign messaging. The percentage of survey respondents agreeing that they should stay active through LBP increased annually, from 58.9% to ~72.0%. Respondents reporting exposure to campaign messaging were statistically significantly more likely to agree with staying active than respondents who did not report exposure to campaign messaging (adjusted OR, 95% CI = 1.96, 1.73-2.21). The mass media campaign had a continued impact on public LBP beliefs over the course of 7 years. Improvements over time were associated with exposure to campaign messaging.

  14. Symbolic Evaluation Graphs and Term Rewriting — A General Methodology for Analyzing Logic Programs

    DEFF Research Database (Denmark)

    Giesl, J.; Ströder, T.; Schneider-Kamp, P.


    There exist many powerful techniques to analyze termination and complexity of term rewrite systems (TRSs). Our goal is to use these techniques for the analysis of other programming languages as well. For instance, approaches to prove termination of definite logic programs by a transformation...... to TRSs have been studied for decades. However, a challenge is to handle languages with more complex evaluation strategies (such as Prolog, where predicates like the cut influence the control flow). We present a general methodology for the analysis of such programs. Here, the logic program is first...... information on the termination or complexity of the original logic program. More information can be found in the full paper [1]. © 2013 Springer-Verlag....

  15. Improving Service Quality in Long-term Care Hospitals: National Evaluation on Long-term Care Hospitals and Employees Perception of Quality Dimensions. (United States)

    Kim, Jinkyung; Han, Woosok


    To investigate predictors for specific dimensions of service quality perceived by hospital employees in long-term care hospitals. Data collected from a survey of 298 hospital employees in 18 long-term care hospitals were analysed. Multivariate ordinary least squares regression analysis with hospital fixed effects was used to determine the predictors of service quality using respondents' and organizational characteristics. The most significant predictors of employee-perceived service quality were job satisfaction and degree of consent on national evaluation criteria. National evaluation results on long-term care hospitals and work environment also had positive effects on service quality. The findings of the study show that organizational characteristics are significant determinants of service quality in long-term care hospitals. Assessment of the extent to which hospitals address factors related to employee-perceived quality of services could be the first step in quality improvement activities. Results have implications for efforts to improve service quality in long-term care hospitals and designing more comprehensive national evaluation criteria.

  16. Ocular surface evaluation in eyes with chronic glaucoma on long term topical antiglaucoma therapy

    Directory of Open Access Journals (Sweden)

    Manu Saini


    Full Text Available AIM: To evaluate ocular surface changes and their correlation with the central corneal subbasal nerve fibre layer in chronic glaucoma patients. METHODS: A prospective comparative study of ocular surface evaluation was performed in 50 eyes of 25 patients using two or more antiglaucoma medications for at least 6mo and 50 eyes of 25 normal subjects without any ocular problems as controls. The study parameters evaluated included visual acuity, intraocular pressure, ocular surface evaluation parameters [fluorescein break-up time (FTBUT), Schirmer’s I test, ocular surface staining scores and ocular surface disease index score (OSDI)], central corneal sensation (Cochet-Bonnet aesthesiometer) and central subbasal nerve fiber layer density (SBNFLD) by confocal microscopy. RESULTS: The mean values in the glaucoma cases and control groups respectively were as follows: OSDI score (35.89±16.07/6.02±3.84; P=0.001), Schirmer’s I test score (7.63±2.64 mm/12.86±1.93 mm; P=0.001), FTBUT (9.44±2.76s/11.8±1.88s; P=0.001), corneal (5.7±2.33/1.1±0.58; P=0.001) and conjunctival staining score (5.06±1.94/0.84±0.46; P=0.001), corneal sensitivity (4.68±0.44/5.07±0.37; P=0.076), mean subbasal nerve fiber number (3.58±0.99/5.40±1.70; P=0.001), SBNFL length (1101.44±287.56 μm/1963.70±562.56 μm; P=0.001) and density (6883.94±1798.03 μm/mm2/12 273.15±3516.04 μm/mm2; P=0.001). Dry eye severity of level 2 and 3 was seen in 66% of the glaucoma group. Corneal (R²=0.86) and conjunctival staining (R²=0.71) and OSDI score (R²=0.67) showed statistically significant negative correlation with central corneal SBNFLD, while FTBUT (R²=0.84) and corneal sensitivity (R²=0.52) showed positive correlation to central corneal SBNFLD in the long-term topical antiglaucoma medication group. CONCLUSION: Ocular surface changes and antiglaucoma-therapy-induced dry eye are found to be associated with decreased SBNFLD in eyes on long-term topical antiglaucoma medications.

  17. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)


    Discusses the issues in value-at-risk modeling and evaluation: the value of value at risk; horizon problems and extreme events in financial risk management; methods of evaluating value-at-risk estimates.
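One standard way of producing the value-at-risk estimates whose evaluation is at issue here is historical simulation. The sketch below assumes a simple empirical-quantile definition and is not taken from any paper in this collection.

```python
def historical_var(returns, alpha=0.99):
    """One-period value-at-risk by historical simulation: the loss
    threshold exceeded with probability roughly (1 - alpha),
    taken as an empirical quantile of observed losses."""
    losses = sorted(-r for r in returns)          # losses are negated returns
    idx = int(alpha * len(losses))                # index of the alpha-quantile
    return losses[min(idx, len(losses) - 1)]
```

The horizon and extreme-event problems the collection discusses arise precisely because such empirical quantiles become unreliable in the far tail and over long horizons.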

  18. Evaluation of the antipsychotic medication review process at four long-term facilities in Alberta

    Directory of Open Access Journals (Sweden)

    Birney A


    Full Text Available Arden Birney,1 Paola Charland,1 Mollie Cole,2 Mubashir Aslam Arain1 1Workforce Research & Evaluation, 2Seniors Health Strategic Clinical Network, Alberta Health Services, Calgary, AB, Canada Purpose: The goal of this evaluation was to understand how four long-term care (LTC) facilities in Alberta have implemented medication reviews for the Appropriate Use of Antipsychotics (AUA) initiative. We aimed to determine how interprofessional (IP) collaboration was incorporated in the antipsychotic medication reviews and how the reviews had been sustained. Methods: Four LTC facilities in Alberta participated in this evaluation. We conducted semistructured interviews with 18 facility staff and observed one antipsychotic medication review at each facility. We analyzed data according to the following key components that we identified as relevant to the antipsychotic medication reviews: the structure of the reviews, IP interactions between the staff members, and strategies for sustaining the reviews. Results: The duration of antipsychotic medication reviews ranged from 1 to 1.5 hours. The number of professions in attendance ranged from 3 to 9; a pharmacist led the review at two sites, while a registered nurse led the review at one site and a nurse practitioner at the remaining site. The number of residents discussed during the review ranged from 6 to 20. The process at some facilities was highly IP, demonstrating each of the six IP competencies. Other facilities conducted the review in a less IP manner due to challenges of physician involvement and staff workload, particularly of health care aides. Facilities that had a nurse practitioner on site were more efficient with the process of implementing recommendations resulting from the medication reviews. Conclusion: The LTC facilities were successful in implementing the medication review process and the process seemed to be sustainable. 
A few challenges were observed in the implementation process at two facilities

  19. Evaluation of multivariate calibration models transferred between spectroscopic instruments

    DEFF Research Database (Denmark)

    Eskildsen, Carl Emil Aae; Hansen, Per W.; Skov, Thomas


    for the same samples using the transferred model. However, sometimes the success of a model transfer is evaluated by comparing the transferred model predictions with the reference values. This is not optimal, as uncertainties in the reference method will impact the evaluation. This paper proposes a new method...... for calibration model transfer evaluation. The new method is based on comparing predictions from different instruments, rather than comparing predictions and reference values. A total of 75 flour samples were available for the study. All samples were measured on ten near infrared (NIR) instruments from two......, this paper highlights the problems of including reference values in the evaluation of a model transfer, as uncertainties in the reference method impact the evaluation. At the same time, this paper highlights the power of the proposed model transfer evaluation, which is based on comparing predictions obtained...
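The proposed evaluation compares predictions from different instruments on the same samples, rather than comparing predictions against reference values. A minimal sketch of one such comparison statistic follows; the choice of root-mean-square difference is an assumption for illustration, and the paper may use other statistics.

```python
import math

def rms_prediction_difference(pred_a, pred_b):
    """Root-mean-square difference between predictions from two instruments
    on the same samples. Small values indicate a successful model transfer,
    without involving (uncertain) reference-method values."""
    if len(pred_a) != len(pred_b):
        raise ValueError("prediction lists must cover the same samples")
    n = len(pred_a)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(pred_a, pred_b)) / n)
```

The appeal of this design is that reference-method uncertainty, which would inflate any prediction-versus-reference error, cancels out of the instrument-to-instrument comparison.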

  20. Long-term effects of methadone maintenance treatment with different psychosocial intervention models.

    Directory of Open Access Journals (Sweden)

    Lirong Wang

    Full Text Available This study evaluated the long-term effects of different psychosocial intervention models in methadone maintenance treatment (MMT) in Xi'an, China. Patients from five MMT clinics were divided into three groups receiving MMT only, MMT with counseling psychology (CP) or MMT with contingency management (CM). A five-year follow-up was carried out with daily records of medication, monthly random urine morphine tests, and tests for anti-HIV and anti-HCV every six months. Drug use behavior was recorded six months after initial recruitment using a survey. Adjusted RRs and their 95% confidence intervals (CIs) were estimated using an unconditional logistic regression model or a Cox proportional hazard model. A total of 2662 patients were recruited with 797 in MMT, 985 in MMT with CP, and 880 in MMT with CM. Following six months of treatment, the injection rates of the MMT with CP and MMT with CM groups were significantly lower than that of MMT (5.1% and 6.9% vs. 16.3%, x²  =  47.093 and 29.908, respectively; P<0.05). HIV incidences for MMT, MMT with CP and MMT with CM at the five-year follow-up were 20.09, 0.00 and 10.02 per ten thousand person-years, respectively. HCV incidences were 18.35, 4.42 and 6.61 per hundred person-years, respectively, demonstrating that CP and CM were protective factors for HCV incidence (RR  =  0.209 and 0.414, with range of 0.146-0.300 and 0.298-0.574, respectively). MMT supplemented with CP or CM can reduce heroin use and related risk behaviors, thereby reducing the incidence of HIV and HCV.

  1. Long-Term Effects of Methadone Maintenance Treatment with Different Psychosocial Intervention Models (United States)

    Wang, Lirong; Wei, Xiaoli; Wang, Xueliang; Li, Jinsong; Li, Hengxin; Jia, Wei


    This study evaluated the long-term effects of different psychosocial intervention models in methadone maintenance treatment (MMT) in Xi'an China. Patients from five MMT clinics were divided into three groups receiving MMT only, MMT with counseling psychology (CP) or MMT with contingency management (CM). A five-year follow-up was carried out with daily records of medication, monthly random urine morphine tests, and tests for anti-HIV and anti-HCV every six months. Drug use behavior was recorded six months after initial recruitment using a survey. Adjusted RRs and their 95% confidence intervals (CIs) were estimated using an unconditional logistic regression model or a Cox proportional hazard model. A total of 2662 patients were recruited with 797 in MMT, 985 in MMT with CP, and 880 in MMT with CM. Following six months of treatment, the injection rates of MMT with CP and MMT with CM groups were significantly lower than that of MMT (5.1% and 6.9% vs. 16.3%, x2  =  47.093 and 29.908, respectively; P<0.05). HIV incidences for MMT, MMT with CP and MMT with CM at the five year follow-up were 20.09, 0.00 and 10.02 per ten thousand person-years, respectively. HCV incidences were 18.35, 4.42 and 6.61 per hundred person-years, respectively, demonstrating that CP and CM were protective factors for HCV incidence (RR  =  0.209 and 0.414, with range of 0.146 – 0.300 and 0.298 – 0.574, respectively). MMT supplemented with CP or CM can reduce heroin use and related risk behaviors, thereby reducing the incidence of HIV and HCV. PMID:24498406

  2. Long-term effects of methadone maintenance treatment with different psychosocial intervention models. (United States)

    Wang, Lirong; Wei, Xiaoli; Wang, Xueliang; Li, Jinsong; Li, Hengxin; Jia, Wei


    This study evaluated the long-term effects of different psychosocial intervention models in methadone maintenance treatment (MMT) in Xi'an China. Patients from five MMT clinics were divided into three groups receiving MMT only, MMT with counseling psychology (CP) or MMT with contingency management (CM). A five-year follow-up was carried out with daily records of medication, monthly random urine morphine tests, and tests for anti-HIV and anti-HCV every six months. Drug use behavior was recorded six months after initial recruitment using a survey. Adjusted RRs and their 95% confidence intervals (CIs) were estimated using an unconditional logistic regression model or a Cox proportional hazard model. A total of 2662 patients were recruited with 797 in MMT, 985 in MMT with CP, and 880 in MMT with CM. Following six months of treatment, the injection rates of MMT with CP and MMT with CM groups were significantly lower than that of MMT (5.1% and 6.9% vs. 16.3%, x²  =  47.093 and 29.908, respectively; P<0.05). HIV incidences for MMT, MMT with CP and MMT with CM at the five year follow-up were 20.09, 0.00 and 10.02 per ten thousand person-years, respectively. HCV incidences were 18.35, 4.42 and 6.61 per hundred person-years, respectively, demonstrating that CP and CM were protective factors for HCV incidence (RR  =  0.209 and 0.414, with range of 0.146-0.300 and 0.298-0.574, respectively). MMT supplemented with CP or CM can reduce heroin use and related risk behaviors, thereby reducing the incidence of HIV and HCV.

  3. Evaluation model development for sprinkler irrigation uniformity ...

    African Journals Online (AJOL)


    2010). The most commonly used term for placing a numerical value on uniformity of application for agricultural irrigation systems is Christiansen's coefficient of uniformity, expressed as a percent (Christiansen, 1942). It is based on the absolute deviation of individual amounts from the mean amount. Another parameter that.
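Christiansen's coefficient can be computed directly from a set of measured application depths. The sketch below implements the standard 1942 formula, CU = 100 * (1 - sum|x_i - mean| / (n * mean)); the sample depths in the test are hypothetical.

```python
def christiansen_cu(depths):
    """Christiansen's coefficient of uniformity, in percent.

    Based on the absolute deviation of individual application
    amounts from the mean amount; 100 means perfect uniformity."""
    n = len(depths)
    mean = sum(depths) / n
    abs_dev = sum(abs(x - mean) for x in depths)
    return 100.0 * (1.0 - abs_dev / (n * mean))
```

Perfectly uniform catch-can depths give CU = 100%, and CU falls as the spread of depths around the mean grows.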

  4. Long-term evaluation of osseointegrated implants in regenerated and nonregenerated bone. (United States)

    Corrente, G; Abundo, R; Cardaropoli, D; Cardaropoli, G; Martuscelli, G


    This investigation evaluated the predictability of dental implants subjected to bone regeneration procedures at the time of insertion. Fifty-two test implants were inserted into sites with periimplant bone defects. A calcium carbonate allograft material with or without a fibrin-fibronectin sealing system was used to fill the defects. Sixty control implants were inserted into an adequate volume of nonaugmented bone. Each of the 29 study patients received at least one test implant and one control implant. At the second-stage surgery, fill of the bone defect was assessed as complete or incomplete. The cumulative success rate was 91.7% (mean follow-up 55 mo) for the test implants and 93.2% (mean follow-up 59 mo) for the control implants. Within the test group, implants with complete bone fill achieved 97.6% success versus 59.1% success for implants with incomplete bone fill. These preliminary results suggest that implants placed with simultaneous bone regeneration procedures achieve long-term predictability that is comparable to that of implants placed in an adequate volume of bone, provided that complete bone fill of the periimplant defect is achieved. Long-term studies with other augmentation materials are needed to fully validate these findings.

  5. Internal evaluation of a physically-based distributed model using data from a Mediterranean mountain catchment

    Directory of Open Access Journals (Sweden)

    S. P. Anderton


    Full Text Available An evaluation of the performance of a physically-based distributed model of a small Mediterranean mountain catchment is presented. This was carried out using hydrological response data, including measurements of runoff, soil moisture, phreatic surface level and actual evapotranspiration. A-priori model parameterisation was based as far as possible on property data measured in the catchment. Limited model calibration was required to identify an appropriate value for terms controlling water loss to a deeper regional aquifer. The model provided good results for an initial calibration period, when judged in terms of catchment discharge. However, model performance for runoff declined substantially when evaluated against a consecutive, rather drier, period of data. Evaluation against other catchment responses allowed identification of the problems responsible for the observed lack of model robustness in flow simulation. In particular, it was shown that an incorrect parameterisation of the soil water model was preventing adequate representation of drainage from soils during hydrograph recessions. This excess moisture was then being removed via an overestimation of evapotranspiration. It also appeared that the model underestimated canopy interception. The results presented here suggest that model evaluation against catchment scale variables summarising its water balance can be of great use in identifying problems with model parameterisation, even for distributed models. Evaluation using spatially distributed data yielded less useful information on model performance, owing to the relative sparseness of data points, and problems of mismatch of scale between the measurement and the model grid. Keywords: physically-based distributed model, SHETRAN, parameterisation, Mediterranean mountain catchment, internal evaluation, multi-response

  6. Mandibular distraction in hemifacial microsomia is not a permanent treatment: a long-term evaluation. (United States)

    Ascenço, Adriana Sayuri Kurogi; Balbinot, Priscilla; Junior, Ivan Maluf; D'Oro, Ubiratan; Busato, Luciano; da Silva Freitas, Renato


    Hemifacial microsomia presents with abnormalities including short ramus, absence of condyle, abnormal canting, deviated chin, and facial asymmetry. Many studies about distraction osteogenesis have been published over the last 20 years, but without long-term follow-up. The aim of this study was to evaluate patients with unilateral craniofacial microsomia who were treated with mandible distraction and with follow-up of more than 5 years. The following retrospective study was evaluated and approved by the Assistance Center for Cleft Lip and Palate. Data were compiled from the charts of 33 patients with hemifacial microsomia who underwent unilateral mandible distraction. Average age at time of procedure was 7.3 years, with an average degree of distraction of 20 mm. Seventy percent of cases were treated with internal distraction, 30% external. Follow-up varied between 5 and 15 years, with a mean follow-up of 9 years. Ninety percent of the 33 patients in the study had recurrence of their asymmetry. Mean time to postsurgical recurrence was 44 months. Thirty patients were referred for orthognathic surgery. Six patients have already undergone corrective bimaxillary surgery. One patient underwent genioplasty only, and 1 patient underwent genioplasty with orthognathic jaw surgery. Twenty-two patients are awaiting orthognathic surgery, including one with temporomandibular joint ankylosis. Only 3 subjects had good outcomes, without signs of recurrence. Bone distraction once seemed a promising long-term option for treatment of craniofacial microsomia. However, this has not proven effective for all cases, and most patients needed subsequent orthognathic surgery.

  7. Evaluation of a brief anti-stigma campaign in Cambridge: do short-term campaigns work?

    Directory of Open Access Journals (Sweden)

    Henderson Claire


    Full Text Available Abstract Background In view of the high costs of mass-media campaigns, it is important to understand whether it is possible for a media campaign to have significant population effects over a short period of time. This paper explores this question specifically in reference to stigma and discrimination against people with mental health problems, using the Time to Change Cambridge anti-stigma campaign as an example. Methods 410 face-to-face interviews were performed pre, during and post campaign activity to assess campaign awareness and mental health-related knowledge, attitudes and behaviours. Results Although campaign awareness was not sustained following campaign activity, significant and sustained shifts occurred for mental health-related knowledge items. Specifically, there was a 24% rise in agreement with the statement: 'If a friend had a mental health problem, I know what advice to give them to get professional help', following the campaign. Additionally, for the statement: 'Medication can be an effective treatment for people with mental health problems', there was a 10% rise (p = 0.05) in the proportion of interviewees responding 'agree' or 'strongly agree' following the campaign. These changes, however, were not evident for attitudinal or behaviour-related questions. Conclusions Although these results only reflect the impact of one small-scale campaign, these preliminary findings suggest several considerations for mass-media campaign development and evaluation strategies, such as: (1) aiming to influence outcomes pertaining to knowledge in the short term; (2) planning realistic and targeted outcomes over the short, medium and long term during sustained campaigns; and (3) monitoring indirect campaign effects such as social discourse or other social networking/contact in the evaluation.

  8. Long-term exposure models for traffic-related NO₂ across geographically diverse areas over separate years (United States)

    Sally Liu, L.-J.; Tsai, Ming-Yi; Keidel, Dirk; Gemperli, Armin; Ineichen, Alex; Hazenkamp-von Arx, Marianne; Bayer-Oglesby, Lucy; Rochat, Thierry; Künzli, Nino; Ackermann-Liebrich, Ursula; Straehl, Peter; Schwartz, Joel; Schindler, Christian


    Although recent air pollution epidemiologic studies have embraced land-use regression models for estimating outdoor traffic exposure, few have examined the spatio-temporal variability of traffic-related pollution over a long-term period and the optimal methods for taking these factors into account in exposure estimates. We used home outdoor NO₂ measurements taken from eight geographically diverse areas to examine spatio-temporal variations and to construct and evaluate models that could best predict the within-city contrasts in observations. Passive NO₂ measurements were taken outside of up to 100 residences per area over three seasons in 1993 and 2003 as part of the Swiss cohort study on air pollution and lung and heart disease in adults (SAPALDIA). The spatio-temporal variation of NO₂ differed by area and year. Regression models constructed using the annual NO₂ means from central monitoring stations and geographic parameters predicted home outdoor NO₂ levels better than a dispersion model. However, both the regression and dispersion models underestimated the within-city contrasts of NO₂ levels. Our results indicated that the best models should be constructed for individual areas and years, and would use the dispersion estimates as the urban background, geographic information system (GIS) parameters to enhance local characteristics, and temporal and meteorological variables to capture changing local dynamics. Such models would be powerful tools for assessing health effects from long-term exposure to air pollution in a large cohort.
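    A minimal sketch of the land-use-regression idea described above, fitted by ordinary least squares on synthetic data; the single traffic predictor and all coefficients are invented for illustration and are not the SAPALDIA model.

```python
import random

# Toy land-use regression: predict home outdoor NO2 (synthetic units)
# from one traffic predictor plus an intercept. All numbers are invented.
random.seed(0)
n = 100
traffic = [random.uniform(0, 5000) for _ in range(n)]           # traffic load proxy
no2 = [15.0 + 0.004 * t + random.gauss(0, 2) for t in traffic]  # synthetic NO2

# Ordinary least squares for a single predictor.
mx = sum(traffic) / n
my = sum(no2) / n
slope = sum((x - mx) * (y - my) for x, y in zip(traffic, no2)) / \
        sum((x - mx) ** 2 for x in traffic)
intercept = my - slope * mx
print(round(intercept, 2), round(slope, 5))
```

A real application would add further GIS-derived predictors (land use, elevation, background concentration) as extra regression terms.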

  9. Short-term spheroid culture of primary colorectal cancer cells as an in vitro model for personalizing cancer medicine

    DEFF Research Database (Denmark)

    Jeppesen, Maria; Hagel, Grith; Glenthoj, Anders


    ... for increasing treatment efficacy is to test the chemosensitivity of cancer cells obtained from the patient's tumour. 3D culture represents a promising method for modelling patient tumours in vitro. The aim of this study was therefore to evaluate how closely short-term spheroid cultures of primary colorectal cancer cells resemble the original tumour. Colorectal cancer cells were isolated from human tumour tissue and cultured as spheroids. Spheroid cultures were established with a high success rate and remained viable for at least 10 days. The spheroids exhibited significant growth over a period of 7 days ... and combinations most commonly used for treatment of colorectal cancer. In summary, short-term spheroid culture of primary colorectal adenocarcinoma cells represents a promising in vitro model for use in personalized medicine.

  10. Postsynaptic Signal Transduction Models for Long-Term Potentiation and Depression (United States)

    Manninen, Tiina; Hituri, Katri; Kotaleski, Jeanette Hellgren; Blackwell, Kim T.; Linne, Marja-Leena


    More than a hundred biochemical species, activated by neurotransmitters binding to transmembrane receptors, are important in long-term potentiation (LTP) and long-term depression (LTD). To investigate which species and interactions are critical for synaptic plasticity, many computational postsynaptic signal transduction models have been developed. The models range from simple models with a single reversible reaction to detailed models with several hundred kinetic reactions. In this study, more than a hundred models are reviewed, and their features are compared and contrasted so that similarities and differences are more readily apparent. The models are classified according to the type of synaptic plasticity that is modeled (LTP or LTD) and whether they include diffusion or electrophysiological phenomena. Other characteristics that discriminate the models include the phase of synaptic plasticity modeled (induction, expression, or maintenance) and the simulation method used (deterministic or stochastic). We find that models are becoming increasingly sophisticated, by including stochastic properties, integrating with electrophysiological properties of entire neurons, or incorporating diffusion of signaling molecules. Simpler models continue to be developed because they are computationally efficient and allow theoretical analysis. The more complex models permit investigation of mechanisms underlying specific properties and experimental verification of model predictions. Nonetheless, it is difficult to fully comprehend the evolution of these models because (1) several models are not described in detail in the publications, (2) only a few models are provided in existing model databases, and (3) comparison to previous models is lacking. We conclude that the value of these models for understanding molecular mechanisms of synaptic plasticity is increasing and will be enhanced further with more complete descriptions and sharing of the published models. PMID:21188161
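    The simplest model class surveyed above, a single reversible reaction, can be simulated deterministically in a few lines; the sketch below uses forward-Euler integration of mass-action kinetics with illustrative rate constants.

```python
# Deterministic simulation of a single reversible reaction A <-> B,
# the simplest model class surveyed (rate constants are illustrative).
def simulate(a0, b0, kf, kr, dt=0.001, steps=10000):
    a, b = a0, b0
    for _ in range(steps):
        flux = kf * a - kr * b      # net A -> B flux (mass action)
        a -= flux * dt
        b += flux * dt
    return a, b

a, b = simulate(a0=1.0, b0=0.0, kf=2.0, kr=1.0)
# At equilibrium kf*a == kr*b, so the ratio b/a relaxes to kf/kr.
print(round(b / a, 2))
```

Stochastic counterparts of the same reaction (e.g. Gillespie simulation) replace the continuous flux with discrete reaction events, which matters at the small copy numbers typical of dendritic spines.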

  11. Model evaluation of marine primary organic aerosol emission schemes

    Directory of Open Access Journals (Sweden)

    B. Gantt


    Full Text Available In this study, several marine primary organic aerosol (POA) emission schemes have been evaluated using the GEOS-Chem chemical transport model in order to provide guidance for their implementation in air quality and climate models. These emission schemes, based on varying dependencies of chlorophyll a concentration ([chl a]) and 10 m wind speed (U10), have large differences in their magnitude, spatial distribution, and seasonality. Model comparison with weekly and monthly mean values of the organic aerosol mass concentration at two coastal sites shows that the source function exclusively related to [chl a] does a better job replicating surface observations. Sensitivity simulations in which the negative U10 and positive [chl a] dependences of the organic mass fraction of sea spray aerosol are enhanced show improved prediction of the seasonality of the marine POA concentrations. A top-down estimate of submicron marine POA emissions, based on the parameterization that compares best to the observed weekly and monthly mean values of marine organic aerosol surface concentrations, has a global average emission rate of 6.3 Tg yr−1. Evaluation of existing marine POA source functions against a case study during which marine POA contributed the major fraction of submicron aerosol mass shows that none of the existing parameterizations are able to reproduce the hourly-averaged observations. Our calculations suggest that in order to capture episodic events and short-term variability in submicron marine POA concentration over the ocean, new source functions need to be developed that are grounded in the physical processes unique to the organic fraction of sea spray aerosol.

  12. Long term peatland subsidence: Experimental study and modeling scenarios in the Venice coastland (United States)

    Zanello, Francesca; Teatini, Pietro; Putti, Mario; Gambolati, Giuseppe


    Land subsidence in drained cultivated peatlands is responsible for a number of serious environmental concerns and economical problems at both the local and the global scale. In low-lying coastal areas it enhances the risk of flooding, the saltwater contamination of shallow aquifers, and the maintenance costs of the systems that help keep the farmland drained. Since the subsidence is a major consequence of the bio-oxidation of the soil organic fraction in the upper aerated zone, cropped peatlands in temperate and tropic regions are important sources of CO2 into the atmosphere. A 4-year long experimental study has been performed in a drained peatland located south of the Venice Lagoon, Italy, to help calibrate a land subsidence model developed to predict the expected behavior of the ground surface elevation. Continuous monitoring of the hydrological regime and land displacements shows that the vertical movement of the peat surface consists of the superimposition of daily/seasonal time-scale reversible deformations related to soil moisture, depth to the water table, and temperature fluctuations, and long term irreversible subsidence due to peat oxidation. A novel two-step modeling approach to separate the two contributions from the available observations is presented. First, the elastic component is computed by integrating the peat vertical deformations evaluated by a constitutive relationship describing the porosity variation with the moisture content and pore pressure changes implemented into a variably saturated flow equation-based numerical code. The observed trend is then filtered from the computed reversible displacement and is used to calibrate an empirical relationship relating land subsidence rate to drainage depth and soil temperature. The results show that in recent years the subsidence rate ranged from 3 to 15 mm a-1. 
The large variability is due to the different climate conditions underlying the monitoring period, in particular a wet 2002 and a very dry

  13. Magnetized string cosmological model in cylindrically symmetric inhomogeneous universe with time dependent cosmological-term lambda


    Pradhan,Anirudh; Jotania, Kanti; Singh, Archana


    A cylindrically symmetric inhomogeneous magnetized string cosmological model is investigated with the cosmological term Λ varying with time. To obtain a deterministic solution, it is assumed that the expansion (θ) in the model is proportional to the eigenvalue σ¹₁ of the shear tensor σⁱⱼ. The value of the cosmological constant for the model is found to be small and positive, which is supported by results from recent supernovae Ia observations. The physical and geometric prop...

  14. A Regional Climate Model Evaluation System Project (United States)

    National Aeronautics and Space Administration — Develop a packaged data management infrastructure for the comparison of generated climate model output to existing observational datasets that includes capabilities...

  15. Evaluation of stochastic reservoir operation optimization models (United States)

    Celeste, Alcigeimes B.; Billib, Max


    This paper investigates the performance of seven stochastic models used to define optimal reservoir operating policies. The models are based on implicit (ISO) and explicit stochastic optimization (ESO) as well as on the parameterization-simulation-optimization (PSO) approach. The ISO models include multiple regression, two-dimensional surface modeling and a neuro-fuzzy strategy. The ESO model is the well-known and widely used stochastic dynamic programming (SDP) technique. The PSO models comprise a variant of the standard operating policy (SOP), reservoir zoning, and a two-dimensional hedging rule. The models are applied to the operation of a single reservoir damming an intermittent river in northeastern Brazil. The standard operating policy is also included in the comparison and operational results provided by deterministic optimization based on perfect forecasts are used as a benchmark. In general, the ISO and PSO models performed better than SDP and the SOP. In addition, the proposed ISO-based surface modeling procedure and the PSO-based two-dimensional hedging rule showed superior overall performance as compared with the neuro-fuzzy approach.
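    The standard operating policy (SOP) used as a baseline in the comparison above can be sketched as a simple release rule; the variable names and numbers here are ours, not the paper's.

```python
def standard_operating_policy(storage, inflow, demand, capacity):
    """Standard operating policy (SOP): release to meet demand whenever
    water is available, and spill only when storage would exceed capacity."""
    available = storage + inflow
    release = min(demand, available)
    spill = max(0.0, available - release - capacity)
    new_storage = available - release - spill
    return release, spill, new_storage

# No stress: demand fully met, no spill.
print(standard_operating_policy(storage=50, inflow=10, demand=20, capacity=100))
# Wet period: reservoir fills and the surplus is spilled.
print(standard_operating_policy(storage=95, inflow=30, demand=20, capacity=100))
```

Hedging rules, by contrast, deliberately release less than demand when storage is low, trading small frequent deficits for fewer severe ones.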

  16. Making long-term economic growth more sustainable. Evaluating the costs and benefits

    Energy Technology Data Exchange (ETDEWEB)

    Islam, Sardar M.N.; Clarke, Matthew [Sustainable Economic Growth Program, Centre for Strategic Economic Studies, City Campus, Victoria University, PO Box 14428, Melbourne, Vic. (Australia); Munasinghe, Mohan [Munasinghe Institute for Development (MIND), Colombo (Sri Lanka)


    Currently, traditional development issues such as economic stagnation, poverty, hunger, and illness as well as newer challenges like environmental degradation and globalisation demand attention. Sustainable development, including its economic, environmental and social elements, is a key goal of decision-makers. Optimal economic growth has also been a crucial goal of both development theorists and practitioners. This paper examines the conditions under which optimal growth might be sustainable, by assessing the costs and benefits of growth. Key environmental and social aspects are considered. The Ecol-Opt-Growth-1 model analyses economic-ecological interactions, including resource depletion, pollution, irreversibility, other environmental effects, and uncertainty. It addresses some important issues, including savings, investment, technical progress, substitutability of productive factors, intergenerational efficiency, equity, and policies to make economic growth more sustainable, a basic element of the sustainomics framework. The empirical results support growing concerns that the costs of growth may outweigh its benefits, resulting in unsustainability. Basically, in a wide range of circumstances, long-term economic growth is unsustainable due to increasing environmental damage. Nevertheless, the model has many options that can be explored by policy makers to make the development path more sustainable, as advocated by sustainomics. One example suggests that government-supported abatement programs are needed to move towards sustainable development, since the model runs without abatement were infeasible. The optimal rate of abatement increases over time. Abatement of pollution is necessary to improve ecosystem viability and increase sustainability. Further research is necessary to seek conditions under which alternative economic growth paths are likely to become sustainable.

  17. Modeling hourly electricity dynamics for policy making in long-term scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Pina, Andre, E-mail: [Center for Innovation, Technology and Policy Research - IN, Instituto Superior Tecnico, Technical University of Lisbon, Av. Rovisco Pais 1, 1049-001 Lisbon (Portugal); MIT-Portugal Program, Sustainable Energy Systems Focus Area (Portugal); Silva, Carlos [IDMEC, Instituto Superior Tecnico, Technical University of Lisbon (Portugal); MIT-Portugal Program, Sustainable Energy Systems Focus Area (Portugal); Ferrao, Paulo [Center for Innovation, Technology and Policy Research - IN, Instituto Superior Tecnico, Technical University of Lisbon, Av. Rovisco Pais 1, 1049-001 Lisbon (Portugal); MIT-Portugal Program, Sustainable Energy Systems Focus Area (Portugal)


    Energy policies are often related to the global effort to reduce greenhouse gas emissions through increased use of renewable energies in electricity production. The impact of these policies is usually calculated with energy planning tools. However, the modeling methodologies most commonly used are not adequate for simulating long-term scenarios while considering the hourly dynamics of supply and demand. This paper presents an extension of the TIMES energy planning tool for investment decisions in electricity production that considers seasonal, daily and hourly supply and demand dynamics. The inclusion of these dynamics enables the model to produce more accurate results regarding the impact of introducing energy efficiency policies and the increased use of renewable energies. The model was validated in Sao Miguel (Azores, Portugal) for the years 2006-2009, where a comparison with real data showed that the model can simulate the supply and demand dynamics. Further, the long-term analysis shows that the inclusion of these dynamics contributes to a better assessment of the renewable energy potential, suggests the postponement of investments in new generation capacity, and demonstrates that fine time resolution modeling is very valuable for the design of effective policy measures in energy systems with high renewable penetration. - Highlights: > We develop a high temporal resolution TIMES model for long-term policy analysis. > The model is capable of considering hourly electricity supply and demand dynamics. > Lower resolution models can overestimate the optimum amount of renewable energies. > Modeling hourly dynamics of policies can help avoid non cost-effective investments.

  18. Evaluating the adequacy of climate change information to support long-term water resource planning (United States)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Pruitt, T.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Arnold, J.; Raff, D. A.; Rajagopalan, B.


    The National Center for Atmospheric Research (NCAR), the Department of Interior's Bureau of Reclamation (Reclamation) and the U.S. Army Corps of Engineers (USACE) are partnering to understand appropriate applications of downscaling methods and hydrologic analysis used to produce projections of hydroclimate impacts used in long-term water resources planning and management. The overall objectives of this project are to determine the extent to which the portrayal of hydroclimate impacts depends on methodological choices, understand why different methods produce different results, and provide guidance on the suitability of different methods to provide state-of-the-art intelligence for water resources planning and management. Research questions include: (1) How does the portrayal of hydrologic impacts under climate change depend on the chosen downscaling method and resolution (i.e. dynamical downscaling using regional climate models versus non-dynamical downscaling using statistical or empirical methods)? (2) How does the portrayal of hydrologic impacts under climate change depend on the choice/configuration of hydrologic model(s) used for impact assessment and the parameter estimation strategy? This presentation provides a synthesis of methods and key findings. Main results are (i) high-resolution dynamic downscaling using a model such as the Weather Research and Forecasting model (WRF) is required to properly capture precipitation processes over complex terrain in the Colorado Headwaters region, and climate change scenarios from the 4-km WRF simulations are very different from current guidance being provided to water managers; (ii) WRF simulations at 12-km and 36-km have poor correspondence to observations, and very different change signals to the 4-km WRF simulations; (iii) the statistical downscaling methods examined struggle to adequately capture daily precipitation characteristics that are important to hydrologic impacts, such as wet-day frequency and spatial

  19. Decision-analytical modelling in health-care economic evaluations. (United States)

    Sun, Xin; Faunce, Thomas


    Decision-analytical modelling is widely used in health-care economic evaluations, especially where evaluators lack clinical trial data and where such evaluations factor into reimbursement pricing decisions. This paper aims to improve the understanding and use of modelling techniques in this context, with particular emphasis on Markov modelling. We provide an overview of the principles and methodological details of decision-analytical modelling and propose a common workflow that accommodates any type of decision-analytical modelling technique. Using the treatment of chronic hepatitis B as an example, we illustrate the development, presentation and analysis of a Markov model, and discuss the strengths, weaknesses and pitfalls of different approaches. Good modelling practice requires careful planning, conduct and analysis of the model, and needs input from both modellers and users.
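    A minimal Markov cohort sketch of the kind discussed above; the three states and transition probabilities are hypothetical, not taken from the chronic hepatitis B example.

```python
# Minimal Markov cohort model sketch (states and annual transition
# probabilities are hypothetical, not the paper's hepatitis B model).
states = ["well", "ill", "dead"]
P = [
    [0.90, 0.08, 0.02],  # from well
    [0.00, 0.85, 0.15],  # from ill
    [0.00, 0.00, 1.00],  # dead is absorbing
]

def step(cohort, P):
    """Advance the cohort distribution one cycle: c' = c @ P."""
    return [sum(cohort[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P))]

cohort = [1.0, 0.0, 0.0]        # everyone starts well
for _ in range(10):             # ten one-year cycles
    cohort = step(cohort, P)
print([round(x, 3) for x in cohort])
```

An economic evaluation would attach a cost and a utility weight to each state and sum them, discounted, over the cycles to obtain cost per QALY.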

  20. Long-term monitored catchments in Norway - a hydrologic and chemical evaluation -

    Energy Technology Data Exchange (ETDEWEB)

    Lydersen, E.


    About 20 years ago, long-term monitoring of small Norwegian catchments was initiated because of increasing concern over acidification of surface water and damage to fish populations. Long-range transported air pollutants were considered the major acidification factor, so both precipitation and runoff chemistry were included in the monitoring programme. This report contains a thorough hydrologic and chemical evaluation of precipitation and runoff water separately, as well as relationships between precipitation chemistry and runoff chemistry. The data come from four catchments: Birkenes, Storgama, Langtjern and Kaarvatn. The chapters are (1) Sampling and analysis, (2) Description of the catchments, (3) Hydrology, (4) Chemistry, with subsections on wet deposition, dry deposition, concentration of marine compounds with distance from the sea, acid precipitation, runoff chemistry, sulphuric acid and other acidifying compounds, acid neutralizing capacity, and aluminium, (5) Time trends in precipitation and runoff chemistry. The time trends are evaluated in relation to the declining emissions of sulphur compounds in Europe since the late seventies. 134 refs., 213 figs., 54 tabs.

  1. Using an Ecosystem Model to Evaluate Fisheries Management ...

    African Journals Online (AJOL)

    Town, Marine Research Institute, Rondebosch 7701, South Africa. Keywords: Fisheries management, ecosystem modelling, artisanal fisheries, climate change impacts, trophodynamics, coral reefs. Abstract—A coral reef ecosystem simulation model, CAFFEE, developed to evaluate the effects of fisheries management ...

  2. Modeling, Simulation and Performance Evaluation of Parabolic Trough

    African Journals Online (AJOL)


    MODELING, SIMULATION AND PERFORMANCE EVALUATION OF PARABOLIC TROUGH SOLAR COLLECTOR POWER GENERATION SYSTEM. Mekuannint Mesfin and Abebayehu Assefa, Department of Mechanical Engineering, Addis Ababa University. ABSTRACT: Model of a parabolic trough power plant, taking ...

  3. iFlorida model deployment final evaluation report. (United States)


    This document is the final report for the evaluation of the USDOT-sponsored Surface Transportation Security and Reliability Information System Model Deployment, or iFlorida Model Deployment. This report discusses findings in the following areas: ITS ...


    Directory of Open Access Journals (Sweden)

    P. V. Filonov


    Full Text Available An approach to computing airport throughput based on a quantum model is considered. The quantum system is described in terms of macroparameters, and its evolution is modeled as a Markov chain. The transition probabilities between the different states of the system are given in analytical form. The dependence of airport throughput on the number of intersections of the SID/STAR trajectories is considered.

  5. Short-Term Wind Power Interval Forecasting Based on an EEMD-RT-RVM Model


    Haixiang Zang; Lei Fan; Mian Guo; Zhinong Wei; Guoqiang Sun; Li Zhang


    Accurate short-term wind power forecasting is important for improving the security and economic success of power grids. Existing wind power forecasting methods are mostly types of deterministic point forecasting. Deterministic point forecasting is vulnerable to forecasting errors and cannot effectively deal with the random nature of wind power. In order to solve the above problems, we propose a short-term wind power interval forecasting model based on ensemble empirical mode decomposition (EE...

  6. Evaluating Econometric Models and Expert Intuition

    NARCIS (Netherlands)

    R. Legerstee (Rianne)


    This thesis is about forecasting situations which involve econometric models and expert intuition. The first three chapters are about what it is that experts do when they adjust statistical model forecasts and what might improve that adjustment behavior. It is investigated how expert

  7. Modeling, simulation and performance evaluation of parabolic ...

    African Journals Online (AJOL)

    Model of a parabolic trough power plant, taking into consideration the different losses associated with collection of the solar irradiance and thermal losses is presented. MATLAB software is employed to model the power plant at reference state points. The code is then used to find the different reference values which are ...

  8. Educational game models: conceptualization and evaluation ...

    African Journals Online (AJOL)

    South African Journal of Higher Education ... The relationship between educational theories, game design and game development are used to develop models for the creation of complex learning environments. ... These models were developed to better understand the relationships between story, play and learning.

  9. Evaluation of EOR Processes Using Network Models

    DEFF Research Database (Denmark)

    Larsen, Jens Kjell; Krogsbøll, Anette


    The report consists of the following parts: 1) Studies of wetting properties of model fluids and fluid mixtures aimed at an optimal selection of candidates for micromodel experiments. 2) Experimental studies of multiphase transport properties using physical models of porous networks (micromodels...

  10. Linear model applied to the evaluation of pharmaceutical stability data

    Directory of Open Access Journals (Sweden)

    Renato Cesar Souza


    Full Text Available The expiry date on the packaging of a product gives the consumer confidence that the product will retain its identity, content, quality and purity throughout its period of validity. In the pharmaceutical industry, this period is defined from stability data obtained during product registration. Accordingly, this work applies linear regression, following guideline ICH Q1E (2003), to evaluate aspects of a product in the registration phase in Brazil. The evaluation was carried out with the development center of a multinational company in Brazil, using samples of three different batches containing two active pharmaceutical ingredients in two different packages. The preliminary results showed a difference in the degradation trend of the product between the two packages, and clarified the relationship between the variables studied, so that new linear models can be applied and developed for other products.
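    In the spirit of ICH Q1E, shelf life can be estimated by regressing assay against time and finding where the regression crosses the acceptance limit. The sketch below uses hypothetical data and a deliberate simplification: Q1E properly uses the 95% confidence bound of the mean response, whereas here only the fitted line is intersected with the limit.

```python
# Illustrative shelf-life estimate: fit assay (%) versus time, then find
# when the fitted line reaches the acceptance limit. Data are hypothetical;
# ICH Q1E uses the 95% confidence bound rather than the fitted line alone.
months = [0, 3, 6, 9, 12, 18]
assay = [100.0, 99.4, 98.9, 98.1, 97.6, 96.3]

n = len(months)
mx = sum(months) / n
my = sum(assay) / n
slope = sum((x - mx) * (y - my) for x, y in zip(months, assay)) / \
        sum((x - mx) ** 2 for x in months)
intercept = my - slope * mx

limit = 95.0  # acceptance criterion (hypothetical)
shelf_life = (limit - intercept) / slope  # months at which the fit hits the limit
print(round(slope, 3), round(shelf_life, 1))
```

Using the confidence bound instead of the fitted line shortens the estimated shelf life, which is why the guideline requires it.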

  11. Robustness-based evaluation of long-term river basin planning under climate change (United States)

    Taner, M. U.; Ray, P. A.; Brown, C. M.


    This work develops a bottom-up, multi-stage planning framework for the sustainable development of river basin systems under deep climate uncertainty. The research focuses on whether and when it is desirable to invest in costly water infrastructure projects, and how to select among a set of project alternatives in order to achieve the desired economic benefits with a relatively low level of risk. The proposed framework begins with identifying a set of climate conditions to represent the future vulnerability domain of the system using simulation analysis. The conditions identified in the simulation analysis are then used to develop a scenario-tree, to represent the manner in which the uncertainties may evolve over the course of the planning period. Next, optimal decisions are repeatedly explored through a multi-stage optimization model, by varying the probability weights employed in the scenario-tree. The resulting vector of optimal decisions is post-processed to identify robust choices that are least sensitive to the scenario probabilities. The proposed planning framework is illustrated for the Niger Basin, over a 45-year planning period from 2015 to 2060. The Niger Basin is a transboundary system facing a series of challenges including endemic poverty, inadequate infrastructure and weak adaptive capacity to climate variability and change. The case study assesses long-term economic benefits from four new dam projects, and from a range of expansions across the eleven irrigation zones. The climate scenarios are obtained by first generating new climate variability realizations from a stochastic weather generator, and then placing climate change factors on the generated climate realizations. Basin runoff response to climate scenarios is simulated by a series of monthly, two-compartment water balance models. Long-term economic benefits are estimated from the sectors of irrigated agriculture, hydropower, navigation, fishing, and environmental protection, using a mixed

  12. Regression Model Term Selection for the Analysis of Strain-Gage Balance Calibration Data (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.


    The paper discusses the selection of regression model terms for the analysis of wind tunnel strain-gage balance calibration data. Different function class combinations are presented that may be used to analyze calibration data using either a non-iterative or an iterative method. The role of the intercept term in a regression model of calibration data is reviewed. In addition, useful algorithms and metrics originating from linear algebra and statistics are recommended that will help an analyst (i) to identify and avoid both linear and near-linear dependencies between regression model terms and (ii) to make sure that the selected regression model of the calibration data uses only statistically significant terms. Three different tests are suggested that may be used to objectively assess the predictive capability of the final regression model of the calibration data. These tests use both the original data points and regression model independent confirmation points. Finally, data from a simplified manual calibration of the Ames MK40 balance is used to illustrate the application of some of the metrics and tests to a realistic calibration data set.
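The near-linear-dependency check mentioned above can be illustrated with a condition-number test on a scaled regressor matrix. The design matrix below is fabricated (the actual MK40 load schedule is not given); the point is only that two nearly collinear load columns inflate the condition number.

```python
import numpy as np

# Hypothetical regressors for a balance calibration fit: an intercept column,
# two load components, and their product term. N2 is made nearly collinear
# with N1 on purpose; none of this reproduces the real MK40 load schedule.
rng = np.random.default_rng(0)
N1 = rng.uniform(-1, 1, 50)                        # normal-force load
N2 = 0.999 * N1 + 1e-3 * rng.standard_normal(50)   # nearly collinear load
X = np.column_stack([np.ones(50), N1, N2, N1 * N2])

# Scale each column to unit norm, then compute the condition number; a large
# value flags a near-linear dependency between regression model terms.
Xs = X / np.linalg.norm(X, axis=0)
cond = np.linalg.cond(Xs)
print(cond > 100)
```

A common rule of thumb treats condition numbers far above roughly 30-100 on a scaled matrix as a collinearity warning; the exact threshold is a judgment call.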

  13. A reusable simulation model to evaluate the effects of walk-in for diagnostic examinations

    NARCIS (Netherlands)

    Braaksma, Aleida; Kortbeek, Nikky; Smid, Kees; Sprengers, Marieke E.S.


    Enabling patients to walk in for their diagnostic examination without an appointment has considerable potential in terms of quality of care, patient service, and system efficiency. We present a model to evaluate the effect of implementing a combined walk-in and appointment system, offering

  14. Evaluating Energy Efficiency Policies with Energy-Economy Models

    Energy Technology Data Exchange (ETDEWEB)

    Mundaca, Luis; Neij, Lena; Worrell, Ernst; McNeil, Michael A.


    The growing complexities of energy systems, environmental problems and technology markets are driving and testing most energy-economy models to their limits. To further advance bottom-up models from a multidisciplinary energy efficiency policy evaluation perspective, we review and critically analyse bottom-up energy-economy models and corresponding evaluation studies on energy efficiency policies to induce technological change. We use the household sector as a case study. Our analysis focuses on decision frameworks for technology choice, type of evaluation being carried out, treatment of market and behavioural failures, evaluated policy instruments, and key determinants used to mimic policy instruments. Although the review confirms criticism related to energy-economy models (e.g. unrealistic representation of decision-making by consumers when choosing technologies), these models nevertheless provide valuable guidance for policy evaluation related to energy efficiency. Several areas for further advancing the models remain open, particularly modelling issues, techno-economic and environmental aspects, behavioural determinants, and policy considerations.

  15. Evaluation of Digital Model Accuracy and Time‑dependent ...

    African Journals Online (AJOL)

    Time‑dependent deformation of the alginate impressions and the accuracy of the conventional plaster models and digital models were evaluated separately. Results: Plaster models, negative and positive digital models showed significant differences in nearly all measurements at T (0), T (1), and T (2) times (P < 0.01, ...

  16. Metrics for evaluating performance and uncertainty of Bayesian network models (United States)

    Bruce G. Marcot


    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...
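The entropy-reduction metric named above is the mutual information between a target node and a finding node. A minimal sketch over a hypothetical 2x2 joint probability table (no real Bayesian network is built here):

```python
import numpy as np

# Entropy reduction of a binary target given a binary finding, computed from
# a hypothetical joint probability table P(target, finding). The numbers are
# invented; in practice the joint would come from the fitted network.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

def H(p):
    """Shannon entropy in bits, ignoring zero cells."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

p_target, p_finding = joint.sum(axis=1), joint.sum(axis=0)
mi = H(p_target) + H(p_finding) - H(joint.ravel())   # mutual information
print(round(mi, 3))
```

A value of 0 would mean the finding tells us nothing about the target; the maximum here is 1 bit.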

  17. Statistical Models for Tornado Climatology: Long and Short-Term Views

    CERN Document Server

    Elsner, James B; Fricker, Tyler


    This paper estimates local tornado risk from records of past events using statistical models. First, a spatial model is fit to the tornado counts aggregated in counties with terms that control for changes in observational practices over time. Results provide a long-term view of risk that delineates the main tornado corridors in the United States where the expected annual rate exceeds two tornadoes per 10,000 square km. A few counties in the Texas Panhandle and central Kansas have annual rates that exceed four tornadoes per 10,000 square km. Refitting the model after removing the least damaging tornadoes from the data (EF0) produces a similar map but with the greatest tornado risk shifted south and eastward. Second, a space-time model is fit to the counts aggregated in raster cells with terms that control for changes in climate factors. Results provide a short-term view of risk. The short-term view identifies the shift of tornado activity away from the Ohio Valley under El Niño conditions and away from the S...
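The rate units used above (tornadoes per 10,000 square km per year) reduce to simple arithmetic once counts, record length and area are fixed. A toy county record, with every number invented:

```python
# Hypothetical county record; real counts would come from the historical
# tornado database the paper fits its spatial model to.
counts = 58          # tornadoes observed in the county over the record
years = 66           # length of the record, years
area_km2 = 2400      # county area, square km

# Expected annual rate, expressed per 10,000 square km as in the abstract.
rate_per_10k = counts / years / area_km2 * 1e4
print(round(rate_per_10k, 2))
```

This invented county lands between the two- and four-tornado contours mentioned in the abstract; the paper's models smooth such raw rates spatially rather than reporting them per county.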

  18. Short-term earthquake forecasting based on an epidemic clustering model (United States)

    Console, Rodolfo; Murru, Maura; Falcone, Giuseppe


    The application of rigorous statistical tools to verify any prediction method requires a univocal definition of the hypothesis, or the model, characterizing the anomaly or precursor concerned, so that it can be objectively recognized in any circumstance and by any observer. This is mandatory to move beyond the old-fashioned approach consisting only of retrospective anecdotal study of past cases. A rigorous definition of an earthquake forecasting hypothesis should lead to the objective identification of particular sub-volumes (usually named alarm volumes) of the total time-space volume within which the probability of occurrence of strong earthquakes is higher than usual. Testing such a hypothesis requires the observation of a sufficient number of past cases on which a statistical analysis is possible. This analysis should determine the rate at which the precursor has been followed (success rate) or not followed (false alarm rate) by the target seismic event, and the rate at which a target event has been preceded (alarm rate) or not preceded (failure rate) by the precursor. The binary table obtained from this kind of analysis leads to the definition of the parameters of the model that achieve the maximum number of successes and the minimum number of false alarms for a specific class of precursors. The mathematical tools suitable for this purpose include the Probability Gain and the R-Score, as well as popular plots such as the Molchan error diagram and the ROC diagram. Another tool for evaluating the validity of a forecasting method is the likelihood ratio (also named performance factor) of occurrence and non-occurrence of seismic events under different hypotheses. Whatever method is chosen for building up a new hypothesis, usually based on retrospective data, the final assessment of its validity should be carried out by a test on a new and independent set of observations

  19. Evaluation of consumer satisfaction using the tetra-class model. (United States)

    Clerfeuille, Fabrice; Poubanne, Yannick; Vakrilova, Milena; Petrova, Guenka


    A number of studies have shown the importance of consumers' satisfaction toward pharmacy services. The measurement of patient satisfaction through different elements of services provided is challenging within the context of a dynamic economic environment. Patient satisfaction is the result of long-term established habits and expectations to the pharmacy as an institution. Few studies to date have attempted to discern whether these changes have led to increased patient satisfaction and loyalty, particularly within developing nations. The objective of this study was to evaluate the elements of the services provided in Bulgarian pharmacies and their contribution to consumer satisfaction using a tetra-class model. Three main hypotheses were tested in pharmacies to validate the model in the case of complex services. Additionally, the contribution of the different service elements to the clients' satisfaction was studied. The analysis was based on a survey of customers in central and district pharmacies in Sofia, Bulgaria. The data were analyzed through a correspondence analysis which was applied to the results of the 752 distributed questionnaires. It was observed that different dimensions of the pharmacies contribute uniquely to customer satisfaction, with consumer gender contributing greatly toward satisfaction, with type/location of pharmacy, consumer age, and educational degree also playing a part. The duration of time over which the consumers have been clients at a given pharmacy influences the subsequent service categorization. This research demonstrated that the tetra-class model is suitable for application in the pharmaceutical sector. The model results could be beneficial for both researchers and pharmacy managers.


    AERMOD is an advanced plume model that incorporates updated treatments of boundary-layer theory and of turbulence and dispersion, and includes handling of terrain interactions. This paper presents an overview of AERMOD's features relative to ISCST3. AERM...

  1. Hydrologic Evaluation of Landfill Performance (HELP) Model (United States)

    The program models rainfall, runoff, infiltration, and other water pathways to estimate how much water builds up above each landfill liner. It can incorporate data on vegetation, soil types, geosynthetic materials, initial moisture conditions, slopes, etc.
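The water pathways HELP tracks can be caricatured as a single-bucket daily balance. This is only a sketch of the idea, not the actual HELP algorithms, and every number is hypothetical:

```python
# Toy daily water balance above a landfill liner, in the spirit of HELP
# (not its actual algorithms). All quantities in mm and all values invented.
rain = 12.0          # daily rainfall
runoff_coef = 0.25   # fraction of rainfall lost to surface runoff
et = 3.0             # daily evapotranspiration
capacity = 40.0      # storage capacity of the soil layer above the liner
storage = 35.0       # current soil moisture storage

runoff = runoff_coef * rain
infiltration = rain - runoff
storage += infiltration - et
percolation = max(0.0, storage - capacity)   # water reaching the liner
storage = min(storage, capacity)
print(runoff, round(percolation, 1))
```

HELP itself layers many such balances (vegetation, multiple soils, geosynthetics) and drives them with weather records, but each layer's bookkeeping follows this in-minus-out pattern.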

  2. Model Energy Efficiency Program Impact Evaluation Guide (United States)

    This document provides guidance on model approaches for calculating energy, demand, and emissions savings resulting from energy efficiency programs. It describes several standard approaches that can be used in order to make these programs more efficient.

  3. An Evaluation of Software Cost Estimating Models. (United States)


    and Model Outputs - Wolverton 5-15. Table 7 is a summary comparison of the model outputs with the needs described in Section 3. A liberal interpretation...Computer Program Development Costs, Tecolote Research, Inc., TM-7, Dec. 1974.

  4. Calibration of short rate term structure models from bid-ask coupon bond prices (United States)

    Gomes-Gonçalves, Erika; Gzyl, Henryk; Mayoral, Silvia


    In this work we use the method of maximum entropy in the mean to provide a model-free, non-parametric methodology that uses only market data to obtain the prices of the zero coupon bonds, and then a term structure of the short rates. The data used consist of the bid-ask price ranges of a few coupon bonds quoted in the market. The prices of the zero coupon bonds obtained in the first stage are then used as input to solve a recursive set of equations to determine a recombinant binomial model of the term structure of short rates.
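Once the first stage recovers zero-coupon prices P(0, t), the implied one-period rates that seed a binomial short-rate model follow from consecutive price ratios. The prices below are hypothetical, not market data:

```python
# Hypothetical zero-coupon bond prices P(0, t) for t = 0..3 years, as would
# be produced by the maximum-entropy first stage; not actual market output.
P = [1.0, 0.97, 0.93, 0.885]

# One-period forward (short) rates implied by consecutive price ratios:
# 1 + f(t, t+1) = P(0, t) / P(0, t+1).
forwards = [P[i] / P[i + 1] - 1 for i in range(len(P) - 1)]
print([round(f, 4) for f in forwards])
```

These forward rates form the central path around which a recombinant binomial tree of short rates would then be calibrated.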

  5. Research on Short-Term Wind Power Prediction Based on Combined Forecasting Models

    Directory of Open Access Journals (Sweden)

    Zhang Chi


    Full Text Available Short-term wind power forecasting is crucial for the power grid, since the energy generated by a wind farm fluctuates frequently. In this paper, a physical forecasting model based on NWP and a statistical forecasting model using a BP neural network with optimized initial values are presented. To exploit the advantages of each model and overcome its individual limitations, an equal-weight model and a minimum-variance model are established for combined wind power prediction. Simulation results show that the combined forecasting model is more precise than either single model, and that the minimum-variance combination can dynamically adjust the weight of each method, further reducing the forecasting error.
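The two combination schemes named above (equal weight and minimum variance) are easy to sketch. Under the usual assumption of independent forecast errors, the minimum-variance weights are inversely proportional to each model's error variance; the error series below are invented, and no NWP or BP-network model is actually run:

```python
import numpy as np

# Invented forecast-error series for the two single models (MW); stand-ins
# for the physical (NWP) and statistical (BP network) models of the paper.
err_phys = np.array([2.0, -1.5, 3.0, -2.5, 1.0])
err_stat = np.array([1.0, 0.5, -1.0, 1.5, -0.5])

v1, v2 = err_phys.var(), err_stat.var()

w_equal = np.array([0.5, 0.5])                 # equal-weight combination

# Minimum-variance combination: weight each model inversely to its error
# variance (assuming independent errors).
w_minvar = np.array([v2, v1]) / (v1 + v2)
combined_var = (w_minvar**2 * np.array([v1, v2])).sum()

print(w_minvar, round(float(combined_var), 3))
```

For independent errors the combined variance equals v1*v2/(v1+v2), which never exceeds the smaller single-model variance, illustrating why the minimum-variance scheme outperforms either model alone.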

  6. [Decision modeling for economic evaluation of health technologies]. (United States)

    de Soárez, Patrícia Coelho; Soares, Marta Oliveira; Novaes, Hillegonda Maria Dutilh


    Most economic evaluations that inform decision-making on the incorporation and financing of health technologies use decision models to assess the costs and benefits of the compared strategies. Despite the large number of economic evaluations conducted in Brazil, there is a pressing need to conduct an in-depth methodological study of the types of decision models and their applicability in our setting. The objective of this literature review is to contribute to the knowledge and use of decision models in the national context of economic evaluations of health technologies. This article presents general definitions about models and concerns with their use; it describes the main models: decision trees, Markov chains, micro-simulation, and discrete-event and dynamic simulation; it discusses the elements involved in the choice of model; and it exemplifies the models addressed in national economic evaluation studies of diagnostic and therapeutic preventive technologies and health programs.

  7. Long-term metabolic correction of Wilson's disease in a murine model by gene therapy. (United States)

    Murillo, Oihana; Luqui, Daniel Moreno; Gazquez, Cristina; Martinez-Espartosa, Debora; Navarro-Blasco, Iñigo; Monreal, Jose Ignacio; Guembe, Laura; Moreno-Cermeño, Armando; Corrales, Fernando J; Prieto, Jesus; Hernandez-Alcoceba, Ruben; Gonzalez-Aseguinolaza, Gloria


    Wilson's disease (WD) is an autosomal recessively inherited copper storage disorder due to mutations in the ATP7B gene that causes hepatic and neurologic symptoms. Current treatments are based on lifelong copper chelating drugs and zinc salts, which may cause side effects and do not restore normal copper metabolism. In this work we assessed the efficacy of gene therapy to treat this condition. We transduced the liver of the Atp7b(-/-) WD mouse model with an adeno-associated vector serotype 8 (AAV8) encoding the human ATP7B cDNA placed under the control of the liver-specific α1-antitrypsin promoter (AAV8-AAT-ATP7B). After vector administration we carried out periodic evaluation of parameters associated with copper metabolism and disease progression. The animals were sacrificed 6 months after treatment to analyze copper storage and hepatic histology. We observed a dose-dependent therapeutic effect of AAV8-AAT-ATP7B manifested by the reduction of serum transaminases and urinary copper excretion, normalization of serum holoceruloplasmin, and restoration of physiological biliary copper excretion in response to copper overload. The liver of treated animals showed normalization of copper content and absence of histological alterations. Our data demonstrate that AAV8-AAT-ATP7B-mediated gene therapy provides long-term correction of copper metabolism in a clinically relevant animal model of WD providing support for future translational studies. Copyright © 2015 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  8. Modeling the Near-Term Risk of Climate Uncertainty: Interdependencies among the U.S. States (United States)

    Lowry, T. S.; Backus, G.; Warren, D.


    Decisions made to address climate change must start with an understanding of the risk of an uncertain future to human systems, which in turn means understanding both the consequence as well as the probability of a climate induced impact occurring. In other words, addressing climate change is an exercise in risk-informed policy making, which implies that there is no single correct answer or even a way to be certain about a single answer; the uncertainty in future climate conditions will always be present and must be taken as a working-condition for decision making. In order to better understand the implications of uncertainty on risk and to provide a near-term rationale for policy interventions, this study estimates the impacts from responses to climate change on U.S. state- and national-level economic activity by employing a risk-assessment methodology for evaluating uncertain future climatic conditions. Using the results from the Intergovernmental Panel on Climate Change’s (IPCC) Fourth Assessment Report (AR4) as a proxy for climate uncertainty, changes in hydrology over the next 40 years were mapped and then modeled to determine the physical consequences on economic activity and to perform a detailed 70-industry analysis of the economic impacts among the interacting lower-48 states. The analysis determines industry-level effects, employment impacts at the state level, interstate population migration, consequences to personal income, and ramifications for the U.S. trade balance. The conclusions show that the average risk of damage to the U.S. economy from climate change is on the order of $1 trillion over the next 40 years, with losses in employment equivalent to nearly 7 million full-time jobs. Further analysis shows that an increase in uncertainty raises this risk. This paper will present the methodology behind the approach, a summary of the underlying models, as well as the path forward for improving the approach.

  9. Evaluation model development for sprinkler irrigation uniformity ...

    African Journals Online (AJOL)

    A new evaluation method with accompanying software was developed to precisely calculate uniformity from catch-can test data, assuming sprinkler distribution data to be a continuous variable. Two interpolation steps are required to compute unknown water application depths at grid distribution points from radial ...

  10. Evaluating the soil physical quality under long-term field experiments in Southern Italy (United States)

    Castellini, Mirko; Stellacci, Anna Maria; Iovino, Massimo; Rinaldi, Michele; Ventrella, Domenico


    Long-term field experiments performed on experimental farms are important research tools for assessing soil physical quality (SPQ), given that relatively stable conditions can be expected in these soils. However, different SPQ indicators may sometimes provide redundant or conflicting results, making an SPQ evaluation difficult (Castellini et al., 2014). As a consequence, it is necessary to apply appropriate statistical procedures to obtain a minimum set of key indicators. The study was carried out at the Experimental Farm of CREA-SCA (Foggia) in two long-term field experiments performed on durum wheat. The first long-term experiment aims at evaluating the effects of two residue management systems (burning, B, or soil incorporation of crop residues, I), while the second compares the effects of tillage (conventional tillage, CT) and sod-seeding (direct drilling, DD). In order to take into account both optimal and non-optimal soil conditions, five SPQ indicators were monitored at 5-6 sampling dates during the crop season (i.e., between November and June): soil bulk density (BD), macroporosity (PMAC), air capacity (AC), plant available water capacity (PAWC) and relative field capacity (RFC). Two additional data sets, collected on the DD plot in different cropping seasons and in Sicilian soils differing in texture, depth and land use (N=140), were also used to check the correlation among indicators. The impact of soil management was assessed by comparing SPQ evaluated under different management systems with optimal reference values reported in the literature. Two techniques of multivariate analysis (principal component analysis, PCA, and stepwise discriminant analysis, SDA) were applied to select the most suitable indicators to facilitate the judgment on SPQ. Regardless of the considered management system, sampling date or auxiliary data set, correlation matrices always showed significant negative relationships between RFC and AC. 
Decreasing RFC at increasing AC is

  11. Evaluation of Gaussian approximations for data assimilation in reservoir models

    KAUST Repository

    Iglesias, Marco A.


    The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. The performance of those Gaussian approximations is typically evaluated by assessing their ability to reproduce the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the-art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement. 
Our

  12. Benchmarking the New RESRAD-OFFSITE Source Term Model with DUST-MS and GoldSim - 13377

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, J.J.; Kamboj, S.; Gnanapragasam, E.; Yu, C. [Argonne National Laboratory, Argonne, IL 60439 (United States)


    RESRAD-OFFSITE is a computer code developed by Argonne National Laboratory under the sponsorship of the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC). It is designed on the basis of the RESRAD (onsite) code, a computer code designated by DOE and NRC for evaluating soil-contaminated sites for compliance with human health protection requirements pertaining to license termination or environmental remediation. RESRAD-OFFSITE has enhanced capabilities for modeling radionuclide transport to offsite locations and calculating potential radiation exposure to offsite receptors. Recently, a new source term model was incorporated into RESRAD-OFFSITE to enhance its capability further. This new source term model allows simulation of radionuclide releases from different waste forms, in addition to the soil sources originally considered in the RESRAD (onsite) and RESRAD-OFFSITE codes. With this new source term model, a variety of applications can be achieved using RESRAD-OFFSITE, including, but not limited to, assessing the performance of radioactive waste disposal facilities. This paper presents the comparison of radionuclide release rates calculated by the new source term model of RESRAD-OFFSITE versus those calculated by DUST-MS and GoldSim. The focus of comparison is on the release rates of radionuclides from the bottom of the contaminated zone that was assumed to contain radioactive source materials buried in soil. The transport of released contaminants outside of the primary contaminated zone is beyond the scope of this paper. Overall, the agreement between the RESRAD-OFFSITE results and the DUST-MS and GoldSim results is fairly good, with all three codes predicting identical or similar radionuclide release profiles over time. Numerical dispersion in the DUST-MS and GoldSim results was identified as potentially contributing to the disagreement in the release rates. In general, greater discrepancy in the release rates was found for short

  13. Disturbance frequency and vertical distribution of seeds affect long-term population dynamics: a mechanistic seed bank model. (United States)

    Eager, Eric Alan; Haridas, Chirakkal V; Pilson, Diana; Rebarber, Richard; Tenhumberg, Brigitte


    Seed banks are critically important for disturbance specialist plants because seeds of these species germinate only in disturbed soil. Disturbance and seed depth affect the survival and germination probability of seeds in the seed bank, which in turn affect population dynamics. We develop a density-dependent stochastic integral projection model to evaluate the effect of stochastic soil disturbances on plant population dynamics with an emphasis on mimicking how disturbances vertically redistribute seeds within the seed bank. We perform a simulation analysis of the effect of the frequency and mean depth of disturbances on the population's quasi-extinction probability, as well as the long-term mean and variance of the total density of seeds in the seed bank. We show that increasing the frequency of disturbances increases the long-term viability of the population, but the relationship between the mean depth of disturbance and the long-term viability of the population is not necessarily monotonic for all parameter combinations. Specifically, an increase in the probability of disturbance increases the long-term viability of the total seed bank population. However, if the probability of disturbance is too low, a shallower mean depth of disturbance can increase long-term viability, a relationship that reverses as the probability of disturbance increases. However, a shallow disturbance depth is beneficial only in scenarios with low survival in the seed bank.

  14. Tapering off benzodiazepines in long-term users: an economic evaluation. (United States)

    Oude Voshaar, Richard C; Krabbe, Paul F M; Gorgels, Wim J M J; Adang, Eddy M M; van Balkom, Anton J L M; van de Lisdonk, Eloy H; Zitman, Frans G


    Discontinuation of benzodiazepine use has never been evaluated in economic terms. This study aimed to compare the relative costs and outcomes of tapering off long-term benzodiazepine use combined with group cognitive behavioural therapy (TO+CBT), tapering off alone (TOA), and usual care. A randomised controlled trial was conducted, incorporating a cost-effectiveness analysis from a societal as well as a pharmaceutical perspective. The costs of intervention treatment, prescribed drugs, healthcare services, productivity loss, and patients' costs were measured using drug prescription data and cost diaries. Costs were indexed at 2001 prices. The principal outcome was the proportion of patients able to discontinue benzodiazepine use during the 18-month follow-up. A secondary outcome measure was quality of life (Health Utility Index Mark III [HUI-3] and the Medical Outcomes Study 36-item Short-Form Health Survey [SF-36]). A total of 180 patients were randomised to TO+CBT (n = 73), TOA (n = 73) or usual care (n = 34). Intervention treatment costs averaged €172.99 per patient for TO+CBT and €69.50 per patient for TOA. Both treatment conditions significantly reduced benzodiazepine costs during follow-up compared with usual care. The incremental cost-effectiveness ratios (ICERs) showed that, for each incremental 1% successful benzodiazepine discontinuation, TO+CBT cost €10.30-€62.53 versus usual care, depending on the study perspective. However, TO+CBT was extendedly dominated or was dominated by TOA. This resulted in ICERs of €0.57, €10.21 and €48.92 for TOA versus usual care from the limited pharmaceutical, comprehensive pharmaceutical and societal perspectives, respectively. TO+CBT and TOA both led to a reduction in benzodiazepine costs. However, it remains uncertain which healthcare utilisation has a causal relationship with long-term benzodiazepine consumption or its treatment. 
Although the ICERs indicated better cost effectiveness for

  15. Model evaluation and optimisation of nutrient removal potential for ...

    African Journals Online (AJOL)

    Performance of sequencing batch reactors for simultaneous nitrogen and phosphorus removal is evaluated by means of model simulation, using the activated sludge model, ASM2d, involving anoxic phosphorus uptake, recently proposed by the IAWQ Task group. The evaluation includes all major process configurations ...

  16. Evaluating The Impact Of Building Information Modeling (BIM) On Construction (United States)


    Evaluating the Impact of Building Information Modeling (BIM) on Construction. By Patrick C. Suermann.

  17. Rhode Island Model Evaluation & Support System: Building Administrator. Edition III (United States)

    Rhode Island Department of Education, 2015


    Rhode Island educators believe that implementing a fair, accurate, and meaningful educator evaluation and support system will help improve teaching, learning, and school leadership. The primary purpose of the Rhode Island Model Building Administrator Evaluation and Support System (Rhode Island Model) is to help all building administrators improve.…

  18. Rhode Island Model Evaluation & Support System: Support Professional. Edition II (United States)

    Rhode Island Department of Education, 2015


    Rhode Island educators believe that implementing a fair, accurate, and meaningful evaluation and support system for support professionals will help improve student outcomes. The primary purpose of the Rhode Island Model Support Professional Evaluation and Support System (Rhode Island Model) is to help all support professionals do their best work…

  19. Spectral evaluation of Earth geopotential models and an experiment ...

    Indian Academy of Sciences (India)

    gravity field related datasets have been compiled over local/regional scale (see ICGEM 2010), evaluating these models to clarify the differences among them and hence to monitor the improvements. Keywords. Earth geopotential model; spectral evaluation; terrestrial data; CHAMP and GRACE; EGM08; Remove. Compute ...

  20. Pilot evaluation in TENCompetence: a theory-driven model

    NARCIS (Netherlands)

    Schoonenboom, J.; Sligte, H.; Moghnieh, A.; Specht, M.; Glahn, C.; Stefanov, K.; Navarrete, T.; Blat, J.


    This paper describes a theory-driven evaluation model that is used in evaluating four pilots in which an infrastructure for lifelong competence development, which is currently being developed, is validated. The model makes visible the separate implementation steps that connect the envisaged

  1. Dental students' reflections about long-term care experiences through an existing model of oral health. (United States)

    Brondani, Mario; Pattanaporn, Komkham


    The aim of this study was to explore students' reflective thinking about long-term care experiences from the perspective of a model of oral health. A total of 186 reflections from 193 second-year undergraduate dental students enrolled between 2011/12 and 2014/15 at the University of British Columbia were explored qualitatively. Reflections had a word limit of 300, and students were asked to relate an existing model of oral health to their long-term care experiences. We identified the main ideas via a thematic analysis related to the geriatric dentistry experience in long-term care. The thematic analysis revealed that students attempted to demystify their pre-conceived ideas about older people and long-term care facilities, to think outside the box, for example away from a typical dental office, and to consider caring for elderly people through an interprofessional lens. According to some students, not all domains from the existing model of oral health were directly relevant to their geriatric experience, while other domains, including interprofessionalism and cognition, were missing. While some participants had a positive attitude towards caring for this cohort of the population, others did not regard this educational activity as a constructive experience. The nature of most students' reflective thinking within a long-term care experience was shown to relate to an existing model of oral health. This model can help to give meaning to the dental geriatric experience of an undergraduate curriculum. Such experience has been instrumental in overcoming potential misconceptions about long-term care and geriatric dentistry. © 2017 John Wiley & Sons A/S and The Gerodontology Association. Published by John Wiley & Sons Ltd.

  2. Optimization of global model composed of radial basis functions using the term-ranking approach

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Peng; Tao, Chao; Liu, Xiao-Jun [Key Laboratory of Modern Acoustics, Nanjing University, Nanjing 210093 (China)


    A term-ranking method is put forward to optimize a global model composed of radial basis functions and so improve the model's predictability. The effectiveness of the proposed method is examined using numerical simulation and experimental data. Numerical simulations indicate that this method can significantly lengthen the prediction time and decrease the Bayesian information criterion of the model. The application to a real voice signal shows that the optimized global model can capture more of the predictable component in chaos-like voice data and simultaneously reduce the predictable component (periodic pitch) in the residual signal.
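The Bayesian information criterion mentioned above penalizes model size against goodness of fit; a minimal sketch of the standard least-squares form (an assumption here, since the paper's exact variant is not given):

```python
import math

def bic(residuals, n_params):
    """BIC for a least-squares fit with Gaussian errors:
    n * ln(RSS / n) + k * ln(n). Lower is better."""
    n = len(residuals)
    rss = sum(r * r for r in residuals)
    return n * math.log(rss / n) + n_params * math.log(n)

# A slightly worse fit with fewer terms can still win on BIC,
# which is the trade-off a term-ranking optimization exploits.
print(bic([0.5, -0.5, 0.5, -0.5], n_params=2) <
      bic([0.4, -0.4, 0.4, -0.4], n_params=5))  # True
```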

  3. Plane Symmetric Viscous Fluid Cosmological Models with Varying Λ-Term (United States)

    Pradhan, Anirudh; Pandey, Purnima; Jotania, Kanti; Yadav, Mahesh Kumar


    Plane symmetric viscous fluid cosmological models of the universe with a variable cosmological term are investigated. The viscosity coefficient of bulk viscous fluid is assumed to be a power function of mass density, whereas the coefficient of shear viscosity is taken to be proportional to the rate of expansion in the model. We have also obtained a special model in which the shear viscosity is assumed to be zero. The cosmological constant Λ is found to be a decreasing function of time and positive, which is supported by results from recent type Ia supernovae observations. Some physical and geometric properties of the models are also discussed.

  4. On Modelling Long Term Stock Returns with Ergodic Diffusion Processes: Arbitrage and Arbitrage-Free Specifications

    Directory of Open Access Journals (Sweden)

    Bernard Wong


    martingale component is based on an ergodic diffusion with a specified stationary distribution. These models are particularly useful for long horizon asset-liability management as they allow the modelling of long term stock returns with heavy tail ergodic diffusions, with tractable, time homogeneous dynamics, and which moreover admit a complete financial market, leading to unique pricing and hedging strategies. Unfortunately the standard specifications of these models in literature admit arbitrage opportunities. We investigate in detail the features of the existing model specifications which create these arbitrage opportunities and consequently construct a modification that is arbitrage free.

  5. Systematic evaluation of atmospheric chemistry-transport model CHIMERE (United States)

    Khvorostyanov, Dmitry; Menut, Laurent; Mailler, Sylvain; Siour, Guillaume; Couvidat, Florian; Bessagnet, Bertrand; Turquety, Solene


    Regional-scale atmospheric chemistry-transport models (CTM) are used to develop air quality regulatory measures, to support environmentally sensitive decisions in industry, and to address a variety of scientific questions involving atmospheric composition. Model performance evaluation with measurement data is critical to understand a model's limits and the degree of confidence in its results. The CHIMERE CTM is a French national tool for operational forecasting and decision support and is widely used in the international research community in various areas of atmospheric chemistry and physics, climate, and environment. This work presents the model evaluation framework applied systematically to new CHIMERE CTM versions in the course of continuous model development. The framework uses three of the four CTM evaluation types identified by the Environmental Protection Agency (EPA) and the American Meteorological Society (AMS): operational, diagnostic, and dynamic. It allows comparing the overall model performance across subsequent model versions (operational evaluation), identifying specific processes and/or model inputs that could be improved (diagnostic evaluation), and testing the model's sensitivity to changes such as emission reductions and meteorological events (dynamic evaluation). The observation datasets currently used for the evaluation are: EMEP (surface concentrations), AERONET (optical depths), and WOUDC (ozone sounding profiles). The framework is implemented as an automated processing chain and allows interactive exploration of the results via a web interface.

  6. Recommendations concerning energy information model documentation, public access, and evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wood, D.O.; Mason, M.J.


    A review is presented of the Energy Information Administration (EIA) response to Congressional and management concerns, relating specifically to energy information system documentation, public access to EIA systems, and scientific/peer evaluation. The relevant organizational and policy responses of EIA are discussed. An analysis of the model development process and approaches to, and organization of, model evaluation is presented, including a survey of model evaluation studies. A more detailed analysis of the origins of the legislated documentation and public access requirements is presented in Appendix A, and the results of an informal survey of other agency approaches to public access and evaluation are presented in Appendix B. Appendix C provides a survey of non-EIA activities relating to model documentation and evaluation. Twelve recommendations to improve EIA's procedures for energy information system documentation, evaluation activities, and public access are determined. These are discussed in detail. (MCW)

  7. Evaluation of a monthly hydrological model for Integrated Assessment Models (United States)

    Liu, Y.; Hejazi, M. I.; Li, H. Y.; Zhang, X.; Leng, G.


    The Integrated Assessment modeling (IAM) community, which generated the four representative concentration pathways (RCPs), is actively moving toward including endogenous representations of water supply and demand in their economic modeling frameworks. Toward integrating the water supply module, we build an efficient object-oriented and open-source hydrologic model (HM) to be embedded in IAMs, specifically the Global Change Assessment Model (GCAM). The main objective for this new HM is to strike a balance between model complexity and computational efficiency; i.e., possessing sufficient fidelity to capture both the annual and the seasonal signals of water fluxes and pools, while being highly computationally efficient so that it can be used for a large number of simulations or uncertainty quantification analyses. To this end, we build a monthly gridded hydrological model based on the ABCD model with some additional features such as a snow scheme and the effects of land use and land cover change (LULCC) on the hydrological cycle. In this framework, we mainly simulate the pools of soil moisture, snowpack and groundwater storage, and the fluxes of evapotranspiration, recharge to groundwater, direct runoff and groundwater discharge. We assess the performance of the model by comparing the model results against runoff simulations from the Variable Infiltration Capacity (VIC) model as well as historical streamflow observations at various gauge stations. We will present results on the model performance, the gains of adding different model components (e.g., snow scheme, effects of LULCC), and the variations of the hydrological cycle globally over the historical period of 1901-2010.

  8. Center for Integrated Nanotechnologies (CINT) Chemical Release Modeling Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Stirrup, Timothy Scott [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


    This evaluation documents the methodology and results of chemical release modeling for operations at Building 518, Center for Integrated Nanotechnologies (CINT) Core Facility. This evaluation is intended to supplement an update to the CINT [Standalone] Hazards Analysis (SHA). This evaluation also updates the original [Design] Hazards Analysis (DHA) completed in 2003 during the design and construction of the facility; since the original DHA, additional toxic materials have been evaluated and modeled to confirm the continued low hazard classification of the CINT facility and operations. This evaluation addresses the potential catastrophic release of the current inventory of toxic chemicals at Building 518 based on a standard query in the Chemical Information System (CIS).


    Directory of Open Access Journals (Sweden)

    V. A. Fedorov


    Full Text Available Aim. The aim of this article is to examine the current issues in the pedagogical training of a potential employee, an experienced professional, using the internal resources of micro-enterprises. The relevance of the research problem is due to the needs of the labour market, the conditions of a developing micro-entrepreneurship economy, and the demand among working professionals for vocational training without discontinuing work. Methodology and research methods. A leading approach to the study of this problem is the system-activity one, which allows us to represent the process of professional training within the micro-enterprise as a systematic activity of subjects aimed at developing the professional competence of the employee. The following research methods are used to solve the set tasks: theoretical study and analysis of psychological, pedagogical, sociological, scientific-methodical and special literature on the problem under study; a systematic approach to the disclosure of the nature of the problem and the formation of the conceptual-terminological apparatus of the research; study and analysis of legislative and normative-legal acts; and empirical methods, including pedagogical observation, generalization and study of teaching experience, pedagogical design, questionnaires, interviews, analysis of results, and expert evaluation. Results. The results of the research showed that training and professional interaction of micro-enterprise employees will be effective if it is considered as productive, mutually agreed actions of the subjects of labour, aimed at solving educational and professional problems in the course of joint labour activity. The developed structural-functional model of training and professional interaction of employees of micro-enterprises allows us to introduce the process of professional training as a

  10. Statistical modeling for visualization evaluation through data fusion. (United States)

    Chen, Xiaoyu; Jin, Ran


    There is a high demand for data visualization that provides insights to users in various applications. However, a consistent, online visualization evaluation method to quantify mental workload or user preference is lacking, which leads to an inefficient visualization and user interface design process. Recently, the advancement of interactive and sensing technologies has made electroencephalogram (EEG) signals, eye movements and visualization logs available in user-centered evaluation. This paper proposes a data fusion model and the application procedure for quantitative and online visualization evaluation. 15 participants joined the study based on three different visualization designs. The results provide a regularized regression model which can accurately predict the user's evaluation of task complexity, and indicate the significance of all three types of sensing data sets for visualization evaluation. This model can be widely applied to data visualization evaluation, and to other user-centered design evaluations and data analysis in human factors and ergonomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
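A regularized regression over fused sensor features, as described above, can be sketched with plain ridge regression; the feature names and numbers below are hypothetical, and the paper's actual estimator and regularizer may differ:

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression: solve (X'X + alpha*I) w = X'y."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

# Hypothetical fused features per trial:
# [EEG band power, eye-fixation count, interaction-log event rate]
X = np.array([[1.0, 3.0, 2.0],
              [2.0, 1.0, 4.0],
              [3.0, 2.0, 1.0],
              [4.0, 4.0, 3.0]])
y = np.array([1.0, 2.0, 3.0, 4.0])  # user-rated task complexity

w = ridge_fit(X, y, alpha=0.1)
print(np.round(X @ w, 2))  # fitted complexity scores per trial
```

The penalty term `alpha` shrinks the weights, which is what keeps a model with correlated sensor channels from overfitting a small participant pool.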

  11. Using short-term bioassays to evaluate the endocrine disrupting capacity of the pesticides linuron and fenoxycarb. (United States)

    Spirhanzlova, Petra; De Groef, Bert; Nicholson, Freda E; Grommen, Sylvia V H; Marras, Giulia; Sébillot, Anthony; Demeneix, Barbara A; Pallud-Mothré, Sophie; Lemkine, Gregory F; Tindall, Andrew J; Du Pasquier, David


    Several short-term whole-organism bioassays based on transgenic aquatic models are now under validation by the OECD (Organization for Economic Co-operation and Development) to become standardized test guidelines for the evaluation of the endocrine activity of substances. Evaluation of the endocrine disrupting capacity of pesticides will be a domain of applicability of these future reference tests. The herbicide linuron and the insecticide fenoxycarb are two chemicals commonly used in agricultural practices. While numerous studies indicate that linuron is likely to be an endocrine disruptor, there is little information available on the effect of fenoxycarb on vertebrate endocrine systems. Using whole-organism bioassays based on transgenic Xenopus laevis tadpoles and medaka fry, we assessed the potential of fenoxycarb and linuron to disrupt thyroid, androgen and estrogen signaling. In addition, we used an in silico approach to simulate the affinity of these two pesticides for human hormone receptors. Linuron elicited thyroid hormone-like activity in tadpoles at all concentrations tested and showed an anti-estrogenic activity in medaka at concentrations of 2.5 mg/L and higher. Our experiments suggest that, in addition to its previously established anti-androgenic action, linuron exhibits thyroid hormone-like responses, as well as acting at the estrogen receptor level to inhibit estrogen signaling. Fenoxycarb, on the other hand, did not cause any changes in thyroid, androgen or estrogen signaling at the concentrations tested. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. The SSI TOOLBOX Source Term Model SOSIM - Screening for important radionuclides and parameter sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Avila Moreno, R.; Barrdahl, R.; Haegg, C.


    The main objective of the present study was to carry out a screening and a sensitivity analysis of the SSI TOOLBOX source term model SOSIM. This model is a part of the SSI TOOLBOX for radiological impact assessment of the Swedish disposal concept for high-level waste, KBS-3. The outputs of interest for this purpose were: the total released fraction, the time of total release, the time and value of maximum release rate, and the dose rates after direct releases to the biosphere. The source term equations were derived, and simple equations and methods were proposed for their calculation. A literature survey was performed in order to determine a characteristic variation range and a nominal value for each model parameter. In order to reduce the model uncertainties, the authors recommend a change in the initial boundary condition for solution of the diffusion equation for highly soluble nuclides. 13 refs.

  13. Evaluating Differential Effects Using Regression Interactions and Regression Mixture Models (United States)

    Van Horn, M. Lee; Jaki, Thomas; Masyn, Katherine; Howe, George; Feaster, Daniel J.; Lamont, Andrea E.; George, Melissa R. W.; Kim, Minjung


    Research increasingly emphasizes understanding differential effects. This article focuses on understanding regression mixture models, which are relatively new statistical methods for assessing differential effects, by comparing results to those from using an interaction term in linear regression. The research questions which each model answers, their…
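The baseline the article compares against, an interaction term in linear regression, lets one predictor's slope depend on a moderator; a minimal sketch with simulated data (the coefficients are chosen purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)           # focal predictor
z = rng.integers(0, 2, size=n)   # observed moderator, e.g. group membership
# Differential effect: the slope of x is 2.0 in group 0 and 3.5 in group 1.
y = 1.0 + 2.0 * x + 0.5 * z + 1.5 * x * z + rng.normal(scale=0.1, size=n)

# Design matrix: intercept, main effects, and the x*z interaction term.
X = np.column_stack([np.ones(n), x, z, x * z])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 1))  # approximately [1.0, 2.0, 0.5, 1.5]
```

Regression mixture models, by contrast, let the data determine latent groups rather than requiring an observed moderator like `z`.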

  14. How to Conduct a Qualitative Program Evaluation in the Light of Eisner’s Educational Connoisseurship and Criticism Model

    Directory of Open Access Journals (Sweden)

    İsmail Yüksel


    Full Text Available Abstract: Quantitative methodologies have traditionally been employed in educational research. However, with the growing appreciation and widespread use of qualitative methodologies in many disciplines, many different educational areas have started to be examined from a qualitative research perspective. In particular, the qualitative evaluation of education programs has received considerable interest, and there have recently been attempts to develop a qualitative methodology for evaluating educational programs based upon the tenets of program evaluation. Evaluators have underlined the benefits of qualitative methods in enriching the information shared with decision-makers and policy makers. The most inclusive endeavour has been carried out by Eisner. Eisner's program evaluation model presents the role of educational connoisseurship and criticism in qualitative educational evaluation. This study aims at examining how a qualitative program evaluation is conducted in relation to Eisner's evaluation model.

  15. Combining techniques for screening and evaluating interaction terms on high-dimensional time-to-event data. (United States)

    Sariyar, Murat; Hoffmann, Isabell; Binder, Harald


    Molecular data, e.g. arising from microarray technology, are often used for predicting survival probabilities of patients. For multivariate risk prediction models on such high-dimensional data, there are established techniques that combine parameter estimation and variable selection. One big challenge is to incorporate interactions into such prediction models. In this feasibility study, we present building blocks for evaluating and incorporating interaction terms in high-dimensional time-to-event settings, especially for settings in which it is computationally too expensive to check all possible interactions. We use a boosting technique for estimation of effects and the following building blocks for pre-selecting interactions: (1) resampling, (2) random forests and (3) orthogonalization as a data pre-processing step. In a simulation study, the strategy that uses all building blocks is able to detect true main effects and interactions with high sensitivity in different kinds of scenarios. The main challenge is interactions composed of variables that do not represent main effects, but our findings are also promising in this regard. Results on real world data illustrate that effect sizes of interactions frequently may not be large enough to improve prediction performance, even though the interactions are potentially of biological relevance. Screening interactions through random forests is feasible and useful when one is interested in finding relevant two-way interactions. The other building blocks also contribute considerably to an enhanced pre-selection of interactions. We determined the limits of interaction detection in terms of necessary effect sizes. Our study emphasizes the importance of making full use of existing methods in addition to establishing new ones.

  16. Short-term Forecast Model of Vehicles Volume Based on ARIMA Seasonal Model and Holt-Winters

    Directory of Open Access Journals (Sweden)

    Wang Zhi-Hui


    Full Text Available In order to alleviate urban traffic congestion and ensure traffic safety, cities need sound road traffic safety planning together with real-time analysis and forecasting of urban traffic flow, so that changes in current traffic flow are detected in time, roads are planned scientifically, and road service capacity and the transport efficiency of freight vehicles are improved. Short-term vehicle-volume data are characterized by uncertainty and serial (timing) correlation. Given this, the seasonal ARIMA model and the Holt-Winters model are used to establish forecasting models for the short-term vehicle volume of the city. Finally, we compare the predictions of the two models.
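The Holt-Winters component of the approach above can be sketched as additive triple exponential smoothing; this is the generic textbook recursion, not the authors' calibrated model, and the smoothing constants and toy volumes below are arbitrary:

```python
def holt_winters_additive(series, period, alpha=0.5, beta=0.1, gamma=0.1):
    """One-step-ahead forecasts from additive Holt-Winters smoothing."""
    level = sum(series[:period]) / period
    trend = 0.0
    season = [x - level for x in series[:period]]  # initial seasonal offsets
    forecasts = []
    for t, x in enumerate(series):
        s = season[t % period]
        forecasts.append(level + trend + s)        # forecast before seeing x
        prev_level = level
        level = alpha * (x - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % period] = gamma * (x - level) + (1 - gamma) * s
    return forecasts

# Toy vehicle volumes with a seasonal period of 4 observations
volumes = [10, 14, 18, 12, 11, 15, 19, 13, 12, 16, 20, 14]
print([round(f, 1) for f in holt_winters_additive(volumes, period=4)])
```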

  17. Evaluating methods for estimating space-time paths of individuals in calculating long-term personal exposure to air pollution (United States)

    Schmitz, Oliver; Soenario, Ivan; Vaartjes, Ilonca; Strak, Maciek; Hoek, Gerard; Brunekreef, Bert; Dijst, Martin; Karssenberg, Derek


    Air pollution is one of the major concerns for human health. Associations between air pollution and health are often calculated using long-term (i.e. years to decades) information on personal exposure for each individual in a cohort. Personal exposure is the air pollution aggregated along the space-time path visited by an individual. As air pollution may vary considerably in space and time, for instance due to motorised traffic, the estimation of the spatio-temporal location of a person's space-time path is important to identify the personal exposure. However, long-term exposure is mostly calculated using the air pollution concentration at the x, y location of someone's home, which does not consider that individuals are mobile (commuting, recreation, relocation). This assumption is often made because it is a major challenge to estimate space-time paths for all individuals in large cohorts, mostly because limited information on the mobility of individuals is available. We address this issue by evaluating multiple approaches for the calculation of space-time paths, thereby estimating the personal exposure along these space-time paths with hyper-resolution air pollution maps at national scale. This allows us to evaluate the effect of the space-time path and the resulting personal exposure. Air pollution (e.g. NO2, PM10) was mapped for the entire Netherlands at a resolution of 5×5 m2 using the land use regression models developed in the European Study of Cohorts for Air Pollution Effects (ESCAPE) and the open source software PCRaster. The models use predictor variables like population density, land use, and traffic related data sets, and are able to model spatial variation and within-city variability of annual average concentration values. We approximated space-time paths for all individuals in a cohort using various aggregations, including those representing space-time paths as the outline of a person's home or associated parcel
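Personal exposure along a space-time path, as defined above, is the time-weighted average of concentrations at the visited locations; a minimal sketch (the visits and NO2 values are hypothetical):

```python
def path_exposure(visits):
    """Time-weighted mean concentration along a space-time path.
    visits: (hours_spent, concentration) pairs for each location."""
    total_hours = sum(h for h, _ in visits)
    return sum(h * c for h, c in visits) / total_hours

# Hypothetical day: home (low NO2), commute in traffic (high), office (medium)
day = [(14.0, 18.0), (2.0, 45.0), (8.0, 25.0)]
mobile = path_exposure(day)
home_only = path_exposure([(24.0, 18.0)])  # the home-address-only assumption

print(round(mobile, 1), home_only)  # accounting for mobility raises the estimate here
```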

  18. Development of a Watershed-Scale Long-Term Hydrologic Impact Assessment Model with the Asymptotic Curve Number Regression Equation

    Directory of Open Access Journals (Sweden)

    Jichul Ryu


    Full Text Available In this study, 52 asymptotic Curve Number (CN regression equations were developed for combinations of representative land covers and hydrologic soil groups. In addition, to overcome the limitations of the original Long-term Hydrologic Impact Assessment (L-THIA model when it is applied to larger watersheds, a watershed-scale L-THIA Asymptotic CN (ACN regression equation model (watershed-scale L-THIA ACN model was developed by integrating the asymptotic CN regressions and various modules for direct runoff/baseflow/channel routing. The watershed-scale L-THIA ACN model was applied to four watersheds in South Korea to evaluate the accuracy of its streamflow prediction. The coefficient of determination (R2 and Nash–Sutcliffe Efficiency (NSE values for observed versus simulated streamflows over intervals of eight days were greater than 0.6 for all four of the watersheds. The watershed-scale L-THIA ACN model, including the asymptotic CN regression equation method, can simulate long-term streamflow sufficiently well with the ten parameters that have been added for the characterization of streamflow.
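The Nash–Sutcliffe Efficiency used above to judge the streamflow predictions compares model error against the spread of the observations; a minimal sketch with made-up flows:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 - SSE / sum of squared deviations
    from the observed mean. 1.0 is a perfect fit; 0.0 means no better
    than always predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    spread = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / spread

obs = [10.0, 12.0, 8.0, 14.0, 11.0]   # made-up observed flows
sim = [9.0, 12.5, 8.5, 13.0, 11.5]    # made-up simulated flows
print(round(nse(obs, sim), 2))  # values above 0.6 would meet the paper's threshold
```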


    Directory of Open Access Journals (Sweden)

    Q. X. Xu


    Full Text Available The semantic similarities are important in concept definition, recognition, categorization, interpretation, and integration. Many semantic similarity models have been established to evaluate semantic similarities of objects or/and concepts. To find out the suitability and performance of different models in evaluating concept similarities, we compare four main types of models in this paper: the geometric model, the feature model, the network model, and the transformational model. Fundamental principles and main characteristics of these models are first introduced and compared. Land use and land cover concepts of NLCD92 are employed as examples in the case study. The results demonstrate that correlations between these models are very high, possibly because all these models are designed to simulate the similarity judgement of the human mind.

  20. A Stochastic Model for Short-Term Probabilistic Forecast of Solar Photo-Voltaic Power


    Ramakrishna, Raksha; Scaglione, Anna; Vittal, Vijay


    In this paper, a stochastic model with regime switching is developed for solar photo-voltaic (PV) power in order to provide short-term probabilistic forecasts. The proposed model for solar PV power is physics inspired and explicitly incorporates the stochasticity due to clouds using different parameters addressing the attenuation in power. Based on the statistical behavior of parameters, a simple regime-switching process between the three classes of sunny, overcast and partly cloudy is propose...

  1. Mathematical modelling of filtration in submerged anaerobic MBRs (SAnMBRs): long-term validation


    Robles Martínez, Ángel; Ruano García, María Victoria; Ribes Bertomeu, José; SECO TORRECILLAS, AURORA; Ferrer, J.


    The aim of this study was the long-term validation of a model capable of reproducing the filtration process occurring in a submerged anaerobic membrane bioreactor (SAnMBR) system. The proposed model was validated using data obtained from a SAnMBR demonstration plant fitted with industrial-scale hollow-fibre membranes. The validation was carried out using both lightly and heavily fouled membranes operating at different bulk concentrations, gas sparging intensities and transmembrane fluxes. Acr...

  2. Long term warranty and after sales service concept, policies and cost models

    CERN Document Server

    Rahman, Anisur


    This volume presents concepts, policies and cost models for various long-term warranty and maintenance contracts. It offers several numerical examples for estimating costs to both the manufacturer and consumer. Long-term warranties and maintenance contracts are becoming increasingly popular, as these types of aftersales services provide assurance to consumers that they can enjoy long, reliable service, and protect them from defects and the potentially high costs of repairs. Studying long-term warranty and service contracts is important to manufacturers and consumers alike, as offering long-term warranty and maintenance contracts produces additional costs for manufacturers / service providers over the product's service life. These costs must be factored into the price, or the manufacturer / dealer will incur losses instead of making a profit. On the other hand, the buyer / consumer needs to weigh the cost of maintaining the product over its service life and to decide whether or not these policies are worth purchasing....

  3. Crash Frequency Analysis Using Hurdle Models with Random Effects Considering Short-Term Panel Data. (United States)

    Chen, Feng; Ma, Xiaoxiang; Chen, Suren; Yang, Lin


    Random effect panel data hurdle models are established to research the daily crash frequency on a mountainous section of highway I-70 in Colorado. Road Weather Information System (RWIS) real-time traffic, weather and road surface conditions are merged into the models, incorporating road characteristics. The random effect hurdle negative binomial (REHNB) model is developed to study the daily crash frequency along with three other competing models. The proposed model considers the serial correlation of observations, the unbalanced panel-data structure, and dominating zeroes. Based on several statistical tests, the REHNB model is identified as the most appropriate one among four candidate models for a typical mountainous highway. The results show that: (1) the presence of over-dispersion in the short-term crash frequency data is due to both excess zeros and unobserved heterogeneity in the crash data; and (2) the REHNB model is suitable for this type of data. Moreover, time-varying variables including weather conditions, road surface conditions and traffic conditions are found to play important roles in crash frequency. Besides the methodological advancements, the proposed technology bears great potential for engineering applications to develop short-term crash frequency models by utilizing detailed field monitoring data such as RWIS, which is becoming more accessible around the world.

  4. Long-term evaluation of two reoperation groups for intermittent exotropia. (United States)

    Lee, Ju-Yeun; Lee, Ga-In; Park, Kyung-Ah; Oh, Sei Yeul


    To evaluate the effect of initial postoperative deviation on subsequent reoperation in patients with intermittent exotropia and to compare the clinical factors and surgical outcomes between the two surgical failure groups. The medical records of patients who underwent reoperation after failed primary surgery for intermittent exotropia at a single center were reviewed retrospectively. Patients with recurrent intermittent exotropia and consecutive esotropia were considered surgical failures. Various clinical factors were compared between these two groups, including age at surgery, interval between surgeries, stereoacuity, spherical equivalent, office control, surgical type, presence of neurologic disease, amblyopia and other strabismus, and postoperative angles of deviation. Of the 3,406 patients who underwent surgery for intermittent exotropia, 139 patients met inclusion criteria. Of these, 125 (3.8%) underwent reoperation for recurrent intermittent exotropia; 14 (0.4%), for consecutive esotropia. On postoperative day 1 the intermittent exotropia group showed esodeviation at distance fixation of 2Δ ± 4Δ; the esotropia group, esodeviation of 5Δ ± 4Δ. The intermittent exotropia group showed a significant progression of exodeviation from 2 months postoperatively (all P intermittent exotropia may not predict long-term success. Careful monitoring for consecutive esotropia is needed 6 months postoperatively, and annual check-ups are recommended for all patients with under- and overcorrections for a period of at least 5 years after surgery. Copyright © 2017 American Association for Pediatric Ophthalmology and Strabismus. Published by Elsevier Inc. All rights reserved.

  5. Clinical Nursing Leadership Education in Long-Term Care: Intervention Design and Evaluation. (United States)

    Fiset, Valerie; Luciani, Tracy; Hurtubise, Alyssa; Grant, Theresa L


    The main objective of the current case study was to investigate the perceived leadership learning needs and feasibility of delivering leadership education to registered staff involved in direct care in long-term care (LTC) homes. The study was conducted in Ontario, Canada, and participants included RNs, registered practical nurses, and nursing administrators. Phase 1 bilingual web-based survey and bilingual focus group needs assessment data supported a preference for external training along with in-house mentoring to support sustainability. An intervention designed using insights gained from Phase 1 data was delivered via a 2-day, in-person workshop. Phases 2 and 3 evaluation survey data identified aspects of leadership training for LTC that require ongoing refinement. Findings suggest that communication skills and managing day-to-day nursing demands in the context of regulatory frameworks were areas of particular interest for leadership training in the LTC setting. [Journal of Gerontological Nursing, 43(4), 49-56.]. Copyright 2017, SLACK Incorporated.

  6. Evaluation of the benefits of enteral nutrition in long-term care elderly patients. (United States)

    Arinzon, Zeev; Peisakh, Alexander; Berner, Yitshal N


    Demented patients may refuse to eat as they come closer to the end of their lives. We evaluated the effectiveness of enteral nutrition in improving survival and nutritional and functional status in very dependent and demented long-term care (LTC) elderly patients, and its correlation with nutritional parameters. Fifty-seven elderly patients, aged 60 years and older, who received nutrition by the enteral route (enteral nutrition group, ENG), were compared with 110 age-, sex-, comorbidity-, cognition-, and dependence-matched subjects (control group, CG). Indications for enteral nutrition; type of tube; weight status subsequent to enteral nutrition; cognitive, functional, and pressure sore status; and complete clinical, complete blood count, and biochemical profiles were recorded for each subject at initiation and conclusion of the study. Enteral nutrition was associated with improvement in blood count (hemoglobin and lymphocyte count), in renal function tests and electrolytes (BUN, creatinine, BUN/creatinine ratio, sodium, and potassium), in hydration status and serum osmolarity, and in serum proteins (total protein, albumin, and transferrin), but not in serum cholesterol and CRP levels. Decline in functional and cognitive status was greater in CG than in ENG. The complication rate related to nutrition was higher in ENG than in CG (61% and 34%, respectively). Enteral nutrition was not effective in preventing pressure sore development in an LTC setting.

  7. Transition of municipal sludge anaerobic digestion from mesophilic to thermophilic and long-term performance evaluation. (United States)

    Tezel, Ulas; Tandukar, Madan; Hajaya, Malek G; Pavlostathis, Spyros G


    Strategies for the transition of municipal sludge anaerobic digestion from mesophilic to thermophilic operation were assessed, and the long-term stability and performance of thermophilic digesters operated at a solids retention time of 30 days were evaluated. Transition from 36°C to 53.3°C at a rate of 3°C/day resulted in fluctuation of daily gas and volatile fatty acids (VFAs) production. Steady state was reached within 35 days from the onset of the temperature increase. Transitions from either 36 or 53.3°C to 60°C resulted in relatively stable daily gas production, but VFAs remained at very high levels (in excess of 5000 mg COD/L) and methane production was lower than that of the mesophilic reactor. It was concluded that in order to achieve high VS and COD destruction and methane production, the temperature of continuous-flow, suspended-growth digesters fed with mixed municipal sludge should be kept below 60°C. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Use of the TRPV1 Agonist Capsaicin to Provide Long-Term Analgesia in a Rat Limb Fracture/Open Repair, Internal Fixation Model (United States)


    To date there have been no published works evaluating the efficacy of locally applied capsaicin for analgesia in fracture pain or its effects on bone healing; this project examines the use of the TRPV1 agonist capsaicin to provide long-term analgesia in a rat limb fracture/open repair, internal fixation model. Principal Investigator: Michael J. Buys, M.D.

  9. Structural equation modeling: building and evaluating causal models: Chapter 8 (United States)

    Grace, James B.; Scheiner, Samuel M.; Schoolmaster, Donald R.


    Scientists frequently wish to study hypotheses about causal relationships, rather than just statistical associations. This chapter addresses the question of how scientists might approach this ambitious task. Here we describe structural equation modeling (SEM), a general modeling framework for the study of causal hypotheses. Our goals are to (a) concisely describe the methodology, (b) illustrate its utility for investigating ecological systems, and (c) provide guidance for its application. Throughout our presentation, we rely on a study of the effects of human activities on wetland ecosystems to make our description of methodology more tangible. We begin by presenting the fundamental principles of SEM, including both its distinguishing characteristics and the requirements for modeling hypotheses about causal networks. We then illustrate SEM procedures and offer guidelines for conducting SEM analyses. Our focus in this presentation is on basic modeling objectives and core techniques. Pointers to additional modeling options are also given.

  10. Enhanced stability of car-following model upon incorporation of short-term driving memory (United States)

    Liu, Da-Wei; Shi, Zhong-Ke; Ai, Wen-Huan


    Based on the full velocity difference model, a new car-following model is developed to investigate the effect of short-term driving memory on traffic flow. Short-term driving memory is introduced as an influence factor of the driver's anticipation behavior. The stability condition of the newly developed model is derived, and the modified Korteweg-de Vries (mKdV) equation is constructed to describe the traffic behavior near the critical point. The evolution of a small perturbation is first investigated numerically. The results show that the new car-following model improves traffic stability over previous models. Starting and braking processes of vehicles at a signalized intersection are also investigated. The numerical simulations illustrate that the new model can successfully describe the driver's anticipation behavior, and that the efficiency and safety of vehicles passing through the signalized intersection are improved by considering short-term driving memory.
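The full velocity difference (FVD) model on which the paper builds takes the form dv_n/dt = κ[V(Δx_n) − v_n] + λΔv_n, where V is an optimal-velocity function. The paper's memory extension is not reproduced here, but the baseline small-perturbation experiment can be sketched; the Bando-type V and all parameter values are illustrative assumptions chosen inside the model's linear-stability region:

```python
import math

# Optimal velocity function (Bando form), illustrative parameters
V_MAX, H_C = 2.0, 4.0
def V_opt(dx):
    return 0.5 * V_MAX * (math.tanh(dx - H_C) + math.tanh(H_C))

def simulate(n_cars=50, road_len=200.0, kappa=1.5, lam=0.6,
             dt=0.1, steps=3000, bump=0.1):
    """Euler integration of the FVD model on a ring road with one perturbed car."""
    x = [road_len / n_cars * i for i in range(n_cars)]
    v = [V_opt(road_len / n_cars)] * n_cars   # start at uniform equilibrium
    x[0] += bump                              # small initial perturbation
    for _ in range(steps):
        acc = []
        for i in range(n_cars):
            j = (i + 1) % n_cars              # leader on the ring
            dx = (x[j] - x[i]) % road_len
            dv = v[j] - v[i]
            acc.append(kappa * (V_opt(dx) - v[i]) + lam * dv)
        for i in range(n_cars):
            v[i] = max(0.0, v[i] + acc[i] * dt)
            x[i] = (x[i] + v[i] * dt) % road_len
    return max(v) - min(v)                    # residual speed spread

print(simulate())  # spread near 0: the perturbation decays in the stable regime
```

With these parameters the stability condition κ > 2V′(h) − 2λ holds, so the perturbation dies out; pushing κ or λ down past the threshold lets the same code exhibit the growing stop-and-go waves studied via the mKdV analysis.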

  11. Novel Fuzzy Modeling and Synchronization of Chaotic Systems With Multinonlinear Terms by Advanced Ge-Li Fuzzy Model. (United States)

    Li, Shih-Yu; Tam, Lap-Mou; Tsai, Shang-En; Ge, Zheng-Ming


    Ge and Li proposed an alternative strategy to model and synchronize two totally different nonlinear systems at the end of 2011, which provided a new version of fuzzy modeling and has been applied in several fields to simplify modeling work and solve mismatch problems [1]-[17]. However, the proposed model limits the number of nonlinear terms in each equation, so it cannot be used for all kinds of nonlinear dynamic systems. In this paper, a more efficient and comprehensive advanced Ge-Li fuzzy model is therefore given to relax that limitation and improve the effectiveness of the original. The novel fuzzy model can be applied to all kinds of complex nonlinear systems: only m × 2 fuzzy rules and two linear subsystems are needed to simulate the nonlinear behavior (where m is the number of states of the nonlinear dynamic system), however copious or complicated the nonlinear terms are. Further, fuzzy synchronization of two nonlinear dynamic systems with totally distinct structures can be achieved via only two sets of control gains designed through the novel fuzzy model and its corresponding fuzzy synchronization scheme. Two complicated dynamic systems are used as illustrations, a Mathieu-Van der Pol system with uncertainties and a quantum-cellular neural network nano system with uncertainties, to show the effectiveness and feasibility of the novel fuzzy model.

  12. Predicting long-term depression outcome using a three-mode principal component model for depression heterogeneity. (United States)

    Monden, Rei; Stegeman, Alwin; Conradi, Henk Jan; de Jonge, Peter; Wardenaar, Klaas J


    Depression heterogeneity has hampered development of adequate prognostic models. Therefore, more homogeneous clinical entities (e.g. dimensions, subtypes) have been developed, but their differentiating potential is limited because neither captures all relevant variation across persons, symptoms and time. To address this, three-mode Principal Component Analysis (3MPCA) was previously applied to capture person-, symptom- and time-level variation in a single model (Monden et al., 2015). This study evaluated the added prognostic value of such an integrated model for longer-term depression outcomes. The Beck Depression Inventory (BDI) was administered quarterly for two years to major depressive disorder outpatients participating in a randomized controlled trial. A previously developed 3MPCA model decomposed the data into 2 symptom-components ('somatic-affective', 'cognitive'), 2 time-components ('recovering', 'persisting') and 3 person-components ('severe non-persisting depression', 'somatic depression' and 'cognitive depression'). The predictive value of the 3MPCA model for BDI scores at 3-year (n=136) and 11-year follow-up (n=145) was compared with traditional latent variable models and traditional prognostic factors (e.g. baseline BDI component scores, personality). 3MPCA components predicted 41% and 36% of the BDI variance at 3- and 11-year follow-up, respectively. A latent class model, growth mixture model and other known prognostic variables predicted 4-32% and 3-24% of the BDI variance at 3- and 11-year follow-up, respectively. Only primary care patients were included. There was no independent validation sample. Accounting for depression heterogeneity at the person-, symptom- and time-level improves longer-term predictions of depression severity, underlining the potential of this approach for developing better prognostic models. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Long-Term Results from Evaluation of Advanced New Construction Packages in Test Homes: Lake Elsinore, California

    Energy Technology Data Exchange (ETDEWEB)

    Stecher, D.; Brozyna, K.


    This report presents the long-term evaluation results from a hot-dry climate project that examines the room-to-room temperature conditions that exist in a high-performance envelope, the performance of a simplified air distribution system, and a comparison of modeled energy performance with measured energy use. The project, a prototype house built by K. Hovnanian Homes' Ontario Group, is located in Lake Elsinore, Riverside County, California, and achieves a 50% level of whole-house source energy savings with respect to the Building America (BA) Benchmark Definition 2009 (Hendron and Engebrecht 2010). Temperature measurements in three rooms indicate that the temperature differences between the measured locations and the thermostat were within recommendations 90.3% of the time in heating mode and 99.3% of the time in cooling mode. The air distribution system operates efficiently, with average delivered temperatures adequate to facilitate proper heating and cooling and only minor average temperature differences observed between the system's plenum and farthest register. Monitored energy use results indicate that the house is using less energy than predicted by modeling, although a breakdown by end use showed little agreement between comparable modeled and measured values.

  14. Modelling long-term (300ka) upland catchment response to multiple lava damming events

    NARCIS (Netherlands)

    van Gorp, W.; Temme, A. J. A. M.; Veldkamp, A.; Schoorl, J. M.


    Landscapes respond in complex ways to external drivers such as base level change due to damming events. In this study, landscape evolution modelling was used to understand and analyse long-term catchment response to lava damming events. PalaeoDEM reconstruction of a small Turkish catchment (45 km²)

  15. The capital-asset pricing model reconsidered: tests in real terms on ...

    African Journals Online (AJOL)

    As in that work, the main question this study aimed to answer remains: Can the CAPM be accepted in the South African market for the purposes of the stochastic modelling of investment returns in typical actuarial applications? To test the CAPM in real terms, conventional and index-linked bonds were included both in the ...

  16. Some analytical results pertaining to Cournot models for short-term electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz, C.; Conejo, A.J.; Garcia-Bertrand, R. [Department of Electrical Engineering, Univ. Castilla-La Mancha, Campus Universitario s/n, 13071 Ciudad Real (Spain)


    This paper provides some theoretical results pertaining to the Cournot model applied to short-term electricity markets. Prices, quantities, and profits are first obtained, and results related to sensitivities and limit values are then derived and discussed. The cases of both several identical Cournot producers and one dominant Cournot producer are analyzed. A case example illustrates the results obtained. (author)
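For the case of several identical Cournot producers, the closed-form results are standard: with linear inverse demand p = a − bQ and constant marginal cost c, each of n firms produces q = (a − c)/((n + 1)b). The sketch below uses illustrative numbers (not the paper's case example) and verifies the closed form against a grid search for a profitable unilateral deviation:

```python
def cournot_symmetric(a, b, c, n):
    """Symmetric Cournot equilibrium for inverse demand p = a - b*Q and
    constant marginal cost c shared by n identical producers."""
    q = (a - c) / (b * (n + 1))        # per-firm quantity
    price = a - b * n * q              # equals (a + n*c) / (n + 1)
    profit = (price - c) * q           # equals b * q**2
    return q, price, profit

def profit_i(qi, q_others, a, b, c):
    """Firm i's profit given its own output qi and rivals' total q_others."""
    return (a - b * (qi + q_others) - c) * qi

a, b, c, n = 100.0, 1.0, 20.0, 4       # illustrative market parameters
q, p, pi = cournot_symmetric(a, b, c, n)
# Numerical check: no unilateral deviation on a fine grid beats the equilibrium
best = max(profit_i(0.01 * k, (n - 1) * q, a, b, c) for k in range(5000))
print(q, p, round(best, 4))  # → 16.0 36.0 256.0
```

The same closed forms make the sensitivities discussed in the paper easy to read off: per-firm quantity falls like 1/(n + 1), and price converges to marginal cost c as n grows.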

  17. User Acceptance of Long-Term Evolution (LTE) Services: An Application of Extended Technology Acceptance Model (United States)

    Park, Eunil; Kim, Ki Joon


    Purpose: The aim of this paper is to propose an integrated path model in order to explore user acceptance of long-term evolution (LTE) services by examining potential causal relationships between key psychological factors and user intention to use the services. Design/methodology/approach: Online survey data collected from 1,344 users are analysed…

  18. Modeling the long-term population dynamics of benthic foraminiferal communities using field and experimental data

    NARCIS (Netherlands)

    Ernst, S.R.; Duijnstee, Ivo; de Stigter, H.C.; van der Zwaan, Bert


    A mathematically simple model is used to simulate the long-term impact of variable food flux and oxygenation over decades. Input characteristics were offspring and generation length, the values of which are derived from laboratory experiments. Other input consisted of a parameter describing the

  19. Probing for the Multiplicative Term in Modern Expectancy-Value Theory: A Latent Interaction Modeling Study (United States)

    Trautwein, Ulrich; Marsh, Herbert W.; Nagengast, Benjamin; Ludtke, Oliver; Nagy, Gabriel; Jonkmann, Kathrin


    In modern expectancy-value theory (EVT) in educational psychology, expectancy and value beliefs additively predict performance, persistence, and task choice. In contrast to earlier formulations of EVT, the multiplicative term Expectancy x Value in regression-type models typically plays no major role in educational psychology. The present study…
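The multiplicative Expectancy × Value term enters a regression-type model simply as a product column alongside the main effects. As a hedged illustration (synthetic data and coefficients, not the study's latent-interaction estimator, which additionally handles measurement error), plain OLS with an interaction column recovers a known multiplicative effect:

```python
import random

def solve(A, y):
    """Solve A x = y by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """OLS coefficients via the normal equations X'X b = X'y."""
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

random.seed(0)
b_true = [1.0, 0.5, 0.8, 0.3]          # intercept, expectancy, value, E*V (illustrative)
X, y = [], []
for _ in range(400):
    e, v = random.gauss(0, 1), random.gauss(0, 1)
    X.append([1.0, e, v, e * v])       # the interaction is just a product column
    y.append(sum(b * x for b, x in zip(b_true, X[-1])) + random.gauss(0, 0.1))
b_hat = ols(X, y)
print([round(b, 2) for b in b_hat])
```

The latent-interaction approach in the study addresses the attenuation that measurement error in expectancy and value induces in this product term; the sketch only shows where the multiplicative term sits in the design matrix.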

  20. Pattern Formation of a Keller-Segel Model with the Source Term u^p(1-u)

    Directory of Open Access Journals (Sweden)

    Shengmao Fu


    Nonlinear dynamics near an unstable constant equilibrium in a Keller-Segel model with the source term u^p(1-u) is considered. It is proved that the nonlinear dynamics of a general perturbation is determined by the finite number of linear growing modes over a time scale of ln(1/δ), where δ is the strength of the initial perturbation.
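The ln(1/δ) time scale follows from linear theory: a mode of initial amplitude δ growing like δe^{σt} reaches order one at t ≈ ln(1/δ)/σ. A minimal numerical check of this relation (the growth rate σ below is illustrative, not derived from the Keller-Segel system):

```python
import math

def time_to_order_one(delta, sigma):
    """Time at which a linearly growing mode delta*exp(sigma*t) reaches size 1."""
    return math.log(1.0 / delta) / sigma

sigma = 0.5                      # illustrative growth rate of the fastest mode
for delta in (1e-2, 1e-4, 1e-8):
    t = time_to_order_one(delta, sigma)
    # At time t the mode has amplitude exactly 1, by construction
    assert abs(delta * math.exp(sigma * t) - 1.0) < 1e-9
    print(delta, round(t, 3))
```

Halving δ only adds ln 2/σ to the time scale, which is why the nonlinear dynamics stay slaved to the finitely many growing modes over the whole ln(1/δ) window.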