WorldWideScience

Sample records for term model evaluations

  1. Standardizing the performance evaluation of short-term wind prediction models

    DEFF Research Database (Denmark)

    Madsen, Henrik; Pinson, Pierre; Kariniotakis, G.

    2005-01-01

    Short-term wind power prediction is a primary requirement for efficient large-scale integration of wind generation in power systems and electricity markets. The choice of an appropriate prediction model among the numerous available models is not trivial and has to be based on an objective evaluation of model performance. This paper proposes a standardized protocol for the evaluation of short-term wind power prediction systems. A number of reference prediction models are also described, and their use for performance comparison is analysed. The use of the protocol is demonstrated using results from both on-shore and off-shore wind farms. The work was developed in the frame of the Anemos project (EU R&D project), where the protocol has been used to evaluate more than 10 prediction systems.
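
    The protocol's style of evaluation can be sketched as follows. The metric definitions (errors normalized by installed capacity, a persistence forecast as the reference model) follow common practice in this literature and are an assumption here, not quoted from the paper; the data are synthetic.

```python
import numpy as np

def evaluation_metrics(measured, predicted, installed_capacity):
    """Bias, MAE and RMSE normalized by installed capacity (assumed
    definitions, in the spirit of a standardized evaluation protocol)."""
    e = predicted - measured
    nbias = np.mean(e) / installed_capacity
    nmae = np.mean(np.abs(e)) / installed_capacity
    nrmse = np.sqrt(np.mean(e ** 2)) / installed_capacity
    return nbias, nmae, nrmse

def persistence_forecast(measured, horizon):
    """Simplest reference model: the forecast for t+h is the value at t.
    Returns (predictions, aligned measurements)."""
    return np.roll(measured, horizon)[horizon:], measured[horizon:]

# Hypothetical hourly power output (MW) of a 10 MW wind farm
rng = np.random.default_rng(0)
power = np.clip(5 + np.cumsum(rng.normal(0, 0.5, 200)), 0, 10)
pred, meas = persistence_forecast(power, 1)
nbias, nmae, nrmse = evaluation_metrics(meas, pred, installed_capacity=10.0)
```

    A candidate prediction system would be run through the same metrics and compared against the persistence baseline on identical data.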

  2. Using Reactive Transport Modeling to Evaluate the Source Term at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Y. Chen

    2001-12-19

    The conventional approach to source-term evaluation for performance assessment of nuclear waste repositories uses speciation-solubility modeling tools and assumes that pure phases of radioelements control their solubility. This assumption may not reflect reality, as most radioelements (except for U) may not form their own pure phases. As a result, solubility limits predicted using the conventional approach are several orders of magnitude higher than the concentrations of radioelements measured in spent fuel dissolution experiments. This paper presents the author's attempt to use a non-conventional approach to evaluate the source term of radionuclide release for Yucca Mountain. Based on the general reactive-transport code AREST-CT, a model for spent fuel dissolution and secondary phase precipitation has been constructed. The model accounts for both equilibrium and kinetic reactions, and its predictions have been compared against laboratory experiments and natural analogues. It is found that, without calibration, the simulated results match laboratory and field observations very well in many aspects; more importantly, no contradictions between them have been found. This provides confidence in the predictive power of the model. Based on the concept of Np incorporation into uranyl minerals, the model not only predicts a lower Np source term than that given by conventional Np solubility models, but also produces results consistent with laboratory measurements and observations. Moreover, two hypotheses, concerning whether or not Np enters tertiary uranyl minerals, have been tested by comparing model predictions against laboratory observations; the results favor the former. It is concluded that this non-conventional approach to source-term evaluation not only eliminates, to some extent, the over-conservatism of the conventional solubility approach, but also gives a realistic representation of the system of interest, which is a prerequisite for truly understanding the long-term

  3. Using Reactive Transport Modeling to Evaluate the Source Term at Yucca Mountain

    International Nuclear Information System (INIS)

    Y. Chen

    2001-01-01

    The conventional approach to source-term evaluation for performance assessment of nuclear waste repositories uses speciation-solubility modeling tools and assumes that pure phases of radioelements control their solubility. This assumption may not reflect reality, as most radioelements (except for U) may not form their own pure phases. As a result, solubility limits predicted using the conventional approach are several orders of magnitude higher than the concentrations of radioelements measured in spent fuel dissolution experiments. This paper presents the author's attempt to use a non-conventional approach to evaluate the source term of radionuclide release for Yucca Mountain. Based on the general reactive-transport code AREST-CT, a model for spent fuel dissolution and secondary phase precipitation has been constructed. The model accounts for both equilibrium and kinetic reactions, and its predictions have been compared against laboratory experiments and natural analogues. It is found that, without calibration, the simulated results match laboratory and field observations very well in many aspects; more importantly, no contradictions between them have been found. This provides confidence in the predictive power of the model. Based on the concept of Np incorporation into uranyl minerals, the model not only predicts a lower Np source term than that given by conventional Np solubility models, but also produces results consistent with laboratory measurements and observations. Moreover, two hypotheses, concerning whether or not Np enters tertiary uranyl minerals, have been tested by comparing model predictions against laboratory observations; the results favor the former. It is concluded that this non-conventional approach to source-term evaluation not only eliminates, to some extent, the over-conservatism of the conventional solubility approach, but also gives a realistic representation of the system of interest, which is a prerequisite for truly understanding the long-term

  4. Source term model evaluations for the low-level waste facility performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Yim, M.S.; Su, S.I. [North Carolina State Univ., Raleigh, NC (United States)]

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  5. Evaluating Modeled Impact Metrics for Human Health, Agriculture Growth, and Near-Term Climate

    Science.gov (United States)

    Seltzer, K. M.; Shindell, D. T.; Faluvegi, G.; Murray, L. T.

    2017-12-01

    Simulated metrics that assess impacts on human health, agriculture growth, and near-term climate were evaluated using ground-based and satellite observations. The NASA GISS ModelE2 and GEOS-Chem models were used to simulate the near-present chemistry of the atmosphere. A suite of simulations that varied by model, meteorology, horizontal resolution, emissions inventory, and emissions year was performed, enabling an analysis of metric sensitivities to various model components. All simulations utilized consistent anthropogenic global emissions inventories (ECLIPSE V5a or CEDS), and an evaluation of simulated results was carried out for 2004-2006 and 2009-2011 over the United States and for 2014-2015 over China. O3- and PM2.5-based metrics showed only minor differences between the model resolutions considered here (2.0° × 2.5° and 0.5° × 0.666°), while model, meteorology, and emissions inventory each played a larger role in the variance. Surface metrics related to O3 were consistently biased high, though to varying degrees, demonstrating the need to evaluate a particular modeling framework before O3 impacts are quantified. Surface metrics related to PM2.5 were diverse, indicating that a multimodel mean is a valuable tool in predicting PM2.5-related impacts. Oftentimes, the configuration that best captured the change of a metric over time differed from the configuration that best captured the magnitude of the same metric, demonstrating the challenge in skillfully simulating impacts. These results highlight the strengths and weaknesses of these models in simulating impact metrics related to air quality and near-term climate. With such information, the reliability of historical and future simulations can be better understood.

  6. Empirical evaluation of the conceptual model underpinning a regional aquatic long-term monitoring program using causal modelling

    Science.gov (United States)

    Irvine, Kathryn M.; Miller, Scott; Al-Chokhachy, Robert K.; Archer, Erik; Roper, Brett B.; Kershner, Jeffrey L.

    2015-01-01

    Conceptual models are an integral facet of long-term monitoring programs. Proposed linkages between drivers, stressors, and ecological indicators are identified within the conceptual model of most mandated programs. We empirically evaluate a conceptual model developed for a regional aquatic and riparian monitoring program using causal models (i.e., Bayesian path analysis). We assess whether data gathered for regional status and trend estimation can also provide insights on why a stream may deviate from reference conditions. We target the hypothesized causal pathways for how the anthropogenic drivers of road density, percent grazing, and percent forest within a catchment affect instream biological condition. We found that instream temperature and fine sediments in arid sites, and only fine sediments in mesic sites, accounted for a significant portion of the maximum possible variation explainable in biological condition among managed sites. However, the biological significance of the direct effects of anthropogenic drivers on instream temperature and fine sediments was minimal or not detected. Consequently, there was weak to no biological support for causal pathways relating anthropogenic drivers to biological condition. With weak biological and statistical effect sizes, ignoring environmental contextual variables and covariates that explain natural heterogeneity would, in some instances, have resulted in no evidence of human impacts on biological integrity. For programs targeting the effects of anthropogenic activities, it is imperative to identify both the land use practices and the mechanisms that have led to degraded conditions (i.e., moving beyond simple status and trend estimation). Our empirical evaluation of the conceptual model underpinning the long-term monitoring program provided an opportunity for learning and, consequently, we discuss survey design elements that require modification to achieve question-driven monitoring, a necessary step in the practice of

  7. Identifying the Needs of Pre-Service Classroom Teachers about Science Teaching Methodology Course in Terms of Parlett's Illuminative Program Evaluation Model

    Science.gov (United States)

    Çaliskan, Ilke

    2014-01-01

    The aim of this study was to identify the needs of third-year classroom teaching students regarding the science teaching course in terms of Parlett's Illuminative program evaluation model. A phenomenographic research design was used in this study. The Illuminative program evaluation model was chosen for this study for its eclectic and process-based…

  8. Identifying the Needs of Pre-Service Classroom Teachers about Science Teaching Methodology Courses in Terms of Parlett's Illuminative Program Evaluation Model

    Science.gov (United States)

    Çaliskan, Ilke

    2014-01-01

    The aim of this study was to identify the needs of third-year classroom teaching students regarding the science teaching course in terms of Parlett's Illuminative program evaluation model. A phenomenographic research design was used in this study. The Illuminative program evaluation model was chosen for this study for its eclectic and process-based…

  9. Long-term BPA infusions. Evaluation in the rat brain tumor and rat spinal cord models

    International Nuclear Information System (INIS)

    Coderre, J.A.; Micca, P.L.; Nawrocky, M.M.; Joel, D.D.; Morris, G.M.

    2000-01-01

    In the BPA-based dose escalation clinical trial, the observations of tumor recurrence in areas of extremely high calculated tumor doses suggest that the BPA distribution is non-uniform. Longer (6-hour) i.v. infusions of BPA are evaluated in the rat brain tumor and spinal cord models to address the questions of whether long-term infusions are more effective against the tumor and whether they are detrimental in the central nervous system. In the rat spinal cord, the 50% effective doses (ED50) for myeloparesis were not significantly different after a single i.p. injection of BPA-fructose or a 6-hour i.v. infusion. In the rat 9L gliosarcoma brain tumor model, BNCT following 2-hr or 6-hr infusions of BPA-F produced similar levels of long-term survival. (author)

  10. Evaluation of long-term creep-fatigue life of stainless steel weldment based on a microstructure degradation model

    International Nuclear Information System (INIS)

    Asayama, Tai; Hasebe, Shinichi

    1997-01-01

    This paper describes a newly developed analytical method for evaluating the creep-fatigue strength of stainless steel weld metals. Based on the observation that creep-fatigue cracks initiate adjacent to the interface between sigma-phase/delta-ferrite and the matrix, a mechanistic model was developed that allows evaluation of the micro stress/strain concentration adjacent to the interface. Fatigue and creep damage were evaluated using a model that describes the microstructure after exposure to high temperatures for a long time. It is thus possible to analytically predict the long-term creep-fatigue life of stainless steel weld metals whose microstructure has degraded as a result of high-temperature service. (author)

  11. Modeling the Interest Rate Term Structure: Derivatives Contracts Dynamics and Evaluation

    Directory of Open Access Journals (Sweden)

    Pedro L. Valls Pereira

    2005-06-01

    This article deals with a model for the term structure of interest rates and the valuation of derivative contracts directly dependent on it. The work is theoretical in nature, deals exclusively with continuous-time models, makes ample use of stochastic calculus results, and presents original contributions that we consider relevant to the development of fixed-income market modeling. We develop a new multifactor model of the term structure of interest rates, based on the decomposition of the yield curve into the factors level, slope, and curvature, and on the treatment of their collective dynamics. We show that this model may be applied to various objectives: analysis of bond price dynamics, valuation of derivative contracts, and also market risk management and the formulation of operational strategies, which is presented in another article.
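
    The level/slope/curvature decomposition can be illustrated with the standard Nelson-Siegel yield curve form, shown here only as a minimal sketch of the idea; the article's own multifactor model and its dynamics differ, and the parameter values below are hypothetical.

```python
import numpy as np

def nelson_siegel(tau, level, slope, curvature, lam=0.6):
    """Yield for maturity tau (years) built from three interpretable factors.
    As tau grows the yield tends to `level`; `slope` dominates the short end;
    `curvature` bends the middle of the curve."""
    x = lam * tau
    loading = (1 - np.exp(-x)) / x          # slope factor loading
    return level + slope * loading + curvature * (loading - np.exp(-x))

maturities = np.array([0.25, 1.0, 2.0, 5.0, 10.0])
yields = nelson_siegel(maturities, level=0.06, slope=-0.02, curvature=0.01)
```

    With a negative slope factor the curve is upward sloping, rising from roughly level + slope at the short end toward level at long maturities.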

  12. Source term evaluation model for high-level radioactive waste repository with decay chain build-up.

    Science.gov (United States)

    Chopra, Manish; Sunny, Faby; Oza, R B

    2016-09-18

    A source term model based on a two-component leach flux concept is developed for a high-level radioactive waste repository. The long-lived radionuclides associated with high-level waste may give rise to a build-up of activity because of radioactive decay chains. The ingrowth of progeny is incorporated in the model using Bateman decay chain build-up equations. The model is applied to different radionuclides present in the high-level radioactive waste which form part of decay chains (4n to 4n + 3 series), and the activity of the parent and daughter radionuclides leaching out of the waste matrix is estimated. Two cases are considered: one in which only the parent is initially present in the waste, and another in which daughters are also initially present in the waste matrix. The incorporation of in situ production of daughter radionuclides in the source term is important for realistic estimates. It is shown that the inclusion of decay chain build-up is essential to avoid underestimation in the radiological impact assessment of the repository. The model can be a useful tool for evaluating the source term of the radionuclide transport models used for the radiological impact assessment of high-level radioactive waste repositories.
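
    The Bateman build-up the abstract refers to can be sketched for the simplest case of a two-member chain; longer chains add one exponential term per member. The decay constants below are hypothetical, not taken from any specific waste inventory.

```python
import numpy as np

def bateman_two_member(N1_0, lam1, lam2, t, N2_0=0.0):
    """Atoms of parent (N1) and daughter (N2) at time t for a two-member
    decay chain, from the Bateman equations (valid for lam1 != lam2)."""
    N1 = N1_0 * np.exp(-lam1 * t)
    N2 = (N2_0 * np.exp(-lam2 * t)
          + N1_0 * lam1 / (lam2 - lam1)
          * (np.exp(-lam1 * t) - np.exp(-lam2 * t)))
    return N1, N2

# Hypothetical half-lives (years): a long-lived parent, shorter-lived daughter
lam1 = np.log(2) / 2.0e6
lam2 = np.log(2) / 1.0e5
t = np.linspace(0.0, 1.0e7, 5)
N1, N2 = bateman_two_member(1e24, lam1, lam2, t)
```

    Multiplying N2 by lam2 gives the daughter activity that a parent-only source model would miss entirely.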

  13. METHODOLOGICAL PRINCIPLES AND METHODS OF TERMS OF TRADE STATISTICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    N. Kovtun

    2014-09-01

    The paper studies the methodological principles and guidance for the statistical evaluation of terms of trade under the United Nations classification model, the Harmonized Commodity Description and Coding System (HS). The proposed three-stage model of index analysis and estimation of terms of trade is applied to Ukraine's commodity groups for the period 2011-2012.

  14. Development of source term evaluation method for Korean Next Generation Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Keon Jae; Cheong, Jae Hak; Park, Jin Baek; Kim, Guk Gee [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-10-15

    In its earlier phase, this project investigated several design features of the radioactive waste processing system, methods to predict nuclide concentrations in the primary coolant, the basic concept of the next generation reactor, and its safety goals. In the present project, several prediction methods for the source term are evaluated together. The detailed contents of this project are: evaluation of models for nuclide concentrations in the Reactor Coolant System; evaluation of the primary and secondary coolant concentrations of a reference Nuclear Power Plant (NPP); investigation of the prediction parameters for source term evaluation (basic PWR parameters and operational parameters, respectively); the radionuclide removal system and adjustment values of the reference NPP; and a suggested source term prediction method for the next generation NPP.

  15. Development of source term evaluation method for Korean Next Generation Reactor(III)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Geon Jae; Park, Jin Baek; Lee, Yeong Il; Song, Min Cheonl; Lee, Ho Jin [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-06-15

    This project investigated the irradiation characteristics of MOX fuel, methods to predict nuclide concentrations in the primary and secondary coolant for a core containing 100% MOX fuel, and the development of a source term evaluation tool. In this study, several prediction methods for the source term are evaluated. The detailed contents of this project are: evaluation of models for nuclide concentrations in the Reactor Coolant System; evaluation of the primary and secondary coolant concentrations of a reference Nuclear Power Plant using purely MOX fuel; and a suggested source term prediction method for an NPP with a core using MOX fuel.

  16. Population Pharmacokinetics of Intravenous Paracetamol (Acetaminophen) in Preterm and Term Neonates: Model Development and External Evaluation.

    Science.gov (United States)

    Cook, Sarah F; Roberts, Jessica K; Samiee-Zafarghandy, Samira; Stockmann, Chris; King, Amber D; Deutsch, Nina; Williams, Elaine F; Allegaert, Karel; Wilkins, Diana G; Sherwin, Catherine M T; van den Anker, John N

    2016-01-01

    The aims of this study were to develop a population pharmacokinetic model for intravenous paracetamol in preterm and term neonates and to assess the generalizability of the model by testing its predictive performance in an external dataset. Nonlinear mixed-effects models were constructed from paracetamol concentration-time data in NONMEM 7.2. Potential covariates included body weight, gestational age, postnatal age, postmenstrual age, sex, race, total bilirubin, and estimated glomerular filtration rate. An external dataset was used to test the predictive performance of the model through calculation of bias, precision, and normalized prediction distribution errors. The model-building dataset included 260 observations from 35 neonates with a mean gestational age of 33.6 weeks [standard deviation (SD) 6.6]. Data were well-described by a one-compartment model with first-order elimination. Weight predicted paracetamol clearance and volume of distribution, which were estimated as 0.348 L/h (5.5 % relative standard error; 30.8 % coefficient of variation) and 2.46 L (3.5 % relative standard error; 14.3 % coefficient of variation), respectively, at the mean subject weight of 2.30 kg. An external evaluation was performed on an independent dataset that included 436 observations from 60 neonates with a mean gestational age of 35.6 weeks (SD 4.3). The median prediction error was 10.1 % [95 % confidence interval (CI) 6.1-14.3] and the median absolute prediction error was 25.3 % (95 % CI 23.1-28.1). Weight predicted intravenous paracetamol pharmacokinetics in neonates ranging from extreme preterm to full-term gestational status. External evaluation suggested that these findings should be generalizable to other similar patient populations.
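
    The reported structural model (one compartment, first-order elimination, clearance and volume predicted by weight) can be sketched as below. The population values are those quoted in the abstract at the mean weight of 2.30 kg; the allometric exponents (0.75 for clearance, 1.0 for volume) and the bolus-dose simplification are common assumptions, not taken from the paper.

```python
import numpy as np

# Population estimates from the abstract, at the mean subject weight
CL_REF, V_REF, WT_REF = 0.348, 2.46, 2.30   # L/h, L, kg

def concentration(t_h, dose_mg, weight_kg):
    """Plasma concentration (mg/L) after an IV bolus in a one-compartment
    model with first-order elimination. Allometric scaling exponents are
    assumed, not reported values."""
    cl = CL_REF * (weight_kg / WT_REF) ** 0.75   # clearance, L/h
    v = V_REF * (weight_kg / WT_REF)             # volume, L
    ke = cl / v                                  # elimination rate, 1/h
    return (dose_mg / v) * np.exp(-ke * t_h)

t = np.linspace(0.0, 12.0, 4)
c = concentration(t, dose_mg=15.0, weight_kg=2.30)
```

    At the reference weight the initial concentration is simply dose/V, and the curve decays with half-life ln(2)·V/CL (about 4.9 h with these values).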

  17. Evaluation of long term leaching of borosilicate glasses

    International Nuclear Information System (INIS)

    Lanza, F.; Parnisari, E.

    1978-01-01

    For the evaluation of the long-term hazard of glass, data on long-term glass leaching are needed. Moreover, for long-term leaching, a model of homogeneous dissolution seems reasonable but requires confirmation. Tests were performed at 30°, 80° and 100 °C, using an apparatus of the Soxhlet type, for up to 3,600 hours. Results were obtained as weight loss and analysed with a relation with time composed of a parabolic and a linear part. Analyses of the surface layer using energy-dispersive X-ray spectrometry were performed. A critical analysis of the results and of the apparatus is presented

  18. Evaluating long term forecasts

    Energy Technology Data Exchange (ETDEWEB)

    Lady, George M. [Department of Economics, College of Liberal Arts, Temple University, Philadelphia, PA 19122 (United States)

    2010-03-15

    The U.S. Department of Energy's Energy Information Administration (EIA), and its predecessor organizations, has published projections of U.S. energy production, consumption, distribution and prices annually for over 30 years. A natural issue to raise in evaluating the projections is an assessment of their accuracy compared to eventual outcomes. A related issue is the determination of the sources of 'error' in the projections that are due to differences between the assumed and the eventually realized values of the associated assumptions. One way to do this would be to run the computer-based model from which the projections are derived at the time the projected values are realized, using actual rather than assumed values for model assumptions, and compare these results to the original projections. For long term forecasts, this approach would require that the model's software and hardware configuration be archived and available for many years, possibly decades, into the future. Such archiving creates many practical problems and, in general, is not done. This paper reports on an alternative approach for evaluating the projections. In the alternative approach, the model is run many times for cases in which important assumptions are changed individually and in combinations. A database is assembled from the solutions and a regression analysis is conducted for each important projected variable, with the associated assumptions chosen as exogenous variables. When actual data are eventually available, the regression results are then used to estimate the sources of the differences between the projections of the endogenous variables and their eventual outcomes. The results presented here are for residential and commercial sector natural gas and electricity consumption. (author)
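
    The alternative approach can be sketched in miniature: perturb the assumptions across many model runs, regress the projected variable on the assumptions, then use the fitted coefficients to attribute the projection error to each assumption's miss. All numbers below are hypothetical stand-ins for model runs.

```python
import numpy as np

rng = np.random.default_rng(1)
n_runs = 50
# Each run varies three assumptions (e.g. GDP growth, fuel price, weather)
assumptions = rng.normal(size=(n_runs, 3))
true_beta = np.array([2.0, -1.0, 0.5])          # model's implicit sensitivities
projections = assumptions @ true_beta + rng.normal(0, 0.1, n_runs)

# Regress the projected variable on the assumptions
X = np.column_stack([np.ones(n_runs), assumptions])
beta_hat, *_ = np.linalg.lstsq(X, projections, rcond=None)

# Attribute the projection error to each assumption's miss
assumed = np.array([1.0, 0.5, -0.2])
actual = np.array([0.8, 0.9, -0.2])
contributions = beta_hat[1:] * (assumed - actual)
```

    An assumption that turned out exactly as assumed contributes nothing; the rest of the gap is split according to the fitted sensitivities.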

  19. Evaluation for the models of neutron diffusion theory in terms of power density distributions of the HTTR

    International Nuclear Information System (INIS)

    Takamatsu, Kuniyoshi; Shimakawa, Satoshi; Nojiri, Naoki; Fujimoto, Nozomu

    2003-10-01

    To evaluate the highest fuel temperature in the HTTR, it is very important to predict the power density distributions accurately; it is therefore necessary to improve the analytical model based on neutron diffusion and burn-up theory. The power density distributions are analyzed with two models, one mixing the fuels and the burnable poisons homogeneously and the other modeling them heterogeneously. These analytical power density distributions are then compared with those derived from gross gamma-ray measurements and from a continuous-energy Monte Carlo calculation code. As a result, the homogeneous mixed model is not sufficient to predict the power density distributions of the core in the axial direction; the heterogeneous model, on the other hand, improves the accuracy. (author)

  20. Dynamic term structure models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller; Meldrum, Andrew

    This paper studies whether dynamic term structure models for US nominal bond yields should enforce the zero lower bound by a quadratic policy rate or a shadow rate specification. We address the question by estimating quadratic term structure models (QTSMs) and shadow rate models with at most four...

  1. Long-Term Stability Evaluation and Pillar Design Criterion for Room-and-Pillar Mines

    Directory of Open Access Journals (Sweden)

    Yang Yu

    2017-10-01

    The collapse of abandoned room-and-pillar mines is often violent and unpredictable. Safety concerns have often resulted in mine closures with no post-mining stability evaluation. As a result, large amounts of land resources over room-and-pillar mines are wasted. This paper attempts to establish an understanding of the long-term stability issues of goafs (abandoned mines). Considering progressive pillar failures and the effect of a single pillar failure on surrounding pillars, this paper proposes a pillar peeling model to evaluate the long-term stability of coal mines, together with associated criteria for evaluating the long-term stability of room-and-pillar mines. The validity of the peeling model was verified by numerical simulation and by field data from 500 pillar cases from China, South Africa, and India. It is found that the damage level of pillar peeling is affected by the peel angle and pillar height and is controlled by the pillar width-height ratio.

  2. Empirically evaluating decision-analytic models.

    Science.gov (United States)

    Goldhaber-Fiebert, Jeremy D; Stout, Natasha K; Goldie, Sue J

    2010-08-01

    Model-based cost-effectiveness analyses support decision-making. To augment model credibility, evaluation via comparison to independent, empirical studies is recommended. We developed a structured reporting format for model evaluation and conducted a structured literature review to characterize current model evaluation recommendations and practices. As an illustration, we applied the reporting format to evaluate a microsimulation of human papillomavirus and cervical cancer. The model's outputs and uncertainty ranges were compared with multiple outcomes from a study of long-term progression from high-grade precancer (cervical intraepithelial neoplasia [CIN]) to cancer. Outcomes included 5 to 30-year cumulative cancer risk among women with and without appropriate CIN treatment. Consistency was measured by model ranges overlapping study confidence intervals. The structured reporting format included: matching baseline characteristics and follow-up, reporting model and study uncertainty, and stating metrics of consistency for model and study results. Structured searches yielded 2963 articles with 67 meeting inclusion criteria and found variation in how current model evaluations are reported. Evaluation of the cervical cancer microsimulation, reported using the proposed format, showed a modeled cumulative risk of invasive cancer for inadequately treated women of 39.6% (30.9-49.7) at 30 years, compared with the study: 37.5% (28.4-48.3). For appropriately treated women, modeled risks were 1.0% (0.7-1.3) at 30 years, study: 1.5% (0.4-3.3). To support external and projective validity, cost-effectiveness models should be iteratively evaluated as new studies become available, with reporting standardized to facilitate assessment. Such evaluations are particularly relevant for models used to conduct comparative effectiveness analyses.
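
    The consistency metric described (model uncertainty range overlapping the study confidence interval) reduces to a simple interval-overlap test, sketched below with the 30-year figures quoted in the abstract. The full evaluation in the paper involves matching baseline characteristics and follow-up as well; this is only the final comparison step.

```python
def consistent(model_range, study_ci):
    """True if the model's uncertainty range overlaps the study's CI."""
    (m_lo, m_hi), (s_lo, s_hi) = model_range, study_ci
    return m_lo <= s_hi and s_lo <= m_hi

# 30-year cumulative invasive cancer risk (%), model vs. study
inadequate = consistent((30.9, 49.7), (28.4, 48.3))   # inadequately treated
appropriate = consistent((0.7, 1.3), (0.4, 3.3))      # appropriately treated
```

    Both comparisons overlap, matching the abstract's conclusion that the microsimulation is consistent with the long-term progression study.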

  3. Model for low temperature oxidation during long term interim storage

    Energy Technology Data Exchange (ETDEWEB)

    Desgranges, Clara; Bertrand, Nathalie; Gauvain, Danielle; Terlain, Anne [Service de la Corrosion et du Comportement des Materiaux dans leur Environnement, CEA/Saclay - 91191 Gif-sur-Yvette Cedex (France); Poquillon, Dominique; Monceau, Daniel [CIRIMAT UMR 5085, ENSIACET-INPT, 31077 Toulouse Cedex 4 (France)

    2004-07-01

    For high-level nuclear waste containers in long-term interim storage, dry oxidation will be the first and the main degradation mode for about one century. The metal lost to dry oxidation over such a long period must be evaluated with good reliability. To achieve this goal, modelling of the oxide scale growth is necessary, and this is the aim of the dry oxidation studies performed within the framework of the COCON program. An advanced model based on the description of the elementary mechanisms involved in scale growth at low temperatures, such as partial interfacial control of the oxidation kinetics and/or grain boundary diffusion, is being developed in order to increase the reliability of long-term extrapolations compared with basic models built from short-duration experiments. Since only few experimental data on dry oxidation are available in the temperature range of interest, experiments have also been performed, with iron as a simplified material, to evaluate the relevant model input parameters such as the grain size of the oxide scale. (authors)

  4. Model for low temperature oxidation during long term interim storage

    International Nuclear Information System (INIS)

    Desgranges, Clara; Bertrand, Nathalie; Gauvain, Danielle; Terlain, Anne; Poquillon, Dominique; Monceau, Daniel

    2004-01-01

    For high-level nuclear waste containers in long-term interim storage, dry oxidation will be the first and the main degradation mode for about one century. The metal lost to dry oxidation over such a long period must be evaluated with good reliability. To achieve this goal, modelling of the oxide scale growth is necessary, and this is the aim of the dry oxidation studies performed within the framework of the COCON program. An advanced model based on the description of the elementary mechanisms involved in scale growth at low temperatures, such as partial interfacial control of the oxidation kinetics and/or grain boundary diffusion, is being developed in order to increase the reliability of long-term extrapolations compared with basic models built from short-duration experiments. Since only few experimental data on dry oxidation are available in the temperature range of interest, experiments have also been performed, with iron as a simplified material, to evaluate the relevant model input parameters such as the grain size of the oxide scale. (authors)

  5. A Neural Network Model of the Visual Short-Term Memory

    DEFF Research Database (Denmark)

    Petersen, Anders; Kyllingsbæk, Søren; Hansen, Lars Kai

    2009-01-01

    In this paper a neural network model of Visual Short-Term Memory (VSTM) is presented. The model links closely with Bundesen’s (1990) well-established mathematical theory of visual attention. We evaluate the model’s ability to fit experimental data from a classical whole and partial report study...

  6. Short-Term Wind Speed Prediction Using EEMD-LSSVM Model

    Directory of Open Access Journals (Sweden)

    Aiqing Kang

    2017-01-01

    Full Text Available A hybrid Ensemble Empirical Mode Decomposition (EEMD) and Least Square Support Vector Machine (LSSVM) model is proposed to improve short-term wind speed forecasting precision. The EEMD is first utilized to decompose the original wind speed time series into a set of subseries. Then LSSVM models are established to forecast these subseries. The partial autocorrelation function is adopted to analyze the inner relationships within the historical wind speed series in order to determine the input variables of the LSSVM model for the prediction of each subseries. Finally, the superposition principle is employed to sum the predicted values of every subseries into the final wind speed prediction. The performance of the hybrid model is evaluated with six metrics. Compared with LSSVM, Back Propagation Neural Network (BP), Auto-Regressive Integrated Moving Average (ARIMA), a combination of Empirical Mode Decomposition (EMD) with LSSVM, and hybrid EEMD with ARIMA models, the wind speed forecasting results show that the proposed hybrid model outperforms these models on all six metrics. Furthermore, scatter diagrams of predicted versus actual wind speed and histograms of prediction errors are presented to verify the superiority of the hybrid model in short-term wind speed prediction.
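The decompose-forecast-superpose pipeline described in this abstract can be sketched in a few lines. For brevity, this toy substitutes a two-band moving-average split for real EEMD and an AR(1) least-squares fit for the per-subseries LSSVM regressors, so every component here is an illustrative stand-in rather than the authors' method; the synthetic wind series is invented.

```python
import numpy as np

def decompose(series, window=8):
    """Schematic stand-in for EEMD: split the series into a smooth
    trend (moving average) and a fast residual. Real EEMD yields
    several intrinsic mode functions; two bands suffice to show the
    decompose-forecast-superpose structure."""
    kernel = np.ones(window) / window
    trend = np.convolve(series, kernel, mode="same")
    return [series - trend, trend]

def ar1_forecast(subseries):
    """One-step AR(1) forecast fitted by least squares, standing in
    for the per-subseries LSSVM regressors of the paper."""
    x, y = subseries[:-1], subseries[1:]
    phi = np.dot(x, y) / np.dot(x, x)
    return phi * subseries[-1]

def hybrid_forecast(series):
    # Superposition principle: the final prediction is the sum of
    # the forecasts of the individual subseries.
    return sum(ar1_forecast(s) for s in decompose(series))

rng = np.random.default_rng(0)
t = np.arange(200)
wind = 8 + 2 * np.sin(2 * np.pi * t / 24) + 0.3 * rng.standard_normal(200)
print(round(hybrid_forecast(wind), 2))
```

Swapping the stand-ins for a real EEMD implementation and trained LSSVM models leaves the surrounding control flow unchanged, which is the point of the superposition design.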

  7. Model documentation report: Short-Term Hydroelectric Generation Model

    International Nuclear Information System (INIS)

    1993-08-01

    The purpose of this report is to define the objectives of the Short-Term Hydroelectric Generation Model (STHGM), describe its basic approach, and provide details on the model structure. This report is intended as a reference document for model analysts, users, and the general public. Documentation of the model is in accordance with the Energy Information Administration's (EIA) legal obligation to provide adequate documentation in support of its models (Public Law 94-385, Section 57.b.2). The STHGM performs a short-term (18- to 27-month) forecast of hydroelectric generation in the United States using an autoregressive integrated moving average (ARIMA) time series model with precipitation as an explanatory variable. The model results are used as input for the Short-Term Energy Outlook.
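The core structure of such a model — generation regressed on its own past plus precipitation as an exogenous variable — can be sketched without a full ARIMA library. The following minimal numpy illustration fits y_t = a·y_{t-1} + b·x_t + c by ordinary least squares on synthetic data; all numbers are invented and this is a schematic stand-in, not the STHGM itself.

```python
import numpy as np

# Schematic stand-in for the STHGM's ARIMA-with-precipitation model:
# regress generation on its own lag plus precipitation (the exogenous
# variable), then forecast one step ahead. All data are synthetic.
rng = np.random.default_rng(1)
n = 120  # months
precip = 50 + 10 * rng.standard_normal(n)
gen = np.empty(n)
gen[0] = 100.0
for t in range(1, n):
    gen[t] = 0.6 * gen[t - 1] + 0.8 * precip[t] + rng.standard_normal()

# Fit y_t = a*y_{t-1} + b*x_t + c by ordinary least squares.
X = np.column_stack([gen[:-1], precip[1:], np.ones(n - 1)])
coef, *_ = np.linalg.lstsq(X, gen[1:], rcond=None)
a, b, c = coef
next_precip = 55.0  # assumed precipitation forecast for the next month
forecast = a * gen[-1] + b * next_precip + c
print(round(forecast, 1))
```

A production model would add the integrated and moving-average terms (the "I" and "MA" of ARIMA) and a precipitation forecast chain, but the exogenous-regressor mechanics are the same.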

  8. Evaluation of effects of long term exposure on lethal toxicity with mammals

    International Nuclear Information System (INIS)

    Verma, Vibha; Yu, Qiming J.; Connell, Des W.

    2014-01-01

    The relationship between lethal exposure time (LT50) and lethal exposure concentration (LC50) has been evaluated over relatively long exposure times using a novel parameter, Normal Life Expectancy (NLT), as a long-term toxicity point. The model equation, ln(LT50) = a·LC50^ν + b, where a, b and ν are constants, was evaluated by plotting ln(LT50) against LC50 using available inhalation-exposure toxicity data from 7 species of mammals. For each specific toxicant a single consistent relationship was observed across all mammals, with ν always <1. Use of NLT as a long-term toxicity point provided a valuable limiting point for long exposure times. With organic compounds, the Kow can be used to calculate the model constants a and ν where these are unknown. The model can be used to characterise toxicity to specific mammals and then be extended to estimate toxicity at any exposure time for other mammals. -- Highlights: • The model introduces a new parameter, normal life expectancy, to explain changes in toxicity with time. • The model is innovatory in that it can be used to calculate toxicity at any, particularly long, exposure times. • Toxicity is influenced by the normal life expectancy of the organism, particularly at longer exposure times. • The model was applicable to all the mammals (7 species) evaluated. • The model can be used to predict toxicity at different exposure times for untested mammal species. -- The RLE model provides a mathematical description of the change in toxicity over time for a particular chemical. This represents a major advance on the use of Haber's Rule in toxicology
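As a sketch of how the model equation ln(LT50) = a·LC50^ν + b could be fitted in practice, the following uses SciPy's curve_fit on synthetic (LC50, ln LT50) pairs. The data and the "true" constants are invented purely to illustrate the fitting step; they are not the paper's values.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_lt50(lc50, a, b, nu):
    """Model equation of the paper: ln(LT50) = a * LC50**nu + b."""
    return a * lc50**nu + b

# Synthetic data generated from assumed constants (a=-0.3, b=9.0,
# nu=0.5) with small noise, purely to demonstrate parameter recovery.
rng = np.random.default_rng(2)
lc50 = np.linspace(1.0, 400.0, 30)
ln_lt50 = log_lt50(lc50, -0.3, 9.0, 0.5) + 0.05 * rng.standard_normal(30)

popt, _ = curve_fit(log_lt50, lc50, ln_lt50, p0=(-0.5, 8.0, 0.6))
a, b, nu = popt
print(f"a={a:.2f}, b={b:.2f}, nu={nu:.2f}")
```

With the fitted constants, LT50 at any concentration follows as exp(a·LC50^ν + b), which is how the model extrapolates to long exposure times.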

  9. Study on the system development for evaluating long-term alteration of hydraulic field in near field

    International Nuclear Information System (INIS)

    Okutu, Kazuo; Morikawa, Seiji; Takamura, Hisashi

    2002-02-01

    For high-reliability performance evaluation of a TRU waste repository, a system is required for evaluating long-term alteration of the hydraulic field in the near field, taking into account changes in the barrier materials. In this research, the development of such a system was examined, covering the basic specifications both of the chemical/mechanical alteration analysis system that forms its core and of the system as a whole. The research results of this year are as follows. 1) A system was examined for evaluating the chemical changes that occur in the near field under the influence of liquid leaching from cement material. This year, the following were presented: a literature survey of the processes involved in chemical alteration and extraction of candidate processes; the uncertainties in models and data; preliminary modelling; creation of a simple analysis tool and sensitivity analysis; extraction of the processes to be considered in the system evaluation model, with the corresponding phenomenon analysis model and mathematical model; optimization of the software configuration for development of the system evaluation model; exercises with the preliminary system analysis model; and an experimental plan for model corroboration. 2) A system was examined for evaluating the mechanical behaviour of the repository, taking into account the changes in physical properties that accompany chemical alteration of the bentonite and cement materials. This year, the long-term mechanical behaviour of the repository was set out, the phenomena to be covered by the evaluation were extracted, mechanical models were investigated, a prototype of a mechanics model that can take the characteristics of bentonite material into consideration was presented, and the basic configuration of a mechanical behaviour analysis system was shown.

  10. A simplified approach to evaluating severe accident source term for PWR

    International Nuclear Information System (INIS)

    Huang, Gaofeng; Tong, Lili; Cao, Xuewu

    2014-01-01

    Highlights: • Traditional source term evaluation approaches have been studied. • A simplified approach of source term evaluation for 600 MW PWR is studied. • Five release categories are established. - Abstract: For early design of NPPs, no specific severe accident source term evaluation was considered. Some general source terms have been used for some NPPs. In order to implement a best estimate, a special source term evaluation should be implemented for an NPP. Traditional source term evaluation approaches (mechanism approach and parametric approach) have some difficulties associated with their implementation. The traditional approaches are not consistent with cost-benefit assessment. A simplified approach for evaluating severe accident source term for PWR is studied. For the simplified approach, a simplified containment event tree is established. According to representative cases selection, weighted coefficient evaluation, computation of representative source term cases and weighted computation, five containment release categories are established, including containment bypass, containment isolation failure, containment early failure, containment late failure and intact containment

  11. Integrated Assessment Model Evaluation

    Science.gov (United States)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside

  12. Long‐Term Post‐CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions

    Science.gov (United States)

    Carr, Brendan M.; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C.; Zhu, Wei

    2015-01-01

    Abstract Background/aim Clinical risk models are commonly used to predict short‐term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long‐term mortality. The added value of long‐term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long‐term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Methods Long‐term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c‐index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Results Mortality rates were 3%, 9%, and 17% at one‐, three‐, and five years, respectively (median follow‐up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long‐term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Conclusions Long‐term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long‐term mortality risk can be accurately assessed and subgroups of higher‐risk patients can be identified for enhanced follow‐up care. More research appears warranted to refine long‐term CABG clinical risk models. doi: 10.1111/jocs.12665 (J Card Surg 2016;31:23–30) PMID:26543019

  13. STACE: Source Term Analyses for Containment Evaluations of transport casks

    International Nuclear Information System (INIS)

    Seager, K.D.; Gianoulakis, S.E.; Barrett, P.R.; Rashid, Y.R.; Reardon, P.C.

    1992-01-01

    Following the guidance of ANSI N14.5, the STACE methodology provides a technically defensible means of estimating maximum permissible leakage rates. These containment criteria attempt to reflect the true radiological hazard by performing a detailed examination of the spent fuel, CRUD, and residual contamination contributions to the releasable source term. The evaluation of the spent fuel contribution to the source term has been modeled fairly accurately using the STACE methodology. The structural model predicts the cask drop load history, the mechanical response of the fuel assembly, and the probability of cladding breach. These data are then used to predict the amount of fission gas, volatile species, and fuel fines that are releasable from the cask. There are some areas where data are sparse or lacking (e.g., the quantity and size distribution of fuel rod breaches) for which experimental validation is planned. The CRUD spallation fraction is the major area where no quantitative data have been found; therefore, this also requires experimental validation. In the interim, STACE conservatively assumes a 100% spallation fraction for computing the releasable activity. The source term methodology also conservatively assumes that there is 1 Ci of residual contamination available for release in the transport cask. However, residual contamination is still by far the smallest contributor to the source term activity

  14. A phenomenological memristor model for short-term/long-term memory

    International Nuclear Information System (INIS)

    Chen, Ling; Li, Chuandong; Huang, Tingwen; Ahmad, Hafiz Gulfam; Chen, Yiran

    2014-01-01

    Memristors are considered natural electrical synapses because of their distinct memory property and nanoscale size. In recent years, more and more similar behaviors have been observed between memristors and biological synapses, e.g., short-term memory (STM) and long-term memory (LTM). Traditional mathematical models are unable to capture these newly emerging behaviors. In this article, an updated phenomenological model based on the Hewlett-Packard (HP) Labs model is proposed to capture such behaviors. The new dynamical memristor model, with an improved ion diffusion term, can emulate synapse behavior with a forgetting effect and exhibit the transformation between STM and LTM. Further, this model can be used to build new types of neural networks with forgetting ability like biological systems, which is verified by our experiment with a Hopfield neural network. - Highlights: • We take Fick diffusion and Soret diffusion into account in the ion drift theory. • We develop a new model based on the old HP model. • The new model can describe the forgetting effect and the spike-rate-dependent property of memristors. • The new model can solve the boundary effect of all the window functions discussed in [13]. • A new Hopfield neural network with forgetting ability is built with the new memristor model
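The abstract does not give the model equations, but the qualitative behavior it describes — a state variable driven up by stimulation and relaxed back by a diffusion-like term — can be illustrated with a generic toy simulation. Everything below (rate constants, time scales, the drift form k·v·w(1−w)) is an invented illustration of the forgetting effect, not the authors' model.

```python
import numpy as np

# Generic illustration (not the paper's equations): a memristor state
# variable w in [0, 1] is driven toward 1 by voltage pulses and relaxed
# back toward a baseline w0 by a diffusion-like "forgetting" term.
def simulate(pulses, dt=1e-3, tau=0.05, k=40.0, w0=0.1):
    w = w0
    trace = []
    for v in pulses:
        dw = k * v * w * (1 - w) - (w - w0) / tau  # drift + forgetting
        w = np.clip(w + dt * dw, 0.0, 1.0)
        trace.append(w)
    return np.array(trace)

# 50 ms of stimulation followed by 150 ms of rest: the state rises
# (memorization) and then decays back toward w0 (STM-like forgetting).
pulses = np.concatenate([np.full(50, 1.0), np.zeros(150)])
trace = simulate(pulses)
print(round(trace[49], 3), round(trace[-1], 3))
```

Repeating the stimulation before the state fully decays would push w progressively higher, which is the STM-to-LTM transition the abstract describes.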

  15. The establishment of a method for evaluating the long-term water-tightness durability of underground concrete structure taking into account some deteriorations

    International Nuclear Information System (INIS)

    Hironaga, Michihiko; Kawanishi, Motoi

    1996-01-01

    To establish a method for evaluating the long-term water-tightness durability of underground concrete structures, the authors first studied a deterioration evaluation model to express the deterioration condition of concrete structures and, on the basis of this model, constructed a function evaluation model to estimate the loss of function due to deterioration, thereby arriving at a 'concept for evaluating the deterioration and functions of concrete structures' which makes the functional evaluation of concrete structures possible. Based on this concept, the authors then discuss a technique for evaluating the long-term water-tightness durability of underground concrete structures, illustrating the technique with specific examples. (author)

  16. A viable D-term hybrid inflation model

    Science.gov (United States)

    Kadota, Kenji; Kobayashi, Tatsuo; Sumita, Keigo

    2017-11-01

    We propose a new model of the D-term hybrid inflation in the framework of supergravity. Although our model introduces, analogously to the conventional D-term inflation, the inflaton and a pair of scalar fields charged under a U(1) gauge symmetry, we study the logarithmic and exponential dependence on the inflaton field, respectively, for the Kähler and superpotential. This results in a characteristic one-loop scalar potential consisting of linear and exponential terms, which realizes the small-field inflation dominated by the Fayet-Iliopoulos term. With the reasonable values for the coupling coefficients and, in particular, with the U(1) gauge coupling constant comparable to that of the Standard Model, our D-term inflation model can solve the notorious problems in the conventional D-term inflation, namely, the CMB constraints on the spectral index and the generation of cosmic strings.

  17. Evaluation-Function-based Model-free Adaptive Fuzzy Control

    Directory of Open Access Journals (Sweden)

    Agus Naba

    2016-12-01

    Full Text Available Designs of adaptive fuzzy controllers (AFC) are commonly based on the Lyapunov approach, which requires a known model of the controlled plant. They need to consider a Lyapunov function candidate as an evaluation function to be minimized. In this study these drawbacks were handled by designing a model-free adaptive fuzzy controller (MFAFC) using an approximate evaluation function defined in terms of the current state, the next state, and the control action. The MFAFC considers the approximate evaluation function as an evaluative control performance measure, similar to the state-action value function in reinforcement learning. Simulation results from applying the MFAFC to the inverted pendulum benchmark verified the proposed scheme's efficacy.

  18. Characteristics of geothermal structures of Poprad basin in terms of numerical modeling

    International Nuclear Information System (INIS)

    Bagelova, A.; Fendek, M.

    2011-01-01

    The Poprad basin is one of the promising areas in terms of geothermal resources. With regard to the environmental impact and the exploitation of geothermal waters, it is important to quantify the natural geothermal water quantity, and one of the most progressive methods for its evaluation is numerical modelling. Before a model can be created, the geothermal structure must be characterized. Characterization of a hydro-geothermal structure consists of an analysis of the spatial distribution of the collectors, the hydraulic properties of the geothermal water collectors, the pressure and temperature conditions, and the boundary conditions. The basic characteristics of geothermal energy transfer in the Poprad basin are described. (authors)

  19. Short-term and long-term earthquake occurrence models for Italy: ETES, ERS and LTST

    Directory of Open Access Journals (Sweden)

    Maura Murru

    2010-11-01

    Full Text Available This study describes three earthquake occurrence models as applied to the whole Italian territory, to assess the occurrence probabilities of future (M ≥ 5.0) earthquakes: two short-term (24-hour) models, and one long-term (5- and 10-year) model. The first model, for short-term forecasts, is a purely stochastic epidemic-type earthquake sequence (ETES) model. The second short-term model is an epidemic rate-state (ERS) forecast based on a model that is physically constrained by applying the Dieterich rate-state constitutive law to earthquake clustering. The third forecast is based on a long-term stress transfer (LTST) model that considers the perturbations of earthquake probability for interacting faults caused by static Coulomb stress changes. These models have been submitted to the Collaboratory for the Study of Earthquake Predictability (CSEP) for forecast testing for Italy (ETH Zurich), and they were locked down to test their validity on real data in a future setting starting from August 1, 2009.

  20. Evaluation of the long-term energy analysis program used for the 1978 EIA Administrator's Report to Congress

    Energy Technology Data Exchange (ETDEWEB)

    Peelle, R. W.; Weisbin, C. R.; Alsmiller, Jr., R. G.

    1981-10-01

    An evaluation of the Long-Term Energy Analysis Program (LEAP), a computer model of the energy portion of the US economy that was used for the 1995-2020 projections in its 1978 Annual Report to Congress, is presented. An overview of the 1978 version, LEAP Model 22C, is followed by an analysis of the important results needed by its users. The model is then evaluated on the basis of: (1) the adequacy of its documentation; (2) the local experience in operating the model; (3) the adequacy of the numerical techniques used; (4) the soundness of the economic and technical foundations of the model equations; and (5) the degree to which the computer program has been verified. To show which parameters strongly influence the results and to approach the question of whether the model can project important results with sufficient accuracy to support qualitative conclusions, the numerical sensitivities of some important results to model input parameters are described. The input data are categorized and discussed, and uncertainties are given for some parameters as examples. From this background and from the relation of LEAP to other available approaches for long-term energy modeling, an overall evaluation is given of the model's suitability for use by the EIA.

  1. A Linguistic Multigranular Sensory Evaluation Model for Olive Oil

    Directory of Open Access Journals (Sweden)

    Luis Martinez

    2008-06-01

    Full Text Available Evaluation is a process that analyzes elements in order to achieve different objectives such as quality inspection, marketing and other aims in industrial companies. This paper focuses on sensory evaluation, where the evaluated items are assessed by a panel of experts according to knowledge acquired via the human senses. In these evaluation processes the information provided by the experts involves uncertainty, vagueness and imprecision. The use of the Fuzzy Linguistic Approach [32] has provided successful results in modelling such information. In sensory evaluation it may happen that the experts on the panel have differing degrees of knowledge about the evaluated items or indicators, so it seems suitable that each expert could express their preferences in different linguistic term sets based on their own knowledge. In this paper, we present a sensory evaluation model that manages a multigranular linguistic evaluation framework based on a decision analysis scheme. This model is applied to the sensory evaluation process of olive oil.

  2. Study on the system development for evaluating long-term alteration of hydraulic field in Near Field. 3

    International Nuclear Information System (INIS)

    Okutu, Kazuo; Morikawa, Seiji; Taguchi, Katsunori

    2004-02-01

    For high-reliability performance evaluation of a TRU waste repository, a system is required for evaluating long-term alteration of the hydraulic field in the near field, taking into account changes in the barrier materials. In this research, the development of such a system was examined. An 'Evidential Support Logic' for ensuring the long-term stability of the repository was developed and evaluated. Furthermore, the previously developed chemical/mechanical alteration analysis system was verified and improved, and its components were coupled for long-term alteration evaluation analysis. The research results of this year are as follows. 1) A logic tree was constructed to support the high-reliability performance evaluation of a TRU waste repository. The thesis that the long-term safety of the TRU waste repository is preserved was ramified into subsidiary theses until all the final theses were supported by objective evidence. The probability of each subsidiary thesis supporting the thesis above it was established by interviewing specialists, and the reliability of each thesis was evaluated by applying present knowledge. Furthermore, the sensitivity of the reliability of the highest thesis to increasing reliability of the evidence was investigated, and appropriate targets for experiment and analysis were presented based on this sensitivity. 2) The object of the hydraulic-chemical analysis was determined from the above-mentioned logic tree. The analysis system was improved to perform 2D analysis, and a user interface was developed to simplify the setting of analysis conditions. The system was demonstrated by comparing its results with experimental results. Furthermore, the system was applied to the near-field problem to establish the conditions under which the safety of the TRU waste repository is preserved. 3) Both the model of bentonite material and the model of cement material were

  3. Educational Program Evaluation Model, From the Perspective of the New Theories

    Directory of Open Access Journals (Sweden)

    Soleiman Ahmady

    2014-05-01

    Full Text Available Introduction: This study focuses on the common theories that have influenced the history of program evaluation and introduces an educational program evaluation proposal format based on updated theory. Methods: Literature searches were carried out in March-December 2010 with a combination of key words, MeSH terms and other free-text terms as suitable for the purpose. A comprehensive search strategy was developed to search Medline via the PubMed interface, ERIC (Education Resources Information Center) and the main medical education journals for current evaluation models and theories. We included all study designs. We found 810 articles related to our topic and finally included 63 with full text. We compared the documents and used expert consensus to select the best model. Results: We found that complexity theory using the logic model suggests compatible evaluation proposal formats, especially for new medical education programs. The common components of a logic model are situation, inputs, outputs, and outcomes, on which our proposal format is based. Its contents are: title page, cover letter, situation and background, introduction and rationale, project description, evaluation design, evaluation methodology, reporting, program evaluation management, timeline, evaluation budget based on the best evidence, and supporting documents. Conclusion: We found that the logic model is used for evaluation program planning in many places, but more research is needed to see if it is suitable for our context.

  4. Source-term model for the SYVAC3-NSURE performance assessment code

    International Nuclear Information System (INIS)

    Rowat, J.H.; Rattan, D.S.; Dolinar, G.M.

    1996-11-01

    Radionuclide contaminants in wastes emplaced in disposal facilities will not remain in those facilities indefinitely. Engineered barriers will eventually degrade, allowing radioactivity to escape from the vault. The radionuclide release rate from a low-level radioactive waste (LLRW) disposal facility, the source term, is a key component in the performance assessment of the disposal system. This report describes the source-term model that has been implemented in Ver. 1.03 of the SYVAC3-NSURE (Systems Variability Analysis Code generation 3-Near Surface Repository) code. NSURE is a performance assessment code that evaluates the impact of near-surface disposal of LLRW through the groundwater pathway. The source-term model described here was developed for the Intrusion Resistant Underground Structure (IRUS) disposal facility, which is a vault that is to be located in the unsaturated overburden at AECL's Chalk River Laboratories. The processes included in the vault model are roof and waste package performance, and diffusion, advection and sorption of radionuclides in the vault backfill. The model presented here was developed for the IRUS vault; however, it is applicable to other near-surface disposal facilities. (author). 40 refs., 6 figs
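The abstract describes barrier degradation followed by radionuclide release but does not give the NSURE equations. A generic one-box sketch of the idea — no release while the engineered barrier is intact, then first-order leaching of a decaying inventory — looks as follows; the rate constants, half-life, and failure time are all invented for illustration.

```python
import math

# Generic one-box source-term sketch (not the NSURE implementation):
# after the engineered barrier fails at t_fail, the inventory N decays
# at rate lam and leaches at rate k, so the release rate is k * N(t).
def release_rate(t, n0=1.0e12, lam=math.log(2) / 30.0, k=1e-3, t_fail=100.0):
    """Release rate (atoms/yr) at time t (yr); 30-yr half-life assumed."""
    if t < t_fail:
        return 0.0  # intact barrier: no release
    n = n0 * math.exp(-lam * t_fail)            # decay before failure
    n *= math.exp(-(lam + k) * (t - t_fail))    # decay + leaching after
    return k * n

print(f"{release_rate(150.0):.3e}")
```

A real vault model such as the one described would replace the single leach rate with coupled diffusion, advection, and sorption terms in the backfill, but the decaying-inventory bookkeeping is the same.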

  5. Modelling Emotional and Attitudinal Evaluations of Major Sponsors

    DEFF Research Database (Denmark)

    Martensen, Anne; Hansen, Flemming

    2004-01-01

    The paper reports findings from a larger study of sponsors and their relationship to sponsored parties. In the present reporting, the focus is on sponsors. Rather than evaluating such sponsorships in traditional effect-hierarchical terms, a conceptual Sponsor Value Model is specified as a structural...

  6. Development of a financing model for nuclear fuel cycle cost evaluation

    International Nuclear Information System (INIS)

    Takahashi, Makoto; Yajima, Masayuki

    1984-01-01

    It is necessary to evaluate the prices of nuclear fuel pre- and post-processing in order to analyse the costs of nuclear power generation. These prices are directly related to the costs of constructing and operating the facilities in the nuclear fuel cycle. In this report, we propose a model that evaluates the financing of an undertaking that constructs and operates one of these facilities, such as uranium enrichment, reprocessing or interim storage of spent fuel. The model is divided into two phases, a construction phase and an operation phase. In the construction phase, it calculates the expenses incurred during facility construction and the corresponding financing for each term. In the operation phase, the model refers to the results of the construction phase and performs term-by-term calculations of profits and losses, cash flow, and the disposition of profits according to a given operation schedule. Using this model, the feasibility of the undertaking and the effects of various pricing strategies on nuclear fuel costs can be evaluated by simulation. (author)
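The two-phase structure described above — accumulate financed construction costs, then recover them through service prices during operation — reduces, in its simplest form, to a levelized-cost calculation. The following is a hypothetical sketch of that reduction, not the authors' model; the interest rate, build schedule, throughput, and operating cost are all invented.

```python
# Hypothetical two-phase cash-flow sketch: capital spent during a
# construction phase is recovered, with interest, by a constant
# service price charged over the operation phase.
def breakeven_price(capex_per_year, build_years, throughput,
                    operate_years, rate):
    # Construction phase: roll the yearly outlays forward with
    # interest to the start of operation (the accumulated debt).
    debt = 0.0
    for _ in range(build_years):
        debt = debt * (1 + rate) + capex_per_year
    # Operation phase: annuity factor converting the debt into equal
    # annual payments over the operating life.
    annuity = rate / (1 - (1 + rate) ** -operate_years)
    opex_per_unit = 50.0  # assumed operating cost per unit of service
    return debt * annuity / throughput + opex_per_unit

# e.g. 5-year build at 400 M/yr, 1000 units/yr for 20 years, 5% interest
price = breakeven_price(400e6, 5, 1000.0, 20, 0.05)
print(round(price, 1))
```

The full model in the report additionally tracks term-by-term profit and loss and the disposition of profits; the sketch shows only the price level at which the cash flows balance.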

  7. Performance Evaluation Model for Application Layer Firewalls.

    Science.gov (United States)

    Xuan, Shichang; Yang, Wu; Dong, Hui; Zhang, Jiangchuan

    2016-01-01

    Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.
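The abstract does not give the queueing formulas, but for an M/M/c service station the classic Erlang C results yield exactly the kind of throughput and delay indicators described. A minimal sketch, with invented arrival and service rates standing in for one layer of the firewall model:

```python
from math import factorial

# Erlang C metrics for an M/M/c queue, the kind of building block an
# Erlang-based firewall performance model would use per layer.
def erlang_c(lam, mu, c):
    """Return (P_wait, mean queueing delay Wq) for an M/M/c queue."""
    a = lam / mu                      # offered load in Erlangs
    assert a < c, "queue must be stable (lambda < c*mu)"
    tail = (a**c / factorial(c)) * (c / (c - a))
    base = sum(a**k / factorial(k) for k in range(c)) + tail
    p_wait = tail / base              # probability a packet must queue
    wq = p_wait / (c * mu - lam)      # mean wait before service
    return p_wait, wq

# Invented rates: 80 pkt/s arriving at 4 service desks of 30 pkt/s each.
p, wq = erlang_c(lam=80.0, mu=30.0, c=4)
print(round(p, 3), round(wq * 1000, 2), "ms")
```

Sweeping `c` (the service desk resources) under a fixed resource budget and comparing the resulting delay and loss figures is the kind of allocation search the paper describes.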

  8. Performance Evaluation Model for Application Layer Firewalls.

    Directory of Open Access Journals (Sweden)

    Shichang Xuan

    Full Text Available Application layer firewalls protect the trusted area network against information security risks. However, firewall performance may affect user experience. Therefore, performance analysis plays a significant role in the evaluation of application layer firewalls. This paper presents an analytic model of the application layer firewall, based on a system analysis to evaluate the capability of the firewall. In order to enable users to improve the performance of the application layer firewall with limited resources, resource allocation was evaluated to obtain the optimal resource allocation scheme in terms of throughput, delay, and packet loss rate. The proposed model employs the Erlangian queuing model to analyze the performance parameters of the system with regard to the three layers (network, transport, and application layers). Then, the analysis results of all the layers are combined to obtain the overall system performance indicators. A discrete event simulation method was used to evaluate the proposed model. Finally, limited service desk resources were allocated to obtain the values of the performance indicators under different resource allocation scenarios in order to determine the optimal allocation scheme. Under limited resource allocation, this scheme enables users to maximize the performance of the application layer firewall.

  9. Model description for calculating the source term of the Angra 1 environmental control system

    International Nuclear Information System (INIS)

    Oliveira, L.F.S. de; Amaral Neto, J.D.; Salles, M.R.

    1988-01-01

This work presents the model used to evaluate the source term released from the Angra 1 Nuclear Power Plant in the event of an accident. The model is then applied to the case of a Fuel Assembly Drop Accident inside the Fuel Handling Building during reactor refueling. (author) [pt

  10. Baby Skyrme models without a potential term

    Science.gov (United States)

    Ashcroft, Jennifer; Haberichter, Mareike; Krusch, Steffen

    2015-05-01

    We develop a one-parameter family of static baby Skyrme models that do not require a potential term to admit topological solitons. This is a novel property as the standard baby Skyrme model must contain a potential term in order to have stable soliton solutions, though the Skyrme model does not require this. Our new models satisfy an energy bound that is linear in terms of the topological charge and can be saturated in an extreme limit. They also satisfy a virial theorem that is shared by the Skyrme model. We calculate the solitons of our new models numerically and observe that their form depends significantly on the choice of parameter. In one extreme, we find compactons while at the other there is a scale invariant model in which solitons can be obtained exactly as solutions to a Bogomolny equation. We provide an initial investigation into these solitons and compare them with the baby Skyrmions of other models.

  11. Long-Term Evaluation of Ocean Tidal Variation Models of Polar Motion and UT1

    Science.gov (United States)

    Karbon, Maria; Balidakis, Kyriakos; Belda, Santiago; Nilsson, Tobias; Hagedoorn, Jan; Schuh, Harald

    2018-04-01

Recent improvements in the development of VLBI (very long baseline interferometry) and other space geodetic techniques such as the global navigation satellite systems (GNSS) require very precise a priori information on short-period (daily and sub-daily) Earth rotation variations. One significant contribution to Earth rotation is caused by the diurnal and semi-diurnal ocean tides. Within this work, we developed a new model for the short-period ocean tidal variations in Earth rotation, in which the ocean tidal angular momentum model and the Earth rotation variation have been set up jointly. Besides the model of the short-period variation of the Earth's rotation parameters (ERP), based on the empirical ocean tide model EOT11a, we also developed ERP models based on the hydrodynamic ocean tide models FES2012 and HAMTIDE. Furthermore, we have assessed the effect of uncertainties in the elastic Earth model on the resulting ERP models. Our proposed alternative to the IERS 2010 conventional ERP model considers the elastic model PREM and 260 partial tides. The choice of the ocean tide model and the determination of the tidal velocities have been identified as the main uncertainties. However, in the VLBI analysis all models perform at the same level of accuracy. From these findings, we conclude that the models presented here, which are based on a re-examined theoretical description and long-term satellite altimetry observations only, are an alternative to the IERS conventional model but do not improve the geodetic results.

  12. Evaluation models and evaluation use

    Science.gov (United States)

    Contandriopoulos, Damien; Brousselle, Astrid

    2012-01-01

    The use of evaluation results is at the core of evaluation theory and practice. Major debates in the field have emphasized the importance of both the evaluator’s role and the evaluation process itself in fostering evaluation use. A recent systematic review of interventions aimed at influencing policy-making or organizational behavior through knowledge exchange offers a new perspective on evaluation use. We propose here a framework for better understanding the embedded relations between evaluation context, choice of an evaluation model and use of results. The article argues that the evaluation context presents conditions that affect both the appropriateness of the evaluation model implemented and the use of results. PMID:23526460

  13. Evaluation of long term radiological impact on population close to remediated uranium mill tailings storages

    International Nuclear Information System (INIS)

    Kerouanton, David; Delgove, Laure

    2008-01-01

A methodology is elaborated to evaluate the long-term radiological impact of remediated uranium mill tailings storages. Different scenarios are chosen and modelled to cover the future evolution of the tailings storages. The radiological impact is evaluated for different population groups, such as adults and children living in the immediate vicinity of or directly on the storage, and road workers or walkers on the storage. Equations and methods are detailed. (author)

  14. Evaluating short-term hydro-meteorological fluxes using GRACE-derived water storage changes

    Science.gov (United States)

    Eicker, A.; Jensen, L.; Springer, A.; Kusche, J.

    2017-12-01

    Atmospheric and terrestrial water budgets, which represent important boundary conditions for both climate modeling and hydrological studies, are linked by evapotranspiration (E) and precipitation (P). These fields are provided by numerical weather prediction models and atmospheric reanalyses such as ERA-Interim and MERRA-Land; yet, in particular the quality of E is still not well evaluated. Via the terrestrial water budget equation, water storage changes derived from products of the Gravity Recovery and Climate Experiment (GRACE) mission, combined with runoff (R) data can be used to assess the realism of atmospheric models. In this contribution we will investigate the closure of the water balance for short-term fluxes, i.e. the agreement of GRACE water storage changes with P-E-R flux time series from different (global and regional) atmospheric reanalyses, land surface models, as well as observation-based data sets. Missing river runoff observations will be extrapolated using the calibrated rainfall-runoff model GR2M. We will perform a global analysis and will additionally focus on selected river basins in West Africa. The investigations will be carried out for various temporal scales, focusing on short-term fluxes down to daily variations to be detected in daily GRACE time series.
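The closure check described in this record reduces to comparing GRACE-derived storage changes against the flux P - E - R from the terrestrial water budget dS/dt = P - E - R. A minimal sketch with synthetic monthly numbers (not GRACE data):

```python
import numpy as np

def budget_closure(P, E, R, S):
    """Compare month-to-month storage change dS with the flux P - E - R.

    P, E, R: monthly fluxes (e.g. mm/month); S: storage at month boundaries
    (one element longer than the flux series). If the terrestrial water
    budget closes, dS[i] = S[i+1] - S[i] should match flux[i].
    """
    P, E, R, S = (np.asarray(x, float) for x in (P, E, R, S))
    flux = P - E - R
    dS = np.diff(S)
    return dS, flux

# synthetic illustration: a storage series built by accumulating the flux
P = np.array([80.0, 60.0, 40.0, 90.0])
E = np.array([30.0, 35.0, 25.0, 20.0])
R = np.array([20.0, 15.0, 10.0, 25.0])
S = np.concatenate([[0.0], np.cumsum(P - E - R)])
```

With real data the two series differ, and the size and timing of the residual dS - (P - E - R) is what discriminates between the reanalyses and land surface models being evaluated.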

  15. Evaluation of Term Ranking Algorithms for Pseudo-Relevance Feedback in MEDLINE Retrieval.

    Science.gov (United States)

    Yoo, Sooyoung; Choi, Jinwook

    2011-06-01

The purpose of this study was to investigate the effects of query expansion algorithms for MEDLINE retrieval within a pseudo-relevance feedback framework. A number of query expansion algorithms were tested using various term ranking formulas, focusing on query expansion based on pseudo-relevance feedback. The OHSUMED test collection, which is a subset of the MEDLINE database, was used as a test corpus. Various ranking algorithms were tested in combination with different term re-weighting algorithms. Our comprehensive evaluation showed that the local context analysis ranking algorithm, when used in combination with one of the re-weighting algorithms - Rocchio, the probabilistic model, and our variants - significantly outperformed other algorithm combinations by up to 12% (paired t-test; p < 0.05), at least in the context of the OHSUMED corpus. Comparative experiments on term ranking algorithms were performed in the context of a subset of MEDLINE documents. With medical documents, local context analysis, which uses co-occurrence with all query terms, significantly outperformed various term ranking methods based on both frequency and distribution analyses. Furthermore, the results of the experiments demonstrated that the term rank-based re-weighting method contributed to a remarkable improvement in mean average precision.
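The Rocchio re-weighting mentioned in this record can be sketched in a generic vector-space form. This is a minimal illustration of pseudo-relevance feedback, not the paper's exact variant; the vectors and weights below are hypothetical:

```python
import numpy as np

def rocchio_expand(query_vec, feedback_docs, alpha=1.0, beta=0.75):
    """Rocchio-style pseudo-relevance feedback: shift the query vector
    toward the centroid of the top-ranked (assumed relevant) documents.
    The negative (non-relevant) term is omitted, as is common in PRF."""
    centroid = np.asarray(feedback_docs, float).mean(axis=0)
    return alpha * np.asarray(query_vec, float) + beta * centroid
```

In a PRF pipeline the `feedback_docs` are simply the top-k documents from an initial retrieval run, and the expanded vector is used for a second, final run.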

  16. Evaluating Extensions to Coherent Mortality Forecasting Models

    Directory of Open Access Journals (Sweden)

    Syazreen Shair

    2017-03-01

    Full Text Available Coherent models were developed recently to forecast the mortality of two or more sub-populations simultaneously and to ensure long-term non-divergent mortality forecasts of sub-populations. This paper evaluates the forecast accuracy of two recently-published coherent mortality models, the Poisson common factor and the product-ratio functional models. These models are compared to each other and the corresponding independent models, as well as the original Lee–Carter model. All models are applied to age-gender-specific mortality data for Australia and Malaysia and age-gender-ethnicity-specific data for Malaysia. The out-of-sample forecast error of log death rates, male-to-female death rate ratios and life expectancy at birth from each model are compared and examined across groups. The results show that, in terms of overall accuracy, the forecasts of both coherent models are consistently more accurate than those of the independent models for Australia and for Malaysia, but the relative performance differs by forecast horizon. Although the product-ratio functional model outperforms the Poisson common factor model for Australia, the Poisson common factor is more accurate for Malaysia. For the ethnic groups application, ethnic-coherence gives better results than gender-coherence. The results provide evidence that coherent models are preferable to independent models for forecasting sub-populations’ mortality.
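The Lee-Carter model that serves as the baseline in this comparison fits log m(x,t) ≈ a_x + b_x·k_t by a rank-1 SVD. A minimal generic sketch (not the paper's code), using the usual identifiability constraints sum(b) = 1 and sum(k) = 0:

```python
import numpy as np

def lee_carter_fit(log_m):
    """Rank-1 Lee-Carter fit of a (ages x years) matrix of log death rates:
    log m(x,t) ~ a_x + b_x * k_t, with sum(b) = 1 and sum(k) = 0."""
    log_m = np.asarray(log_m, float)
    a = log_m.mean(axis=1)                      # age pattern a_x
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b, k = U[:, 0], s[0] * Vt[0]                # leading singular pair
    scale = b.sum()                             # impose sum(b) = 1
    b, k = b / scale, k * scale                 # (also fixes the SVD sign)
    a, k = a + b * k.mean(), k - k.mean()       # impose sum(k) = 0
    return a, b, k
```

Forecasting then reduces to extrapolating the time index k_t (typically as a random walk with drift); coherent variants constrain the k_t of sub-populations so the forecasts do not diverge.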

  17. Evaluation of burst pressure prediction models for line pipes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Xian-Kui, E-mail: zhux@battelle.org [Battelle Memorial Institute, 505 King Avenue, Columbus, OH 43201 (United States); Leis, Brian N. [Battelle Memorial Institute, 505 King Avenue, Columbus, OH 43201 (United States)

    2012-01-15

Accurate prediction of burst pressure plays a central role in engineering design and integrity assessment of oil and gas pipelines. Theoretical and empirical solutions for such prediction are evaluated in this paper relative to a burst pressure database comprising more than 100 tests covering a variety of pipeline steel grades and pipe sizes. Solutions considered include three based on plasticity theory for the end-capped, thin-walled, defect-free line pipe subjected to internal pressure in terms of the Tresca, von Mises, and ZL (or Zhu-Leis) criteria, one based on a cylindrical instability stress (CIS) concept, and a large group of analytical and empirical models previously evaluated by Law and Bowie (International Journal of Pressure Vessels and Piping, 84, 2007: 487-492). It is found that these models can be categorized into either a Tresca-family or a von Mises-family of solutions, except for those due to Margetson and Zhu-Leis models. The viability of predictions is measured via statistical analyses in terms of a mean error and its standard deviation. Consistent with an independent parallel evaluation using another large database, the Zhu-Leis solution is found best for predicting burst pressure, including consideration of strain hardening effects, while the Tresca strength solutions including Barlow, Maximum shear stress, Turner, and the ASME boiler code provide reasonably good predictions for the class of line-pipe steels with intermediate strain hardening response. - Highlights: ► This paper evaluates different burst pressure prediction models for line pipes. ► The existing models are categorized into two major groups of Tresca and von Mises solutions. ► Prediction quality of each model is assessed statistically using a large full-scale burst test database. ► The Zhu-Leis solution is identified as the best predictive model.

  18. Evaluation of burst pressure prediction models for line pipes

    International Nuclear Information System (INIS)

    Zhu, Xian-Kui; Leis, Brian N.

    2012-01-01

    Accurate prediction of burst pressure plays a central role in engineering design and integrity assessment of oil and gas pipelines. Theoretical and empirical solutions for such prediction are evaluated in this paper relative to a burst pressure database comprising more than 100 tests covering a variety of pipeline steel grades and pipe sizes. Solutions considered include three based on plasticity theory for the end-capped, thin-walled, defect-free line pipe subjected to internal pressure in terms of the Tresca, von Mises, and ZL (or Zhu-Leis) criteria, one based on a cylindrical instability stress (CIS) concept, and a large group of analytical and empirical models previously evaluated by Law and Bowie (International Journal of Pressure Vessels and Piping, 84, 2007: 487–492). It is found that these models can be categorized into either a Tresca-family or a von Mises-family of solutions, except for those due to Margetson and Zhu-Leis models. The viability of predictions is measured via statistical analyses in terms of a mean error and its standard deviation. Consistent with an independent parallel evaluation using another large database, the Zhu-Leis solution is found best for predicting burst pressure, including consideration of strain hardening effects, while the Tresca strength solutions including Barlow, Maximum shear stress, Turner, and the ASME boiler code provide reasonably good predictions for the class of line-pipe steels with intermediate strain hardening response. - Highlights: ► This paper evaluates different burst pressure prediction models for line pipes. ► The existing models are categorized into two major groups of Tresca and von Mises solutions. ► Prediction quality of each model is assessed statistically using a large full-scale burst test database. ► The Zhu-Leis solution is identified as the best predictive model.
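For a thin-walled, defect-free pipe, the Tresca- and von Mises-family solutions discussed in these two records differ by a constant factor of 2/sqrt(3). The sketch below gives the two simplest estimates; it ignores strain hardening, which the Zhu-Leis solution accounts for, so it is an illustration of the two families rather than of the best-performing model:

```python
from math import sqrt

def burst_tresca(sigma_uts, t, D):
    """Barlow/Tresca-type estimate for a thin-walled, defect-free pipe:
    burst when the hoop stress P*D/(2*t) reaches the ultimate strength."""
    return 2.0 * sigma_uts * t / D

def burst_von_mises(sigma_uts, t, D):
    """von Mises estimate for a closed-end cylinder: the equivalent stress
    is sqrt(3)/2 of the hoop stress, so the burst pressure is a factor
    2/sqrt(3) above the Tresca value."""
    return (2.0 / sqrt(3)) * burst_tresca(sigma_uts, t, D)
```

For example, with sigma_uts = 500 MPa, t = 10 mm, and D = 1 m, the Tresca estimate is 10 MPa and the von Mises estimate about 11.55 MPa; measured burst pressures for real steels typically fall between the two families, which is the spread the paper's statistical evaluation quantifies.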

  19. EVALUATION OF RADIONUCLIDE ACCUMULATION IN SOIL DUE TO LONG-TERM IRRIGATION

    International Nuclear Information System (INIS)

    De Wesley Wu

    2006-01-01

Radionuclide accumulation in soil due to long-term irrigation is an important part of the model for predicting radiation dose over a long period of time. The model usually assumes an equilibrium condition in soil with a constant irrigation rate, so that the radionuclide concentration in soil does not change with time and can be solved analytically. This method is currently being used for the dose assessment in the Yucca Mountain project, which requires evaluating radiation dose for a period of 10,000 years. There are several issues associated with the method: (1) the time required to reach the equilibrium condition, (2) the validity of a constant irrigation rate, (3) agricultural land use over a long period of time, and (4) variation of the radionuclide concentration in water. These issues are evaluated using a numerical method with a simple model built in the GoldSim software. Key radionuclides, Tc-99, Np-237, Pu-239, and Am-241, are selected as representatives. The results indicate that the equilibrium model is acceptable except for a radionuclide that requires a long time to accumulate in soil and whose concentration in water changes dramatically with time (i.e., a sharp peak). In that case, the calculated dose for that radionuclide could be overestimated using the current equilibrium method
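The equilibrium assumption examined in this record follows from a first-order balance dC/dt = (deposition rate) - lam*C. A minimal sketch of its analytic solution (units and parameter values are hypothetical):

```python
import numpy as np

def soil_concentration(t, dep_rate, lam):
    """Analytic solution of dC/dt = dep_rate - lam*C with C(0) = 0:
    C(t) = (dep_rate/lam) * (1 - exp(-lam*t)).

    dep_rate: constant deposition flux to soil (irrigation rate times
    water concentration, e.g. Bq per kg soil per year);
    lam: total removal constant (radioactive decay + leaching), 1/year.
    """
    return (dep_rate / lam) * (1.0 - np.exp(-lam * np.asarray(t, float)))
```

The equilibrium value dep_rate/lam is approached only after a few multiples of 1/lam, which is why slowly accumulating radionuclides exposed to a sharp concentration peak in water are the case where the equilibrium model overestimates the dose.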

  20. Long-term functional outcomes and correlation with regional brain connectivity by MRI diffusion tractography metrics in a near-term rabbit model of intrauterine growth restriction.

    Science.gov (United States)

    Illa, Miriam; Eixarch, Elisenda; Batalle, Dafnis; Arbat-Plana, Ariadna; Muñoz-Moreno, Emma; Figueras, Francesc; Gratacos, Eduard

    2013-01-01

    Intrauterine growth restriction (IUGR) affects 5-10% of all newborns and is associated with increased risk of memory, attention and anxiety problems in late childhood and adolescence. The neurostructural correlates of long-term abnormal neurodevelopment associated with IUGR are unknown. Thus, the aim of this study was to provide a comprehensive description of the long-term functional and neurostructural correlates of abnormal neurodevelopment associated with IUGR in a near-term rabbit model (delivered at 30 days of gestation) and evaluate the development of quantitative imaging biomarkers of abnormal neurodevelopment based on diffusion magnetic resonance imaging (MRI) parameters and connectivity. At +70 postnatal days, 10 cases and 11 controls were functionally evaluated with the Open Field Behavioral Test which evaluates anxiety and attention and the Object Recognition Task that evaluates short-term memory and attention. Subsequently, brains were collected, fixed and a high resolution MRI was performed. Differences in diffusion parameters were analyzed by means of voxel-based and connectivity analysis measuring the number of fibers reconstructed within anxiety, attention and short-term memory networks over the total fibers. The results of the neurobehavioral and cognitive assessment showed a significant higher degree of anxiety, attention and memory problems in cases compared to controls in most of the variables explored. Voxel-based analysis (VBA) revealed significant differences between groups in multiple brain regions mainly in grey matter structures, whereas connectivity analysis demonstrated lower ratios of fibers within the networks in cases, reaching the statistical significance only in the left hemisphere for both networks. Finally, VBA and connectivity results were also correlated with functional outcome. 
The rabbit model used here reproduced the long-term functional impairments associated with IUGR and their neurostructural correlates of abnormal neurodevelopment.

  1. Examination of constitutive model for evaluating long-term behavior of buffer. Document prepared by other institute, based on the trust contract

    International Nuclear Information System (INIS)

    Shigeno, Yoshimasa; Namikawa, Tsutomu; Takaji, Kazuhiko

    2002-02-01

In the R and D of the high-level radioactive waste repository, it is essential that the Engineered Barrier System (EBS) remains mechanically stable over a long period of time in order to maintain the functions required of it. After closure of the repository, external forces act on the buffer over a long period, so the mechanical deformation behavior of the buffer under these forces must be clarified in order to carry out an accurate safety assessment of the EBS. In this report, the applicable constitutive models were narrowed down from the many proposed to date for clay, from the viewpoint of their suitability for evaluating the long-term mechanical behavior of the buffer. Concretely, the work comprised a survey of constitutive models for clay, confirmation of the applicability of representative models through analyses using test data, and a proposal of experimental methods suitable for simulation analysis. Elemental test simulations were performed using the Adachi-Oka model and the Sekiguchi-Ohta model as representative examples of the ''over-stress'' and ''flow-surface'' classes of model, respectively. The two models did not differ greatly: with an appropriate choice of viscous parameters, a limited part of the test results could be reproduced suitably, but neither model reproduced the overall behavior of the tests. (author)

  2. Simulation of ultra-long term behavior in HLW near-field by centrifugal model test. Part 1. Development of centrifugal equipment and centrifuge model test method

    International Nuclear Information System (INIS)

    Nishimoto, Soshi; Okada, Tetsuji; Sawada, Masataka

    2011-01-01

The objective of this paper is to develop a centrifugal apparatus that can be run continuously for a long time, together with a model test method, in order to evaluate the long-term coupled thermo-hydro-mechanical behavior in and around a high-level waste geological disposal repository (the 'near-field'). The centrifuge developed at CRIEPI in the present study, 'CENTURY5000-THM', can run continuously for up to six months. Long-term behavior in the near-field can therefore be simulated in a short time: for instance, 5000 equivalent years of behavior can be simulated in six months by spinning a 1/100-scale model at 100 G. We carried out a test using a nylon specimen in a centrifugal field of 30 G and confirmed the operation, control, and measurement functions of CENTURY5000-THM over 11 days. The stress in the pressure vessel could be controlled, and strain, temperature, and pressure were measured successfully. Furthermore, by scanning a small near-field model, including the metal overpack, bentonite buffer, and rock, with a medical X-ray CT scanner, the internal structure of the model could be evaluated once the metal artifact was reduced. These results show that evaluating the long-term behavior of a disposal repository by centrifugal model testing is feasible. (author)
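The six-months-to-5000-years equivalence quoted in this record follows from the N-squared time scaling of diffusion-type processes (consolidation, seepage) in centrifuge modelling of a 1/N-scale model spun at N g. A quick arithmetic check:

```python
def prototype_years(model_days, g_level):
    """Prototype time represented by a centrifuge run: for diffusion-type
    processes in a 1/N scale model at N g, t_prototype = N**2 * t_model."""
    return (model_days / 365.25) * g_level ** 2

# six months at 100 G with a 1/100-scale model
half_year_days = 182.625
```

`prototype_years(182.625, 100)` gives 5000 years, matching the equivalence stated in the abstract; note the scaling exponent depends on the process being modelled, so this applies to diffusion-type phenomena only.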

  3. Long-term consequences of non-intentional flows of substances: Modelling non-intentional flows of lead in the Dutch economic system and evaluating their environmental consequences

    International Nuclear Information System (INIS)

    Elshkaki, Ayman; Voet, Ester van der; Holderbeke, Mirja van; Timmermans, Veerle

    2009-01-01

Substances may enter the economy and the environment through both intentional and non-intentional flows. These non-intentional flows, including the occurrence of substances as pollutants in mixed primary resources (metal ores, phosphate ores and fossil fuels) and their presence in re-used waste streams from intentional use, may have environmental and economic consequences in terms of pollution and resource availability. On the one hand, these non-intentional flows may cause pollution problems. On the other hand, they have the potential to be a secondary source of substances. This article aims to quantify and model the non-intentional flows of lead, to evaluate their long-term environmental consequences, and to compare these consequences with those of the intentional flows of lead. To meet this goal, the model combines all the sources of non-intentional flows of lead within one model, which also includes the intentional flows. Application of the model shows that the non-intentional flows of lead related to waste streams associated with intentional use are decreasing over time, due to the increased attention given to waste management. However, lead flows entering as contaminants in mixed primary resources are increasing, as demand for the associated applications is increasing.

  4. Evaluation of turbulence models for turbomachinery unsteady three-dimensional flows simulation; Evaluation de modeles de turbulence pour la simulation d'ecoulements tridimensionnels instationnaires en turbomachines

    Energy Technology Data Exchange (ETDEWEB)

    Dano, C.

    2003-01-15

The objective of this thesis is to evaluate low-Reynolds-number two-equation turbulence models (k-ε, k-l, and k-ω) for turbomachinery flow simulation. A quadratic nonlinear k-l model is also implemented in this study. We analyze the capacity of two-equation turbulence models to predict turbomachinery flows and wakes. We are particularly interested in the unsteady three-dimensional configuration with rotor-stator interactions. A Gaussian distribution reproduces the upstream wake. This analysis is carried out in terms of prediction quality but also in terms of numerical behavior. Turbine and compressor configurations are tested. (author)

  5. A mid-term, market-based power systems planning model

    International Nuclear Information System (INIS)

    Koltsaklis, Nikolaos E.; Dagoumas, Athanasios S.; Georgiadis, Michael C.; Papaioannou, George; Dikaiakos, Christos

    2016-01-01

    Highlights: • A mid-term Energy Planning along with a Unit Commitment model is developed. • The model identifies the optimum interconnection capacity. • Electricity interconnections affect the power mix and the day-ahead spot price. • Renewables’ penetration has impacts on the power reserves and the CO_2 emissions. • Energy policy and fuel pricing can have significant impacts on the power mix. - Abstract: This paper presents a generic Mixed Integer Linear Programming (MILP) model that integrates a Mid-term Energy Planning (MEP) model, which implements generation and transmission system planning at a yearly level, with a Unit Commitment (UC) model, which performs the simulation of the Day-Ahead Electricity Market. The applicability of the proposed model is illustrated in a case study of the Greek interconnected power system. The aim is to evaluate a critical project in the Ten Year Network Development Plan (TYNDP) of the Independent Power Transmission System Operator S.A. (ADMIE), namely the electric interconnection of the Crete Island with the mainland electric system. The proposed modeling framework identifies the implementation (or not) of the interconnection of the Crete Island with the mainland electric system, as well as the optimum interconnection capacity. It also quantifies the effects on the Day-Ahead electricity market and on the energy mix. The paper demonstrates that the model can provide useful insights into the strategic and challenging decisions to be determined by investors and/or policy makers at a national and/or regional level, by providing the optimal energy roadmap and management, as well as clear price signals on critical energy projects under real operating and design constraints.

  6. Murine model of long term obstructive jaundice

    Science.gov (United States)

    Aoki, Hiroaki; Aoki, Masayo; Yang, Jing; Katsuta, Eriko; Mukhopadhyay, Partha; Ramanathan, Rajesh; Woelfel, Ingrid A.; Wang, Xuan; Spiegel, Sarah; Zhou, Huiping; Takabe, Kazuaki

    2016-01-01

Background With the recent emergence of conjugated bile acids as signaling molecules in cancer, a murine model of obstructive jaundice by cholestasis with long-term survival is needed. Here, we investigated the characteristics of 3 murine models of obstructive jaundice. Methods C57BL/6J mice were used for total ligation of the common bile duct (tCL), partial common bile duct ligation (pCL), and ligation of left and median hepatic bile duct with gallbladder removal (LMHL) models. Survival was assessed by the Kaplan-Meier method. Fibrotic change was determined by Masson-Trichrome staining and Collagen expression. Results 70% (7/10) of tCL mice died by Day 7, whereas the majority, 67% (10/15), of pCL mice survived with loss of jaundice. 19% (3/16) of LMHL mice died; however, jaundice continued beyond Day 14, with survival of more than a month. Compensatory enlargement of the right lobe was observed in both pCL and LMHL models. The pCL model demonstrated acute inflammation due to obstructive jaundice 3 days after ligation, but jaundice rapidly decreased by Day 7. The LMHL group developed portal hypertension as well as severe fibrosis by Day 14 in addition to prolonged jaundice. Conclusion The standard tCL model is too unstable with high mortality for long-term studies. pCL may be an appropriate model for acute inflammation with obstructive jaundice, but long-term survivors are no longer jaundiced. The LMHL model was identified to be the most feasible model to study the effect of long-term obstructive jaundice. PMID:27916350

  7. Backcasting long-term climate data: evaluation of hypothesis

    Science.gov (United States)

    Saghafian, Bahram; Aghbalaghi, Sara Ghasemi; Nasseri, Mohsen

    2018-05-01

More often than not, incomplete datasets or short-term records in vast regions impede reliable climate and water studies. Various methods, such as simple correlation with stations having long-term time series, are practiced to infill or extend the period of observation at stations with missing or short-term data. In the current paper, and for the first time, the hypothesis that the downscaling concept can be extended to backcast local observation records using large-scale atmospheric predictors is examined. Backcasting is coined here in contrast to forecasting/projection; the former implies reconstruction of the past, while the latter represents projection into the future. To assess our hypothesis, daily and monthly statistical downscaling models were employed to reconstruct past precipitation data and lengthen the data period. The Urmia and Tabriz synoptic stations, located in northwestern Iran, constituted the two case study stations. The SDSM and data-mining downscaling model (DMDM) daily downscaling models, as well as the group method of data handling (GMDH) and model tree (Mp5) monthly downscaling models, were trained with National Center for Environmental Prediction (NCEP) data. After training, the reconstructed precipitation data of the past were validated against observed data. Then, the data were fully extended to the 1948 to 2009 period, corresponding to the available NCEP data period. The results showed that DMDM was superior in generating monthly average precipitation compared with the SDSM, Mp5, and GMDH models, although none of the models could preserve the monthly variance. This overall confirms the practical value of the proposed approach for extension of past historic data, particularly for long-term climatological and water budget studies.

  8. Evaluating litter decomposition and soil organic matter dynamics in earth system models: contrasting analysis of long-term litter decomposition and steady-state soil carbon

    Science.gov (United States)

    Bonan, G. B.; Wieder, W. R.

    2012-12-01

    litterfall and model-derived climatic decomposition index. While comparison with the LIDET 10-year litterbag study reveals sharp contrasts between CLM4 and DAYCENT, simulations of steady-state soil carbon show less difference between models. Both CLM4 and DAYCENT significantly underestimate soil carbon. Sensitivity analyses highlight causes of the low soil carbon bias. The terrestrial biogeochemistry of earth system models must be critically tested with observations, and the consequences of particular model choices must be documented. Long-term litter decomposition experiments such as LIDET provide a real-world process-oriented benchmark to evaluate models and can critically inform model development. Analysis of steady-state soil carbon estimates reveal additional, but here different, inferences about model performance.

  9. Evaluating and reducing a model of radiocaesium soil-plant uptake

    Energy Technology Data Exchange (ETDEWEB)

    Tarsitano, D.; Young, S.D. [School of Biosciences, University of Nottingham, University Park, Nottingham, NG7 2RD (United Kingdom); Crout, N.M.J., E-mail: neil.crout@nottingham.ac.u [School of Biosciences, University of Nottingham, University Park, Nottingham, NG7 2RD (United Kingdom)

    2011-03-15

An existing model of radiocaesium transfer to grasses was extended to include wheat and barley and parameterised using data from a wide range of soils and contact times. The model structure was revised and evaluated using a subset of the available data which was not used for model parameterisation. The resulting model was then used as a basis for systematic model reduction to test the utility of the model components. This analysis suggested that the use of 4 model variables (relating to radiocaesium adsorption on organic matter and the pH sensitivity of soil solution potassium concentration) and 1 model input (pH) are not required. The results of this analysis were used to develop a reduced model which was further evaluated in terms of comparisons to observations. The reduced model had an improved empirical performance and fewer adjustable parameters and soil characteristic inputs. - Research highlights: → A model of plant radiocesium uptake is evaluated and re-parameterised. → The representation of time dependent changes in plant uptake is improved. → Model reduction is applied to evaluate the model structure. → A reduced model is identified which outperforms the previously reported model. → The reduced model requires fewer soil specific inputs.

  10. Program Evaluation in Cost Benefit Terms.

    Science.gov (United States)

    Tanner, C. Kenneth

    This paper advances a model, called the expected opportunity loss model, for curriculum evaluation. This decision-making technique utilizes subjective data by ranking courses according to their expected contributions to the primary objective of the total program. The model also utilizes objective data in the form of component costs, and differs…

  11. Evaluation of long-term natural gas marketing agreements: An application of commodity forward and option pricing theory

    International Nuclear Information System (INIS)

    Salahor, G.S.; Laughton, D.G.

    1993-01-01

    Methods that have been empirically validated in the analysis of short-term traded securities are adapted to evaluate long-term natural gas direct-sale contracts. A sample contract is examined from the perspective of the producer, and analyzed as a series of forward and option contracts. The assessment of contract value is based on the gas price forecast, the volatility in that forecast, and the valuation of risk caused by that volatility. The method presented allows the gas producer to quantify these elements, and to evaluate the variety of terms encountered in direct-sale natural gas agreements, including features such as load factors and penalty charges. The analysis uses as inputs a probabilistic price forecast and a determination of a price of risk for gas prices. Once the forecast volatility is derived from the probabilistic forecast, the forward contracts embedded in the long-term gas contract can be valued with a risk-discounting model, and optional aspects can be evaluated using the Black-Scholes option pricing method. 10 refs., 3 figs., 2 tabs
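
    The option components can be valued with the standard Black-Scholes formula named in the abstract; a minimal sketch follows (the price, strike, volatility, and rate are illustrative, and the paper's risk-discounting of the forward component is not reproduced here):

    ```python
    # Black-Scholes value of a European call: the building block for pricing
    # optional features (e.g., the right to buy gas at a contract price).
    # Inputs are illustrative, not taken from the sample contract.
    from math import log, sqrt, exp, erf

    def norm_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def black_scholes_call(s, k, r, sigma, t):
        """Call value for spot s, strike k, rate r, volatility sigma, maturity t."""
        d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
        d2 = d1 - sigma * sqrt(t)
        return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

    if __name__ == "__main__":
        # e.g., an at-the-money one-year option with 20% price volatility
        print(round(black_scholes_call(100.0, 100.0, 0.05, 0.2, 1.0), 4))
    ```
    
    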

  12. Review of models used for determining consequences of UF6 release: Development of model evaluation criteria. Volume 1

    International Nuclear Information System (INIS)

    Nair, S.K.; Chambers, D.B.; Park, S.H.; Hoffman, F.O.

    1997-11-01

    The objective of this study is to examine the usefulness and effectiveness of currently existing models that simulate the release of uranium hexafluoride (UF6) from UF6-handling facilities, subsequent reactions of UF6 with atmospheric moisture, and the dispersion of UF6 and reaction products in the atmosphere. The study evaluates screening-level and detailed public-domain models that were specifically developed for UF6 and models that were originally developed for the treatment of dense gases but are applicable to UF6 release, reaction, and dispersion. The model evaluation process is divided into three specific tasks: model-component evaluation; applicability evaluation; and user-interface and quality assurance and quality control (QA/QC) evaluation. Within the model-component evaluation process, a model's treatment of source term, thermodynamics, and atmospheric dispersion is considered and model predictions are compared with actual observations. Within the applicability evaluation process, a model's applicability to Integrated Safety Analysis, Emergency Response Planning, and Post-Accident Analysis, and to site-specific considerations, is assessed. Finally, within the user-interface and QA/QC evaluation process, a model's user-friendliness, presence and clarity of documentation, ease of use, etc. are assessed, along with its handling of QA/QC.

  13. Murine model of long-term obstructive jaundice.

    Science.gov (United States)

    Aoki, Hiroaki; Aoki, Masayo; Yang, Jing; Katsuta, Eriko; Mukhopadhyay, Partha; Ramanathan, Rajesh; Woelfel, Ingrid A; Wang, Xuan; Spiegel, Sarah; Zhou, Huiping; Takabe, Kazuaki

    2016-11-01

    With the recent emergence of conjugated bile acids as signaling molecules in cancer, a murine model of obstructive jaundice by cholestasis with long-term survival is needed. Here, we investigated the characteristics of three murine models of obstructive jaundice. C57BL/6J mice were used for total ligation of the common bile duct (tCL), partial common bile duct ligation (pCL), and ligation of the left and median hepatic bile ducts with gallbladder removal (LMHL) models. Survival was assessed by the Kaplan-Meier method. Fibrotic change was determined by Masson-Trichrome staining and collagen expression. Overall, 70% (7 of 10) of tCL mice died by day 7, whereas the majority (10 of 15; 67%) of pCL mice survived with loss of jaundice. A total of 19% (3 of 16) of LMHL mice died; however, jaundice continued beyond day 14, with survival of more than a month. Compensatory enlargement of the right lobe was observed in both the pCL and LMHL models. The pCL model demonstrated acute inflammation due to obstructive jaundice 3 d after ligation, but jaundice rapidly decreased by day 7. The LMHL group developed portal hypertension and severe fibrosis by day 14 in addition to prolonged jaundice. The standard tCL model is too unstable, with high mortality, for long-term studies. pCL may be an appropriate model for acute inflammation with obstructive jaundice, but long-term survivors are no longer jaundiced. The LMHL model was identified as the most feasible model to study the effect of long-term obstructive jaundice. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Low-level radioactive waste performance assessments: Source term modeling

    International Nuclear Information System (INIS)

    Icenhour, A.S.; Godbee, H.W.; Miller, L.F.

    1995-01-01

    Low-level radioactive wastes (LLW) generated by government and commercial operations need to be isolated from the environment for at least 300 to 500 yr. Most existing sites for the storage or disposal of LLW employ the shallow-land burial approach. However, the U.S. Department of Energy currently emphasizes the use of engineered systems (e.g., packaging, concrete and metal barriers, and water collection systems). Future commercial LLW disposal sites may include such systems to mitigate radionuclide transport through the biosphere. Performance assessments must be conducted for LLW disposal facilities. These studies include comprehensive evaluations of radionuclide migration from the waste package, through the vadose zone, and within the water table. Atmospheric transport mechanisms are also studied. Figure 1 illustrates the performance assessment process. Estimates of the release of radionuclides from the waste packages (i.e., source terms) are used for subsequent hydrogeologic calculations required by a performance assessment. Computer models are typically used to describe the complex interactions of water with LLW and to determine the transport of radionuclides. Several commonly used computer programs for evaluating source terms include GWSCREEN, BLT (Breach-Leach-Transport), DUST (Disposal Unit Source Term), and BARRIER (Ref. 5), as well as SOURCE1 and SOURCE2 (which are used in this study). The SOURCE1 and SOURCE2 codes were prepared by Rogers and Associates Engineering Corporation for the Oak Ridge National Laboratory (ORNL). SOURCE1 is designed for tumulus-type facilities, and SOURCE2 is tailored for silo, well-in-silo, and trench-type disposal facilities. This paper focuses on the source term for ORNL disposal facilities, and it describes improved computational methods for determining radionuclide transport from waste packages.

  15. Virtual Models of Long-Term Care

    Science.gov (United States)

    Phenice, Lillian A.; Griffore, Robert J.

    2012-01-01

    Nursing homes, assisted living facilities, and home-care organizations use web sites to describe their services to potential consumers. This virtual ethnographic study developed models representing how potential consumers may understand this information, using data from the web sites of 69 long-term-care providers. The content of long-term-care web…

  16. Use of an operational model evaluation system for model intercomparison

    Energy Technology Data Exchange (ETDEWEB)

    Foster, K. T., LLNL

    1998-03-01

    The Atmospheric Release Advisory Capability (ARAC) is a centralized emergency response system used to assess the impact from atmospheric releases of hazardous materials. As part of an on-going development program, new three-dimensional diagnostic windfield and Lagrangian particle dispersion models will soon replace ARAC's current operational windfield and dispersion codes. A prototype model performance evaluation system has been implemented to facilitate the study of the capabilities and performance of early development versions of these new models relative to ARAC's current operational codes. This system provides tools both for objective statistical analysis using common performance measures and for more subjective visualization of the temporal and spatial relationships of model results relative to field measurements. Supporting this system is a database of processed field experiment data (source terms and meteorological and tracer measurements) from over 100 individual tracer releases.

  17. A framework for evaluating forest landscape model predictions using empirical data and knowledge

    Science.gov (United States)

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson; William D. Dijak; Qia. Wang

    2014-01-01

    Evaluation of forest landscape model (FLM) predictions is indispensable to establish the credibility of predictions. We present a framework that evaluates short- and long-term FLM predictions at site and landscape scales. Site-scale evaluation is conducted through comparing raster cell-level predictions with inventory plot data whereas landscape-scale evaluation is...

  18. A Method for Evaluation of Model-Generated Vertical Profiles of Meteorological Variables

    Science.gov (United States)

    2016-03-01

    evaluated WRF output for the boundary layer over Svalbard in the Arctic in terms of height above ground compared to tower and tethered balloon … Valparaiso, Chile; 2011. Dutsch ML. Evaluation of the WRF model based on observations made by controlled meteorological balloons in the atmospheric …

  19. The EMEFS model evaluation

    International Nuclear Information System (INIS)

    Barchet, W.R.; Dennis, R.L.; Seilkop, S.K.; Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K.; Byun, D.; McHenry, J.N.; Karamchandani, P.; Venkatram, A.; Fung, C.; Misra, P.K.; Hansen, D.A.; Chang, J.S.

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs
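
    The quantitative comparisons named in the protocol (difference statistics and correlations between predicted and observed values) reduce to a few standard measures; a minimal sketch with made-up predicted/observed pairs, not EMEFS data:

    ```python
    # Standard model-evaluation statistics of the kind used in evaluation
    # protocols: mean bias, RMSE, and Pearson correlation between predicted
    # and observed values. The data below are made up for illustration.
    from math import sqrt

    def evaluation_stats(predicted, observed):
        n = len(predicted)
        bias = sum(p - o for p, o in zip(predicted, observed)) / n
        root_mse = sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)
        mp = sum(predicted) / n
        mo = sum(observed) / n
        cov = sum((p - mp) * (o - mo) for p, o in zip(predicted, observed))
        var_p = sum((p - mp) ** 2 for p in predicted)
        var_o = sum((o - mo) ** 2 for o in observed)
        r = cov / sqrt(var_p * var_o)
        return bias, root_mse, r

    if __name__ == "__main__":
        pred = [2.1, 3.4, 1.8, 4.0, 2.9]   # e.g., modelled deposition values
        obs = [2.0, 3.0, 2.2, 3.6, 3.1]
        print(evaluation_stats(pred, obs))
    ```
    
    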

  20. The EMEFS model evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Barchet, W.R. (Pacific Northwest Lab., Richland, WA (United States)); Dennis, R.L. (Environmental Protection Agency, Research Triangle Park, NC (United States)); Seilkop, S.K. (Analytical Sciences, Inc., Durham, NC (United States)); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. (Atmospheric Environment Service, Downsview, ON (Canada)); Byun, D.; McHenry, J.N.

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.

  1. Ability of the MACRO Model to Predict Long-Term Leaching of Metribuzin and Diketometribuzin

    DEFF Research Database (Denmark)

    Rosenbom, Annette E; Kjær, Jeanne; Henriksen, Trine

    2009-01-01

    In a regulatory context, numerical models are increasingly employed to quantify leaching of pesticides and their metabolites. Although the ability of these models to accurately simulate leaching of pesticides has been evaluated, little is known about their ability to accurately simulate long-term leaching of metabolites. A Danish study on the dissipation and sorption of metribuzin, involving both monitoring and batch experiments, concluded that desorption and degradation of metribuzin and leaching of its primary metabolite diketometribuzin continued for 5-6 years after application, posing a risk ... Using alternative kinetics (a two-site approach), we captured the observed leaching scenario, thus underlining the necessity of accounting for the long-term sorption and dissipation characteristics when using models to predict the risk of groundwater contamination.

  2. The role of the anomaly cancellation mechanism in the evaluation of the radiatively induced Chern-Simons term in extended QED

    International Nuclear Information System (INIS)

    Battistel, O.A.; Dallabona, G.

    2004-01-01

    We consider the possible role played by the anomaly cancellation mechanism in the evaluation of the radiatively induced Chern-Simons (CS) term, arising from the Lorentz- and CPT-non-invariant fermionic sector, of an extended version of QED. We explicitly evaluate the most general mathematical structure associated with the AVV triangle amplitude, closely related to the one involved in the CS term evaluation, using for this purpose an alternative calculational strategy to handle divergences in QFTs. We show that the requirement of consistency with the choices made in the construction of the Standard Model's renormalizability, in the evaluation of the AVV Green function, leaves no room for a nonvanishing radiatively induced CS term, independently of the regularization prescription or equivalent philosophy adopted, in accordance with what was previously conjectured by other authors. (orig.)

  3. Long-term functional outcomes and correlation with regional brain connectivity by MRI diffusion tractography metrics in a near-term rabbit model of intrauterine growth restriction.

    Directory of Open Access Journals (Sweden)

    Miriam Illa

    BACKGROUND: Intrauterine growth restriction (IUGR) affects 5-10% of all newborns and is associated with increased risk of memory, attention and anxiety problems in late childhood and adolescence. The neurostructural correlates of long-term abnormal neurodevelopment associated with IUGR are unknown. Thus, the aim of this study was to provide a comprehensive description of the long-term functional and neurostructural correlates of abnormal neurodevelopment associated with IUGR in a near-term rabbit model (delivered at 30 days of gestation) and evaluate the development of quantitative imaging biomarkers of abnormal neurodevelopment based on diffusion magnetic resonance imaging (MRI) parameters and connectivity. METHODOLOGY: At +70 postnatal days, 10 cases and 11 controls were functionally evaluated with the Open Field Behavioral Test, which evaluates anxiety and attention, and the Object Recognition Task, which evaluates short-term memory and attention. Subsequently, brains were collected, fixed and a high resolution MRI was performed. Differences in diffusion parameters were analyzed by means of voxel-based and connectivity analysis, measuring the number of fibers reconstructed within anxiety, attention and short-term memory networks over the total fibers. PRINCIPAL FINDINGS: The results of the neurobehavioral and cognitive assessment showed a significantly higher degree of anxiety, attention and memory problems in cases compared to controls in most of the variables explored. Voxel-based analysis (VBA) revealed significant differences between groups in multiple brain regions, mainly in grey matter structures, whereas connectivity analysis demonstrated lower ratios of fibers within the networks in cases, reaching statistical significance only in the left hemisphere for both networks. Finally, VBA and connectivity results were also correlated with functional outcome. CONCLUSIONS: The rabbit model used reproduced long-term functional impairments and their

  4. Long-Term Functional Outcomes and Correlation with Regional Brain Connectivity by MRI Diffusion Tractography Metrics in a Near-Term Rabbit Model of Intrauterine Growth Restriction

    Science.gov (United States)

    Illa, Miriam; Eixarch, Elisenda; Batalle, Dafnis; Arbat-Plana, Ariadna; Muñoz-Moreno, Emma; Figueras, Francesc; Gratacos, Eduard

    2013-01-01

    Background Intrauterine growth restriction (IUGR) affects 5–10% of all newborns and is associated with increased risk of memory, attention and anxiety problems in late childhood and adolescence. The neurostructural correlates of long-term abnormal neurodevelopment associated with IUGR are unknown. Thus, the aim of this study was to provide a comprehensive description of the long-term functional and neurostructural correlates of abnormal neurodevelopment associated with IUGR in a near-term rabbit model (delivered at 30 days of gestation) and evaluate the development of quantitative imaging biomarkers of abnormal neurodevelopment based on diffusion magnetic resonance imaging (MRI) parameters and connectivity. Methodology At +70 postnatal days, 10 cases and 11 controls were functionally evaluated with the Open Field Behavioral Test which evaluates anxiety and attention and the Object Recognition Task that evaluates short-term memory and attention. Subsequently, brains were collected, fixed and a high resolution MRI was performed. Differences in diffusion parameters were analyzed by means of voxel-based and connectivity analysis measuring the number of fibers reconstructed within anxiety, attention and short-term memory networks over the total fibers. Principal Findings The results of the neurobehavioral and cognitive assessment showed a significant higher degree of anxiety, attention and memory problems in cases compared to controls in most of the variables explored. Voxel-based analysis (VBA) revealed significant differences between groups in multiple brain regions mainly in grey matter structures, whereas connectivity analysis demonstrated lower ratios of fibers within the networks in cases, reaching the statistical significance only in the left hemisphere for both networks. Finally, VBA and connectivity results were also correlated with functional outcome. Conclusions The rabbit model used reproduced long-term functional impairments and their neurostructural

  5. Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux

    Science.gov (United States)

    Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.

    2017-12-01

    Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination: as they dissolve they yield concentrations well above MCLs, posing an on-going public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine whether the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes of parent compounds during different monitoring periods. Groundwater concentration versus time data were compiled for six sites, and post-remedial contaminant mass flux data were then measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model and the Equilibrium Streamtube Model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased for all the sites by the end of the post-treatment monitoring period, and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue to occur at these sites, but a mass flux based on MCL levels may never be exceeded. Thus, site clean-up goals should be evaluated as order
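
    Power-law source strength functions of the kind mentioned above are commonly written C/C0 = (M/M0)^Γ, coupled to the source mass balance dM/dt = -Q·C0·(M/M0)^Γ. The sketch below integrates this generic form with forward Euler; the functional form and all parameter values are assumptions for illustration, not taken from the six-site study:

    ```python
    # Power-law DNAPL source depletion: C(t)/C0 = (M(t)/M0)**gamma, coupled
    # to dM/dt = -Q*C0*(M/M0)**gamma and integrated with forward Euler.
    # For gamma = 1 this reduces to exponential decay M0*exp(-Q*C0/M0 * t).
    # All parameter values below are illustrative.

    def source_mass(m0, c0, q, gamma, t_end, dt=0.01):
        """Remaining source mass at t_end under power-law dissolution."""
        m = m0
        for _ in range(int(t_end / dt)):
            if m <= 0.0:
                return 0.0
            m -= q * c0 * (m / m0) ** gamma * dt
        return m

    if __name__ == "__main__":
        m0, c0, q = 100.0, 0.05, 10.0   # kg, kg/m^3, m^3/yr (assumed)
        print(source_mass(m0, c0, q, gamma=1.0, t_end=20.0))
    ```

    Values of Γ below 1 deplete the source faster early on with a long concentration tail, which is why the fitted function shapes long-term rebound predictions.
    
    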

  6. Evaluating Internal Model Strength and Performance of Myoelectric Prosthesis Control Strategies.

    Science.gov (United States)

    Shehata, Ahmed W; Scheme, Erik J; Sensinger, Jonathon W

    2018-05-01

    On-going developments in myoelectric prosthesis control have provided prosthesis users with an assortment of control strategies that vary in reliability and performance. Many studies have focused on improving performance by providing feedback to the user but have overlooked the effect of this feedback on internal model development, which is key to improving long-term performance. In this paper, the strength of the internal models developed for two commonly used myoelectric control strategies, raw control with raw feedback (using a regression-based approach) and filtered control with filtered feedback (using a classifier-based approach), was evaluated using two psychometric measures: trial-by-trial adaptation and just-noticeable difference. The performance of both strategies was also evaluated using a Schmidt's style target acquisition task. Results obtained from 24 able-bodied subjects showed that although filtered control with filtered feedback had better short-term performance in path efficiency, raw control with raw feedback resulted in stronger internal model development, which may lead to better long-term performance. Despite inherent noise in the control signals of the regression controller, these findings suggest that the rich feedback associated with regression control may be used to improve human understanding of the myoelectric control system.

  7. An interfacial shear term evaluation study for adiabatic dispersed air–water two-phase flow with the two-fluid model using CFD

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, S.L., E-mail: sharma55@purdue.edu [School of Nuclear Engineering, Purdue University, West Lafayette, IN (United States); Hibiki, T.; Ishii, M. [School of Nuclear Engineering, Purdue University, West Lafayette, IN (United States); Schlegel, J.P. [Department of Mining and Nuclear Engineering, Missouri University of Science and Technology, Rolla, MO (United States); Buchanan, J.R.; Hogan, K.J. [Bettis Laboratory, Naval Nuclear Laboratory, West Mifflin, PA (United States); Guilbert, P.W. [ANSYS UK Ltd, Oxfordshire (United Kingdom)

    2017-02-15

    Highlights: • Closure form of the interfacial shear term in three-dimensional form is investigated. • Assessment against adiabatic upward bubbly air–water flow data using CFD. • Effect of addition of the interfacial shear term on the phase distribution. - Abstract: In commercially available Computational Fluid Dynamics (CFD) codes such as ANSYS CFX and Fluent, the interfacial shear term is missing in the field momentum equations. The derivation of the two-fluid model (Ishii and Hibiki, 2011) indicates the presence of this term as a momentum source in the right hand side of the field momentum equation. The inclusion of this term is considered important for proper modeling of the interfacial momentum coupling between phases. For separated flows, such as annular flow, the importance of the shear term is understood in the one-dimensional (1-D) form as the major mechanism by which the wall shear is transferred to the gas phase (Ishii and Mishima, 1984). For gas dispersed two-phase flow CFD simulations, it is important to assess the significance of this term in the prediction of phase distributions. In the first part of this work, the closure of this term in three-dimensional (3-D) form in a CFD code is investigated. For dispersed gas–liquid flow, such as bubbly or churn-turbulent flow, bubbles are dispersed in the shear layer of the continuous phase. The continuous phase shear stress is mainly due to the presence of the wall and the modeling of turbulence through the Boussinesq hypothesis. In a 3-D simulation, the continuous phase shear stress can be calculated from the continuous fluid velocity gradient, so that the interfacial shear term can be closed using the local values of the volume fraction and the total stress of liquid phase. This form also assures that the term acts as an action-reaction force for multiple phases. In the second part of this work, the effect of this term on the volume fraction distribution is investigated. For testing the model two

  8. Evaluation of effects of long term exposure on lethal toxicity with mammals.

    Science.gov (United States)

    Verma, Vibha; Yu, Qiming J; Connell, Des W

    2014-02-01

    The relationship between lethal exposure time (LT50) and lethal exposure concentration (LC50) has been evaluated over relatively long exposure times using a novel parameter, Normal Life Expectancy (NLT), as a long-term toxicity point. The model equation ln(LT50) = a·LC50^ν + b, where a, b and ν are constants, was evaluated by plotting ln(LT50) against LC50 using available toxicity data based on inhalation exposure for 7 species of mammals. With each specific toxicant, a single consistent relationship was observed for all mammals, with a common value of ν. The model can thus first be developed with data from a few mammals and then be extended to estimate toxicity at any exposure time for other mammals. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
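
    Because the model ln(LT50) = a·LC50^ν + b is linear in a and b once ν is fixed, it can be fitted by closed-form least squares inside a grid search over ν. A sketch on synthetic data with known parameters (the data and grid are illustrative, not the inhalation data set used in the study):

    ```python
    # Fit ln(LT50) = a * LC50**nu + b by grid search over nu; for each fixed
    # nu the model is linear in a and b, so ordinary least squares applies
    # in closed form. Synthetic data with known parameters are used to test.

    def fit_toxicity_model(lc50, ln_lt50, nu_grid):
        """Return (a, b, nu) minimizing the sum of squared residuals."""
        best = None
        n = len(lc50)
        for nu in nu_grid:
            x = [c ** nu for c in lc50]
            sx, sy = sum(x), sum(ln_lt50)
            sxx = sum(v * v for v in x)
            sxy = sum(v * y for v, y in zip(x, ln_lt50))
            a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
            b = (sy - a * sx) / n
            sse = sum((a * v + b - y) ** 2 for v, y in zip(x, ln_lt50))
            if best is None or sse < best[0]:
                best = (sse, a, b, nu)
        return best[1], best[2], best[3]

    if __name__ == "__main__":
        # synthetic data generated with a = -0.04, b = 6.0, nu = 0.5
        lc50 = [10.0, 50.0, 100.0, 500.0, 1000.0, 5000.0]
        ln_lt50 = [-0.04 * c ** 0.5 + 6.0 for c in lc50]
        nu_grid = [i / 20 for i in range(1, 31)]   # 0.05 .. 1.50
        print(fit_toxicity_model(lc50, ln_lt50, nu_grid))
    ```
    
    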

  9. Evaluation of models of particulate suspension for a thorium ore stockpile

    International Nuclear Information System (INIS)

    Smith, W.J.

    1983-01-01

    Fifteen mathematical models of particle saltation, suspension, and resuspension were reviewed and categorized. Appropriate models were applied to the estimation of particulate releases from a hypothetical thorium ore storage pile. An assumed location (near Lemhi Pass, Montana) was used to permit the development of site-specific information on ore characteristics and environmental influences. The available models were characterized in terms of suitability for representing aspects of the ore pile, such as rough surface features, a wide particle size range, and site-specific climate. Five models were selected for detailed study. A computer code for each of these is given. Site-specific data for the assumed ore stockpile location were prepared. These data were manipulated to provide the input values required for each of the five models. Representative values and ranges for model variables are tabulated. The response of each model to input data for selected variables was determined. Each model was evaluated in terms of the physical realism of its responses and its overall ability to represent the features of an ore stockpile. The two models providing the best representation were a modified version of the dust suspension subroutine TAILPS from the computer code MILDOS, and the dust suspension formulation from the computer code REDIST. Their responses are physically reasonable, although different from each other for two parameters: ore moisture and surface roughness. With the input values judged most representative of an ore pile near Lemhi Pass, the estimate of the release of suspended particulates is on the order of 1 g/m²·yr.

  10. Monitoring and modeling of long-term settlements of an experimental landfill in Brazil.

    Science.gov (United States)

    Simões, Gustavo Ferreira; Catapreta, Cícero Antônio Antunes

    2013-02-01

    Settlement evaluation in sanitary landfills is a complex process, owing to waste heterogeneity, time-varying properties, and the influencing factors and mechanisms involved, such as mechanical compression due to load application and creep, and the physical-chemical and biological processes caused by waste decomposition. Many empirical models for the analysis of long-term settlement in landfills are reported in the literature. This paper presents the results of a settlement monitoring program carried out during 6 years at the Belo Horizonte experimental landfill. Different sets of field data were used to calibrate three long-term settlement prediction models (rheological, hyperbolic and composite). The parameters obtained in the calibration were used to predict the settlements and to compare them with actual field data. During the monitoring period of 6 years, significant vertical strains (of up to 31%) were observed in relation to the initial height of the experimental landfill. The long-term settlement predictions obtained with the hyperbolic and rheological models significantly underestimate the settlements, regardless of the period of data used in the calibration. The best fits were obtained with the composite model, except when 1 year of field data was used in the calibration. The results of the composite model indicate settlement stabilization at longer times and with larger final settlements compared to the hyperbolic and rheological models. Copyright © 2012 Elsevier Ltd. All rights reserved.
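
    Of the three settlement models compared, the hyperbolic form is the simplest; it is commonly written S(t) = t / (a + b·t), with ultimate settlement 1/b as t grows large. This textbook form and the parameter values below are assumptions for illustration, not the Belo Horizonte calibration:

    ```python
    # Hyperbolic long-term settlement model: S(t) = t / (a + b*t).
    # As t -> infinity, S(t) approaches the ultimate settlement 1/b, which
    # is one reason hyperbolic fits can predict earlier stabilization than
    # composite models. Parameters below are illustrative.

    def hyperbolic_settlement(t, a, b):
        """Settlement at time t for fitted constants a (time/length) and b (1/length)."""
        return t / (a + b * t)

    if __name__ == "__main__":
        a, b = 120.0, 0.35              # assumed calibration constants
        ultimate = 1.0 / b              # ~2.86 (same length unit as S)
        for t in (30, 180, 365, 2190):  # days; 2190 d ~ the 6-yr monitoring
            print(t, round(hyperbolic_settlement(t, a, b), 3))
    ```

    Fitting a and b from early data and extrapolating is exactly where the underestimation noted in the abstract can arise, since the asymptote 1/b is fixed by the fit.
    
    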

  11. Tapering off benzodiazepines in long-term users : an economic evaluation

    NARCIS (Netherlands)

    Oude Voshaar, Richard C; Krabbe, Paul F M; Gorgels, Wim J M J; Adang, Eddy M M; van Balkom, Anton J L M; van de Lisdonk, Eloy H; Zitman, Frans G

    2006-01-01

    BACKGROUND: Discontinuation of benzodiazepine usage has never been evaluated in economic terms. This study aimed to compare the relative costs and outcomes of tapering off long-term benzodiazepine use combined with group cognitive behavioural therapy (TO+CBT), tapering off alone (TOA) and usual

  12. Catalytic heat exchangers - a long-term evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Silversand, Fredrik A. [CATATOR AB, Lund (Sweden)

    2003-10-01

    A long-term evaluation of catalytic heat exchangers (CHEs) has been performed. The CHE concept was originally described in a number of reports issued by Catator almost a decade ago. The general idea of a CHE is to combust a fuel over a catalyst inside a heat exchanger to enable effective heat transfer. The first design approaches demonstrated the function and the possibilities of CHEs but had shortcomings in the heat exchanger design. Consequently, a heat exchanger company (SWEP International AB), specialised in brazed plate-type heat exchangers, joined the continued development project. The new design approach, combining Catator's wire-mesh catalysts and SWEP's plate-type heat exchangers, enabled us to improve the concept considerably. The new design complied with a number of relevant technical demands, e.g.: simplicity; compactness and integration (few parts); high thermal efficiency; low pressure drop; excellent emissions; high turn-down ratio; reasonable production cost. Spurred by this technical progress, the importance of a long-term test under realistic conditions was clear. A long-term evaluation was initiated at Sydkraft Gas premises in Aastorp. The CHE was installed on a specially designed rig to enable accelerated testing with respect to the number of transients. The rig was operated continuously for 5000 hours and emission mapping was carried out at certain time intervals. Following some problems during the initial phase of the long-term evaluation, which unfortunately also delayed the project, the results indicated very stable operation. The emissions remained nearly constant during the course of the test and we cannot see any tendency toward decreased performance. Indeed, the test verifies the function, operability and reliability of the CHE concept. Apart from domestic boilers we foresee a number of interesting and relevant applications in heating and process technology. Since

  13. Ultrafine particles dispersion modeling in a street canyon: development and evaluation of a composite lattice Boltzmann model.

    Science.gov (United States)

    Habilomatis, George; Chaloulakou, Archontoula

    2013-10-01

    Recently, a branch of particulate matter research has focused on ultrafine particles found in the urban environment, which originate, to a significant extent, from traffic sources. In urban street canyons, the dispersion of ultrafine particles affects pedestrians' short-term exposure as well as residents' long-term exposure. The aim of the present work is the development and evaluation of a composite lattice Boltzmann model to study the dispersion of ultrafine particles in the urban street canyon microenvironment. The proposed model has the potential to penetrate the physics of this complex system. In order to evaluate the model performance against suitable experimental data, ultrafine particle levels were monitored on an hourly basis for a period of 35 days in a street canyon in the Athens area. The results of the comparative analysis are quite satisfactory. Furthermore, our modeled results are in good agreement with the results of other computational and experimental studies. This work is a first attempt to study the dispersion of an air pollutant by application of the lattice Boltzmann method. Copyright © 2013 Elsevier B.V. All rights reserved.
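The lattice Boltzmann method referred to above evolves particle distribution functions through alternating collision and streaming steps. A minimal D1Q3 sketch for passive scalar diffusion, a 1D toy illustrating the standard BGK scheme rather than the paper's composite street-canyon model:

```python
def lbm_diffusion_1d(rho0, tau, steps):
    # Minimal D1Q3 lattice Boltzmann sketch for passive scalar diffusion.
    # Distributions f[0], f[1], f[2] carry velocities 0, +1, -1.
    w = [2.0 / 3.0, 1.0 / 6.0, 1.0 / 6.0]   # standard D1Q3 weights
    n = len(rho0)
    f = [[w[i] * rho0[j] for j in range(n)] for i in range(3)]
    for _ in range(steps):
        # collision: BGK relaxation toward local equilibrium w_i * rho
        rho = [f[0][j] + f[1][j] + f[2][j] for j in range(n)]
        for i in range(3):
            for j in range(n):
                f[i][j] += (w[i] * rho[j] - f[i][j]) / tau
        # streaming with periodic boundaries
        f[1] = [f[1][-1]] + f[1][:-1]   # +1 direction shifts right
        f[2] = f[2][1:] + [f[2][0]]     # -1 direction shifts left
    return [f[0][j] + f[1][j] + f[2][j] for j in range(n)]
```

The collision step conserves the scalar exactly (the equilibrium sums to the local density), so total mass is preserved while an initial concentration spike spreads; the full street-canyon problem adds advection, 2D/3D lattices, and particle dynamics on top of this skeleton.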

  14. Hydraulic/mechanical modeling of smectitic materials for HMC analytical evaluation of the long term performance of TRU geological repository - 59090

    International Nuclear Information System (INIS)

    Kobayashi, Ichizo; Owada, Hitoshi; Ishii, Tomoko

    2012-01-01

    Aiming at evaluation of the long term performance of transuranic (TRU) geological repositories, a hydraulic/mechanical/chemical (HMC) analysis method has been studied. In this phase of the research (four years), the hydraulic/mechanical modeling of smectitic materials for HMC analyses was studied. In this paper, new experimental methods for investigating the hydraulic/mechanical behavior of smectitic materials were developed. For hydraulic modeling, a method for measuring the specific surface area of compacted smectitic materials was developed using X-ray diffraction (XRD), and the results were applied to the Kozeny-Carman law. Since the specific surface area represents the microstructure of smectitic materials, such as the degree of swelling, the Kozeny-Carman law using the measured specific surface area of compacted smectitic materials was found to be useful in evaluating their hydraulic performance. Moreover, since the Kozeny-Carman law can account for changes in pore water content not only through the viscosity coefficient but also through changes in specific surface area, it should be more suitable for coupled chemical and mechanical analyses than the ordinary Darcy's law. For the mechanical modeling, a one-dimensional exhausting compression test procedure was developed. The tests gave the dry density versus compression stress relation in the fully saturated state of smectitic materials with varying water content. These relations were termed full-saturation lines. The groups of iso-grams of degree of saturation and water content were also obtained from this test. It was found that the full-saturation line is consistent with the swelling deformation-pressure relation in the equilibrium state. The results indicated that the swelling deformation-pressure relation does not depend on the saturation manner, such as the
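The Kozeny-Carman law invoked above links hydraulic conductivity to porosity and specific surface area. A hedged sketch of one common form of the law; the shape constant and the water property ratio are assumed textbook values, not numbers from this paper:

```python
def kozeny_carman(n, s_v, c=5.0, rho_g_over_mu=9.79e6):
    """Hydraulic conductivity K [m/s] from porosity n [-] and specific
    surface area per unit solid volume s_v [1/m], via

        K = (rho*g/mu) * n**3 / (c * s_v**2 * (1 - n)**2)

    c ~ 5 is a typical Kozeny shape constant; rho*g/mu ~ 9.79e6 1/(m*s)
    is for water at ~20 C. Both are illustrative assumptions.
    """
    return rho_g_over_mu * n**3 / (c * s_v**2 * (1.0 - n)**2)
```

Because K falls with the square of s_v, a swelling-induced increase in the measured specific surface area drives the predicted conductivity down sharply, which is why tracking s_v by XRD gives the law its sensitivity to the smectite microstructure.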

  15. The Starobinsky model from superconformal D-term inflation

    International Nuclear Information System (INIS)

    Buchmuller, W.; Domcke, V.; Kamada, K.

    2013-06-01

    We point out that in the large field regime, the recently proposed superconformal D-term inflation model coincides with the Starobinsky model. In this regime, the inflaton field dominates over the Planck mass in the gravitational kinetic term in the Jordan frame. Slow-roll inflation is realized in the large field regime for sufficiently large gauge couplings. The Starobinsky model generally emerges as an effective description of slow-roll inflation if a Jordan frame exists where, for large inflaton field values, the action is scale invariant and the ratio λ of the inflaton self-coupling and the nonminimal coupling to gravity is tiny. The interpretation of this effective coupling is different in different models. In superconformal D-term inflation it is determined by the scale of grand unification, λ ∝ (Λ_GUT/M_P)^4.
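The quoted relation makes clear why λ is tiny. A quick numeric check, taking Λ_GUT ≈ 2×10^16 GeV and the reduced Planck mass M_P ≈ 2.4×10^18 GeV (standard reference values, not taken from the abstract) and an order-one proportionality constant:

```python
# lambda ∝ (Lambda_GUT / M_P)^4, evaluated with proportionality constant 1
LAMBDA_GUT = 2.0e16   # GeV, typical grand unification scale (assumed)
M_PLANCK = 2.4e18     # GeV, reduced Planck mass (assumed)

lam = (LAMBDA_GUT / M_PLANCK) ** 4
# the ratio is ~1/120, so lam is of order 5e-9 -- many orders below unity
```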

  16. Investigation of Teachers' Perceptions of Organizational Citizenship Behavior and Their Evaluation in Terms of Educational Administration

    Science.gov (United States)

    Avci, Ahmet

    2016-01-01

    The aim of this study is to investigate teachers' perceptions of organizational citizenship behaviors and to evaluate them in terms of educational administration. A descriptive survey model was used in the research. The data were obtained from 1,613 teachers working in public and private schools under the Ministry of National…

  17. Long-term modeling on HPV vaccination: do we really need any more?

    Science.gov (United States)

    Garattini, Livio; Curto, Alessandro; van de Vooren, Katelijne

    2015-04-01

    The human papillomavirus (HPV) is closely related to cervical cancer. In 2007, the EMA approved two vaccines, a bivalent and a quadrivalent one, launched with three-dose schedules and very high prices worldwide. We describe what happened in the EU and what might change in the near future from an economic perspective. HPV vaccination is now established in most EU countries. The main target group of the programs is girls aged 10-14 years. Many western countries used competitive tendering to purchase the two vaccines, achieving considerable savings. The extension to males has been a hotly debated issue. The sex limitation implies that this vaccination cannot, by definition, achieve a 'herd immunity' effect. The EMA recently approved a two-dose schedule for both vaccines that should lead to savings, although it is hard to predict how the forthcoming nonavalent vaccine will affect the market situation. Several economic evaluations based on long-term models have been published on HPV vaccination in recent years, using official list prices as a baseline. Most of these models can be considered mere exercises in long-term forecasting. Recently, further long-term models have been published with two- and three-dose schedules as alternatives, and with the nonavalent vaccine. We wonder what added value they give for public policy purposes.

  18. Modelling dynamic transport and adsorption of arsenic in soil-bed filters for long-term performance evaluation

    Science.gov (United States)

    Mondal, Sourav; Mondal, Raka; de, Sirshendu; Griffiths, Ian

    2017-11-01

    Purification of contaminated water following the safe water guidelines while generating sufficiently large throughput is a crucial requirement for the steady supply of safe water to large populations. Adsorption-based filtration using a multilayer soil bed has been proposed as a viable method to achieve this goal. This work describes the theory of operation and the prediction of the long-term behaviour of such a system. The fixed-bed column has a single input of contaminated water at the top and an output at the bottom. As the contaminant passes through the column, it is adsorbed by the medium. Like any other adsorption medium, the filter has a certain lifespan, beyond which the filtrate no longer meets the safe limit for drinking water; this point is defined as 'breakthrough'. A mathematical model is developed that couples the fluid flow through the porous medium to the convective, diffusive and adsorptive transport of the contaminant. The results are validated against experimental observations and the model is then used to predict the breakthrough and lifetime of the filter. The key advantage of this model is that it can predict the long-term behaviour of any adsorption column system for any set of physical characteristics of the system. This work was supported by the EPSRC Global Challenge Research Fund Institutional Sponsorship 2016.
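In the limiting case of a linear adsorption isotherm and plug flow, the breakthrough time of such a column has a simple closed-form estimate via the retardation factor. A sketch under those simplifying assumptions (a deliberately reduced version of the full convection-diffusion-adsorption model; parameter values below are illustrative):

```python
def breakthrough_time(length, velocity, porosity, bulk_density, kd):
    """Ideal plug-flow breakthrough time [s] for a sorbing solute.

    Assumes a linear adsorption isotherm, so the retardation factor is
        R = 1 + rho_b * Kd / n
    and the contaminant front arrives at
        t_b = R * L / v
    with bed length L [m] and pore-water velocity v [m/s].
    """
    r = 1.0 + bulk_density * kd / porosity
    return r * length / velocity
```

A stronger adsorbent (larger Kd) extends the filter lifespan proportionally; the full model replaces this sharp-front estimate with the smeared breakthrough curve produced by dispersion and finite adsorption kinetics.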

  19. Evaluation of potential crushed-salt constitutive models

    International Nuclear Information System (INIS)

    Callahan, G.D.; Loken, M.C.; Sambeek, L.L. Van; Chen, R.; Pfeifle, T.W.; Nieland, J.D.; Hansen, F.D.

    1995-12-01

    Constitutive models describing the deformation of crushed salt are presented in this report. Ten constitutive models with the potential to describe the phenomenological and micromechanical processes for crushed salt were selected from a literature search. Three of these ten constitutive models, termed the Sjaardema-Krieg, Zeuch, and Spiers models, were adopted as candidate constitutive models. The candidate constitutive models were generalized in a consistent manner to three-dimensional states of stress and modified to include the effects of temperature, grain size, and moisture content. A database including hydrostatic consolidation and shear consolidation tests conducted on Waste Isolation Pilot Plant and southeastern New Mexico salt was used to determine material parameters for the candidate constitutive models. Nonlinear least-squares model fitting to data from the hydrostatic consolidation tests, the shear consolidation tests, and a combination of the shear and hydrostatic tests produced three sets of material parameter values for the candidate models. The change in material parameter values from test group to test group indicates the empirical nature of the models. To evaluate the predictive capability of the candidate models, each parameter value set was used to predict each of the tests in the database. Based on the fitting statistics and the ability of the models to predict the test data, the Spiers model appeared to perform slightly better than the other two candidate models. The work reported here is a first-of-its-kind evaluation of constitutive models for reconsolidation of crushed salt. Questions remain to be answered. Deficiencies in models and databases are identified and recommendations for future work are made. 85 refs

  20. EVALUATION OF RAINFALL-RUNOFF MODELS FOR MEDITERRANEAN SUBCATCHMENTS

    Directory of Open Access Journals (Sweden)

    A. Cilek

    2016-06-01

    The development and application of rainfall-runoff models have been a cornerstone of hydrological research for many decades. The amount of rainfall and its intensity and variability control the generation of runoff and the erosional processes operating at different scales. These interactions can be highly variable in Mediterranean catchments with marked hydrological fluctuations. The aim of the study was to evaluate the performance of a rainfall-runoff model for rainfall-runoff simulation in a Mediterranean subcatchment. The Pan-European Soil Erosion Risk Assessment (PESERA), a simplified hydrological process-based approach, was used in this study to combine hydrological surface runoff factors. In total, 128 input layers derived from a data set that includes climate, topography, land use, crop type, planting date, and soil characteristics are required to run the model. Initial ground cover was estimated from the Landsat ETM data provided by ESA. The hydrological model was evaluated in terms of its performance in the Goksu River Watershed, Turkey, located in the Central Eastern Mediterranean Basin of Turkey. The area is approximately 2000 km2. The landscape is dominated by bare ground, agriculture and forests. The average annual rainfall is 636.4 mm. This study is significant for evaluating different model performances in a complex Mediterranean basin. The results provided comprehensive insight, including the advantages and limitations of modelling approaches in the Mediterranean environment.

  1. Behavioural Models of Motor Control and Short-Term Memory

    OpenAIRE

    Imanaka, Kuniyasu; Funase, Kozo; Yamauchi, Masaki

    1995-01-01

    We examined in this review article the behavioural and conceptual models of motor control and short-term memory which have been intensively investigated since the 1970s. First, we reviewed both the dual-storage model of short-term memory, in which movement information is stored, and a typical model of motor control which emphasizes the importance of efferent factors. We then examined two models of preselection effects: a cognitive model and a cognitive/efferent model. Following this we reviewe...

  2. Study on dynamic team performance evaluation methodology based on team situation awareness model

    International Nuclear Information System (INIS)

    Kim, Suk Chul

    2005-02-01

    The purpose of this thesis is to provide a theoretical framework and an evaluation methodology for the dynamic task performance of operating teams at nuclear power plants under dynamic and tactical environments such as a radiological accident. The thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation, together with quantification methods using a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. The model provides a systematic measure to evaluate time-dependent team effectiveness or performance as affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000 MWe pressurized water reactors with four on-the-job operating groups and one expert group familiar with accident sequences. The simulated team dynamic task performance, with reference to key plant parameter behavior and the team-specific organizational center of gravity and cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-benefit manner. Also, this model can be utilized as a systematic analysis tool for

  5. A nonlinear model for fluid flow in a multiple-zone composite reservoir including the quadratic gradient term

    International Nuclear Information System (INIS)

    Wang, Xiao-Lu; Fan, Xiang-Yu; Nie, Ren-Shi; Huang, Quan-Hua; He, Yong-Ming

    2013-01-01

    Based on material balance and Darcy's law, the governing equation with the quadratic pressure gradient term was deduced. The nonlinear model for fluid flow in a multiple-zone composite reservoir including the quadratic gradient term was then established and solved using a Laplace transform. A series of standard log–log type curves of 1-zone (homogeneous), 2-zone and 3-zone reservoirs were plotted and the nonlinear flow characteristics were analysed. The type curves governed by the coefficient of the quadratic gradient term (β) gradually deviate from those of the linear model as time elapses. Qualitative and quantitative analyses were implemented to compare the solutions of the linear and nonlinear models. The results showed that the differences in pressure transients between the linear and nonlinear models increase with elapsed time and β. Finally, a successful application of the theoretical model to field data shows that the nonlinear model will be a good tool to evaluate formation parameters more accurately. (paper)
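A useful property of the quadratic-gradient equation is that the substitution u = exp(β·p) linearizes it: if p satisfies p_xx + β(p_x)² = (1/d)·p_t, then u satisfies the ordinary diffusion equation u_t = d·u_xx. A 1D finite-difference sketch of this transform (for illustration only; the paper works in Laplace space with radial multi-zone geometry):

```python
import math

def diffuse_1d(field, d, dt, dx, steps):
    # explicit FTCS solver for u_t = d * u_xx with fixed-value ends;
    # stability requires d*dt/dx**2 <= 0.5
    f = list(field)
    lam = d * dt / dx**2
    for _ in range(steps):
        new = f[:]
        for j in range(1, len(f) - 1):
            new[j] = f[j] + lam * (f[j - 1] - 2.0 * f[j] + f[j + 1])
        f = new
    return f

def pressure_with_quadratic_term(p0, beta, d, dt, dx, steps):
    # u = exp(beta*p) turns p_xx + beta*(p_x)^2 = (1/d)*p_t
    # into the linear diffusion equation for u; solve for u, map back.
    u = [math.exp(beta * p) for p in p0]
    u = diffuse_1d(u, d, dt, dx, steps)
    return [math.log(v) / beta for v in u]
```

As β → 0 the nonlinear solution collapses onto the linear one, and the gap between them grows with β, mirroring the type-curve deviation described in the abstract.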

  6. Long-term strategic asset allocation: An out-of-sample evaluation

    NARCIS (Netherlands)

    Diris, B.F.; Palm, F.C.; Schotman, P.C.

    We evaluate the out-of-sample performance of a long-term investor who follows an optimized dynamic trading strategy. Although the dynamic strategy is able to benefit from predictability out-of-sample, a short-term investor using a single-period market timing strategy would have realized an almost

  7. The IIR evaluation model

    DEFF Research Database (Denmark)

    Borlund, Pia

    2003-01-01

    An alternative approach to evaluation of interactive information retrieval (IIR) systems, referred to as the IIR evaluation model, is proposed. The model provides a framework for the collection and analysis of IR interaction data. The aim of the model is two-fold: 1) to facilitate the evaluation ...

  8. Study on crystalline rock for evaluating method of long-term behavior. FY2012 (Contract research)

    International Nuclear Information System (INIS)

    Fukui, Katsunori; Hashiba, Kimihiro; Tanno, Takeo; Hikima, Ryoichi; Sanada, Hiroyuki; Sato, Toshinori

    2013-12-01

    Rock shows time-dependent behavior such as creep and relaxation. With respect to high-level radioactive waste disposal, knowledge of the long-term mechanical stability of shafts and galleries excavated in rock is required over a period of thousands of years after closure, as well as during construction and operation. Therefore, it is very important to understand the time-dependent behavior of rock in order to evaluate long-term mechanical stability. The purpose of this study is to determine the mechanisms of time-dependent behavior of rock through precise testing (e.g. laboratory creep tests), observation and measurement, and to develop methods for evaluating long-term mechanical stability. In previous work, testing techniques were established and basic evaluation methods were developed. Recently, some parameters required for simulation of time-dependent behavior were determined for the modeling of biotite granite (Toki granite) distributed around the Mizunami underground research laboratory. However, we were not able to obtain enough data to assess the reliability of the method used to evaluate these parameters. This report describes the results of the research activities carried out in fiscal year 2012. In Chapter 1, we provide background and an overview of this study. In Chapter 2, the results of a long-term creep test on Tage tuff, started in fiscal year 1997, are described. In Chapter 3, experimental results concerning the loading-rate dependency of rock strength were examined to understand the time-dependent behavior of rock. In Chapter 4, the stability of tunnels under conditions in which rock stress is larger than that around a circular tunnel was examined to obtain useful information for the future plan of in-situ tests in the underground research laboratory. (author)

  9. Development of Integrity Evaluation Technology for the Long-term Spent Fuel Dry Storage System (1st year Report)

    International Nuclear Information System (INIS)

    Choi, Jong Won; Kook, Dong Hak; Kim, Jun Sub

    2010-05-01

    Korea has operated 16 pressurized water reactors (PWRs) and plans to construct additional nuclear power reactors, all of the PWR type. This is causing a significant PWR spent fuel accumulation problem now and in the future. KRMC (Korea Radioactive waste Management Corporation), established in 2009, is charged with managing all kinds of radioactive waste produced in Korea. KRMC is considering spent fuel dry storage as an option to solve this spent fuel problem and is developing the related engineering techniques. KAERI (Korea Atomic Energy Research Institute) also participated in this development, focusing on evaluating the integrity of the spent fuel dry storage system over long-term operation. This report is the first-year research product. The aims of the first-year work scope were to survey and analyze models that could anticipate degradation phenomena of all dry storage components (spent fuel, structural materials, and equipment materials) and to select items for the tests planned in the next project stage. The major work areas consist of 'spent fuel degradation evaluation model development', 'test scenario development', 'long-term evaluation of structural material characteristics', and 'dry storage system structure degradation model development'. These works were successfully achieved. This report is expected to contribute to the second-year work, which includes degradation model development and test scenario development, and to the next project stage

  10. Modeling Wettability Variation during Long-Term Water Flooding

    Directory of Open Access Journals (Sweden)

    Renyi Cao

    2015-01-01

    Surface properties of rock affect oil recovery during water flooding. Oil-wet polar substances adsorbed on the surface of the rock are gradually desorbed during water flooding, and the original reservoir wettability changes towards water-wet; this change reduces the residual oil saturation and improves the oil displacement efficiency. However, there is a lack of an accurate model of wettability alteration during long-term water flooding, which leads to difficulties in history matching and unreliable forecasts using reservoir simulators. This paper summarizes the mechanism of wettability variation, characterizes the adsorption of polar substances during long-term water flooding from injected water or the aquifer, and relates the residual oil saturation and relative permeability to the polar substances adsorbed on clay and to the pore volumes of flooding water. A mathematical model is presented to simulate long-term water flooding and the model is validated with experimental results. The simulation results of long-term water flooding are also discussed.
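The idea of tying residual oil saturation and relative permeability to the progress of desorption can be sketched as an interpolation between oil-wet and water-wet Corey-type curves. All endpoint values and the pore-volume weighting below are hypothetical illustrations, not the paper's calibrated functions:

```python
def wettability_weight(pv_injected, pv_ref=50.0):
    # fraction of the oil-wet -> water-wet shift completed after
    # pv_injected pore volumes of water (pv_ref is a hypothetical scale)
    return min(pv_injected / pv_ref, 1.0)

def residual_oil(pv_injected, sor_oilwet=0.35, sor_waterwet=0.20):
    # residual oil saturation migrates between the two wettability endpoints
    w = wettability_weight(pv_injected)
    return (1.0 - w) * sor_oilwet + w * sor_waterwet

def kro(sw, pv_injected, swc=0.2, no=2.0):
    # Corey-type oil relative permeability with a shifting residual oil
    sor = residual_oil(pv_injected)
    s = (1.0 - sw - sor) / (1.0 - swc - sor)
    return max(s, 0.0) ** no
```

As injection proceeds, the residual oil endpoint falls and the oil relative permeability at a given saturation rises, which is how a simulator using such curves captures the improving displacement efficiency of long-term flooding.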

  11. Contribution and limits of geochemical calculation codes to evaluate the long term behavior of nuclear waste glasses

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, B; Crovisier, J L [Universite Louis Pasteur, Centre de Geochimie de la Surface, CNRS ULP, Ecole et Observatoire des Sciences de la Terre, 67 - Strasbourg (France)

    1997-07-01

    Geochemical models have been intensively developed by researchers for more than twenty-five years in order to better understand and/or predict the long-term stability or instability of water-rock systems. These geochemical codes were all first built on a thermodynamic approach deriving from the application of the Mass Action Law. The resulting first generation of models allowed the detection or prediction of possible mass transfers (thermodynamic models) between aqueous and mineral phases, including irreversible dissolution of primary minerals and/or precipitation near equilibrium of secondary mineral phases. The recent development of models based on combined thermodynamics and kinetics opens the field of time-dependent reaction prediction. This is crucial if one intends to combine geochemical and hydrological studies in so-called coupled models for transport and reaction calculations. All these models are progressively being applied to the prediction of the long-term behavior of mineral phases, and more specifically glasses. In order to succeed in that specific extension of the models, and also of the databases, there is a great need for additional new data from experimental approaches and from natural analogues. The modelling approach then also appears very useful for interpreting the results of experimental data and relating them to long-term data extracted from natural analogues. Specific functions for modelling solid-solution phases may also be used for describing the products of glass alteration. (authors)

  12. Modelling of chemical evolution of low pH cements at long term

    International Nuclear Information System (INIS)

    El Bitouri, Y.; Buffo-Lacarriere, L.; Sellier, A.; Bourbon, X.

    2015-01-01

    In the context of the underground radioactive waste repository, low-pH cements were developed to reduce interactions between concrete and the clay barrier. These cements contain high proportions of mineral additions such as silica fume, fly ash or blast furnace slag. The high ratio of cement replacement by pozzolanic additions reduces the pH through a global reduction of the Ca/Si ratio of the hydrates (consistent with that observed on CEM I pastes). In order to predict the short-term development of hydration for each component of this cement, a previously developed multiphasic hydration model is used. The model predicts the evolution of the hydration degree of each anhydrous phase and consequently the quantity of each hydrate in the paste (CH, aluminates, CSH with different Ca/Si ratios). However, this model is not suitable for determining the long-term mineralogical and chemical evolution of the material, due to internal changes induced by chemical imbalance between the initial hydrates. In order to evaluate the chemical characteristics of low-pH cement based materials, and thus assess their chemical stability in the context of radioactive waste storage, a complementary model of long-term chemical evolution is proposed. This original model is based on 'solid-solution' principles. It assumes that the microdiffusion of calcium plays a major role in explaining how the different Ca/Si ratios of the initial C-S-H converge toward a stabilized mean value. The main mechanisms and the full development of the model equations are presented first. Next, a comparison of the model with experimental data obtained from EDS (Energy Dispersive X-ray Spectroscopy) analysis on low-pH cement allows the model to be tested. (authors)
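The calcium-microdiffusion assumption, distinct initial C-S-H populations converging to a common Ca/Si ratio, can be illustrated with a toy relaxation that conserves total calcium. This is purely schematic; the rate and the population fractions are hypothetical, not the model's actual equations:

```python
def equilibrate_ca_si(ratios, si_fractions, rate=0.2, steps=50):
    """Relax each C-S-H population's Ca/Si ratio toward the Si-weighted
    mean, a stand-in for calcium microdiffusion between hydrates.

    si_fractions must sum to 1 so that the weighted mean (total Ca per
    total Si) is conserved at every step.
    """
    r = list(ratios)
    for _ in range(steps):
        mean = sum(x * f for x, f in zip(r, si_fractions))
        r = [x + rate * (mean - x) for x in r]
    return r
```

Starting from, say, a high-Ca/Si and a low-Ca/Si population, the spread shrinks geometrically while the calcium inventory stays fixed, which is the qualitative behavior the solid-solution model formalizes.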

  13. Evaluating the Value of High Spatial Resolution in National Capacity Expansion Models using ReEDS: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat; Cole, Wesley

    2016-07-01

    Power sector capacity expansion models (CEMs) have a broad range of spatial resolutions. This paper uses the Regional Energy Deployment System (ReEDS) model, a long-term national-scale electric sector CEM, to evaluate the value of high spatial resolution for CEMs. ReEDS models the United States with 134 load balancing areas (BAs) and captures the variability in existing generation parameters, future technology costs, performance, and resource availability using very high spatial resolution data, especially for wind and solar, which are modeled at 356 resource regions. In this paper we perform planning studies at three different spatial resolutions--native resolution (134 BAs), state level, and NERC region level--and evaluate how results change under different levels of spatial aggregation in terms of renewable capacity deployment and location, associated transmission builds, and system costs. The results are used to ascertain the value of highly geographically resolved models in terms of their impact on the relative competitiveness of renewable energy resources.

  14. Variable Renewable Energy in Long-Term Planning Models: A Multi-Model Perspective

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Frew, Bethany [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sun, Yinong [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bistline, John [Electric Power Research Inst. (EPRI), Knoxville, TN (United States); Blanford, Geoffrey [Electric Power Research Inst. (EPRI), Knoxville, TN (United States); Young, David [Electric Power Research Inst. (EPRI), Knoxville, TN (United States); Marcy, Cara [U.S. Energy Information Administration, Washington, DC (United States); Namovicz, Chris [U.S. Energy Information Administration, Washington, DC (United States); Edelman, Risa [US Environmental Protection Agency (EPA), Washington, DC (United States); Meroney, Bill [US Environmental Protection Agency (EPA), Washington, DC (United States); Sims, Ryan [US Environmental Protection Agency (EPA), Washington, DC (United States); Stenhouse, Jeb [US Environmental Protection Agency (EPA), Washington, DC (United States); Donohoo-Vallett, Paul [Dept. of Energy (DOE), Washington DC (United States)

    2017-11-01

    Long-term capacity expansion models of the U.S. electricity sector have long been used to inform electric sector stakeholders and decision-makers. With the recent surge in variable renewable energy (VRE) generators — primarily wind and solar photovoltaics — the need to appropriately represent VRE generators in these long-term models has increased. VRE generators are especially difficult to represent for a variety of reasons, including their variability, uncertainty, and spatial diversity. This report summarizes the analyses and model experiments that were conducted as part of two workshops on modeling VRE for national-scale capacity expansion models. It discusses the various methods for treating VRE among four modeling teams from the Electric Power Research Institute (EPRI), the U.S. Energy Information Administration (EIA), the U.S. Environmental Protection Agency (EPA), and the National Renewable Energy Laboratory (NREL). The report reviews the findings from the two workshops and emphasizes the areas where there is still need for additional research and development on analysis tools to incorporate VRE into long-term planning and decision-making. This research is intended to inform the energy modeling community on the modeling of variable renewable resources, and is not intended to advocate for or against any particular energy technologies, resources, or policies.

  15. 'Semi-realistic' F-term inflation model building in supergravity

    International Nuclear Information System (INIS)

    Kain, Ben

    2008-01-01

    We describe methods for building 'semi-realistic' models of F-term inflation. By semi-realistic we mean that they are built in, and obey the requirements of, 'semi-realistic' particle physics models. The particle physics models are taken to be effective supergravity theories derived from orbifold compactifications of string theory, and their requirements are taken to be modular invariance, absence of mass terms and stabilization of moduli. We review the particle physics models, their requirements, and tools and methods for building inflation models.

  16. Evaluating the Value of High Spatial Resolution in National Capacity Expansion Models using ReEDS

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, Venkat; Cole, Wesley

    2016-07-18

    This poster is based on the paper of the same name, presented at the IEEE Power & Energy Society General Meeting, July 18, 2016. Power sector capacity expansion models (CEMs) have a broad range of spatial resolutions. This paper uses the Regional Energy Deployment System (ReEDS) model, a long-term national scale electric sector CEM, to evaluate the value of high spatial resolution for CEMs. ReEDS models the United States with 134 load balancing areas (BAs) and captures the variability in existing generation parameters, future technology costs, performance, and resource availability using very high spatial resolution data, especially for wind and solar modeled at 356 resource regions. In this paper we perform planning studies at three different spatial resolutions - native resolution (134 BAs), state-level, and NERC region level - and evaluate how results change under different levels of spatial aggregation in terms of renewable capacity deployment and location, associated transmission builds, and system costs. The results are used to ascertain the value of high geographically resolved models in terms of their impact on relative competitiveness among renewable energy resources.

  17. Merons in a generally covariant model with Gursey term

    International Nuclear Information System (INIS)

    Akdeniz, K.G.; Smailagic, A.

    1982-10-01

    We study meron solutions of the generally covariant and Weyl invariant fermionic model with Gursey term. We find that, due to the presence of this term, merons can exist even without the cosmological constant. This is a new feature compared to previously studied models. (author)

  18. An application of RELAP5/MOD3 to the post-LOCA long term cooling performance evaluation

    International Nuclear Information System (INIS)

    Bang, Young Seok; Jung, Jae Won; Seul, Kwang Won; Kim, Hho Jung

    1998-01-01

    A realistic long-term calculation to be used in post-LOCA long-term cooling (LTC) analysis is described in this study, as required to resolve the post-LOCA LTC issues, including the concern over boric acid precipitation in the reactor core. The analysis scope is defined according to the LTC plan of UCN Units 3/4, and the plant calculation model is developed to suit the LTC procedure. The LTC sequences following cold leg small break LOCAs of 0.02 ft2 to 0.5 ft2 are calculated with RELAP5/MOD3.2.2. Based on the calculation results, the establishment of the shutdown cooling system entry condition and the behavior of boron transport are evaluated. The effect of model simplification is also investigated.

  19. Dose evaluation model for radionuclides released from the spent nuclear fuel reprocessing plant in Rokkasho

    International Nuclear Information System (INIS)

    Hisamatsu, Shun'ichi; Iyogi, Takashi; Inaba, Jiro; Chiang, Jing-Hsien; Suwa, Hiroji; Koide, Mitsuo

    2007-01-01

    A dose evaluation model was developed for radionuclides released from the spent nuclear fuel reprocessing plant located in Rokkasho, Aomori Prefecture, and now undergoing test operation. The model, suitable for medium- and long-term dose assessments of both prolonged and short-term releases of radionuclides to the atmosphere, was implemented on a PC. ARAC-2, a particle-tracing dispersion model coupled with a 3-D wind field calculated by a mass-conservative model, was adopted as the atmospheric dispersion model. The terrestrial transfer model included movement in soil and groundwater as well as an agricultural and livestock farming system. The available site-specific social and environmental characteristics were incorporated in the model. Crop growth was also modelled, and the radionuclides absorbed were calculated from the weight increase between the start of deposition and harvest, together with transfer factors. Most of the computer code system was completed by 2005, and this paper reports the results of the development. (author)
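
    As a rough illustration of one link in the terrestrial transfer chain described above (deposition on soil, soil-to-crop transfer, ingestion dose), a minimal calculation might look like the sketch below. All values are invented, order-of-magnitude assumptions, not parameters of the Rokkasho model.

    ```python
    # One link of a terrestrial-transfer dose chain (illustrative values):
    # soil activity -> crop concentration via a transfer factor -> ingestion dose.
    deposition = 50.0          # Bq/m^2 deposited on soil (assumed)
    soil_mixing_mass = 65.0    # kg/m^2 effective soil mixing mass (assumed)
    transfer_factor = 0.05     # crop/soil concentration ratio (assumed)
    intake = 100.0             # kg of the crop eaten per year (assumed)
    dose_coeff = 1.3e-8        # Sv per Bq ingested, nuclide-dependent (assumed)

    c_soil = deposition / soil_mixing_mass   # Bq per kg of soil
    c_crop = transfer_factor * c_soil        # Bq per kg of crop
    dose = c_crop * intake * dose_coeff      # Sv per year
    print(f"annual ingestion dose: {dose:.2e} Sv/y")
    ```

    A full model of the kind the abstract describes would, of course, drive each factor dynamically (dispersion, growth dilution, harvest timing) rather than use fixed values.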

  20. Long-term reliability evaluation of nuclear containments with tendon force degradation

    International Nuclear Information System (INIS)

    Kim, Sang-Hyo; Choi, Moon-Seock; Joung, Jung-Yeun; Kim, Kun-Soo

    2013-01-01

    Highlights: • A probabilistic model of long-term tendon force degradation is developed. • Using the model, we perform a reliability evaluation of nuclear containment. • The analysis is also performed for the case with a strict maintenance programme. • We show how to satisfy the target safety in containments facing life extension. - Abstract: The long-term reliability of nuclear containment is important for operating nuclear power plants. In particular, long-term reliability should be clarified when the service life of nuclear containment is being extended. This study focuses not only on determining the reliability of nuclear containment but also on presenting the reliability improvement achieved by strengthening the containment itself or by running a strict maintenance programme. The degradation characteristics of tendon force are estimated from data recorded during in-service inspection of containments. A reliability analysis is conducted for the limit state of through-wall cracking, a conservative but most critical limit state. The results of this analysis indicate that reliability is lowest at 3/4 height of the containment wall; this location is therefore the most vulnerable for the specific limit state considered. Furthermore, changes in structural reliability owing to an increase in the number of inspected tendons are analysed to verify the effect of the maintenance programme's intensity on expected containment reliability. In the last part of this study, an example of achieving the target reliability of nuclear containment by strengthening its structural resistance is presented. A case study exemplifies the effect of strengthening work on containment reliability, especially during extended service life.
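
    The probabilistic flavour of such an analysis can be sketched with a toy Monte Carlo reliability estimate. The degradation law, scatter and demand threshold below are invented for illustration; they are not the values identified in the study.

    ```python
    import math
    import random

    random.seed(1)

    def tendon_force(t_years, f0=1000.0, rate=0.004, cov=0.08):
        """Sample a degraded tendon force (kN) at time t.

        Log-linear mean loss with Gaussian scatter -- purely
        illustrative parameters, not the inspection-derived ones.
        """
        mean = f0 * (1.0 - rate * math.log1p(t_years))
        return random.gauss(mean, cov * mean)

    def reliability(t_years, demand=900.0, n=100_000):
        """Crude Monte Carlo estimate of P(tendon force > demand)."""
        ok = sum(tendon_force(t_years) > demand for _ in range(n))
        return ok / n

    for t in (10, 40, 60):
        print(f"t = {t:2d} y  reliability ~ {reliability(t):.3f}")
    ```

    A real containment analysis would replace the scalar force with the through-wall-cracking limit state and calibrate the degradation law to in-service inspection data.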

  1. Challenges in integrating short-term behaviour in a mixed-fishery Management Strategies Evaluation frame: a case study of the North Sea flatfish fishery

    NARCIS (Netherlands)

    Andersen, B.S.; Vermard, Y.; Ulrich, C.; Hutton, T.; Poos, J.J.

    2010-01-01

    This study presents a fleet-based bioeconomic simulation model applied to the international mixed flatfish fishery in the North Sea. The model uses a Management Strategies Evaluation framework, including a discrete choice model that accounts for short-term temporal changes in effort allocation across fisheries.

  2. Measuring Success in Obesity Prevention: A Synthesis of Health Promotion Switzerland's Long-Term Monitoring and Evaluation Strategy

    Directory of Open Access Journals (Sweden)

    Günter Ackermann

    2015-01-01

    Aims: Since 2007, Health Promotion Switzerland has implemented a national priority program for a healthy body weight. This article provides insight into the methodological challenges and results of the program evaluation. Methods: Evaluation of the long-term program required targeted monitoring and evaluation projects addressing different outcome levels. The evaluation was carried out according to the Swiss Model for Outcome Classification (SMOC), a model designed to classify the effects of health promotion and prevention efforts. Results: The results presented in this article emphasize both content and methods. The national program successfully achieved outcomes on many different levels within complex societal structures. The evaluation system built around the SMOC enabled assessment of program progress and the development of key indicators. However, it is not possible to determine definitively to what extent the national program helped stabilize the prevalence of obesity in Switzerland. Conclusion: The model has shown its utility in providing a basis for evaluation and monitoring of the national program. Continuous analysis of data from evaluation and monitoring has made it possible to check the plausibility of suspected causal relationships as well as to establish an overall perspective and assessment of effectiveness supported by a growing body of evidence.

  3. Experimental models of tracheobronchial stenoses: a useful tool for evaluating airway stents.

    Science.gov (United States)

    Marquette, C H; Mensier, E; Copin, M C; Desmidt, A; Freitag, L; Witt, C; Petyt, L; Ramon, P

    1995-09-01

    Stent implantation is a conservative alternative to open operation for treating benign tracheobronchial strictures. Most of the presently available stents were primarily designed for endovascular use. Their respiratory use entails a risk of iatrogenic complications. From a scientific and from an ethical point of view these risks justify preclinical evaluation of new respiratory stents in experimental models of central airway stenoses. Therefore, an attempt was made to develop such models in piglets and adult minipigs. Tracheal stenoses were obtained by creating first a segmental tracheomalacia through extramucosal resection of cartilaginous arches. The fibrous component of the stenoses was then obtained through bronchoscopic application of a caustic agent causing progressive deep mucosal and submucosal injury. Stenoses of the main bronchi were created by topical application of the caustic agent only. These models demonstrated the typical features of benign fibromalacic tracheobronchial stenoses with constant recurrence after mechanical dilation. Preliminary experiments showed that short-term problems of tolerance of stent prototypes are easily demonstrable in these models. These experimental models, which simulate quite realistically human diseases, offer the opportunity to perfect new tracheobronchial stents specifically designed for respiratory use and to evaluate their long-term tolerance before their use in humans.

  4. [The Brazilian National Health Surveillance Agency performance evaluation at the management contract model].

    Science.gov (United States)

    Moreira, Elka Maltez de Miranda; Costa, Ediná Alves

    2010-11-01

    The Brazilian National Health Surveillance Agency (Anvisa) is supervised by the Ministry of Health by means of a management contract, a performance evaluation tool. This case study was aimed at describing and analyzing Anvisa's performance evaluation model in light of the agency's institutional purpose, according to the following analytical categories: the formalization of the management contract, evaluation tools, evaluators, and institutional performance. Semi-structured interviews and document analysis revealed that Anvisa signed only one management contract with the Ministry of Health, in 1999, which was updated by four amendments. The Collegiate Board of Directors and the Advisory Center for Strategic Management play the role of Anvisa's internal evaluators, and an Assessing Committee, formed by the Ministry of Health, constitutes its external evaluator. Three phases were identified in the evaluation model: the structuring of the new management model (1999-2000), legitimation with regard to the productive segment (2001-2004), and widespread legitimation (2005). The best performance was recorded in 2000 (86.05%) and the worst in 2004 (40.00%). The evaluation model was shown to have contributed little towards the agency's institutional purpose and towards measuring the effectiveness of the implemented actions.

  5. Evaluation of short-term and long-term stability of emulsions by centrifugation and NMR

    International Nuclear Information System (INIS)

    Tcholakova, S.; Denkov, N.; Ivanov, I.; Marinov, R.

    2004-01-01

    The effect of storage time on the coalescence stability and drop size distribution of emulsions stabilized by egg yolk and whey protein concentrate is studied. The emulsion stability is evaluated by centrifugation, whereas the drop size distribution is measured by means of NMR and optical microscopy. The experimental results show that there is no general relation between the emulsion stability and the changes in the mean drop diameter upon shelf-storage of protein emulsions. On the other hand, it is shown that higher short-term stability, measured by centrifugation immediately after emulsion preparation, corresponds to higher long-term stability (after shelf-storage for up to 60 days) for emulsions stabilized by the same type of emulsifier. In this way, information about the long-term stability of emulsions can be obtained in a relatively short period of time. (authors)

  6. Modeling long-term dynamics of electricity markets

    International Nuclear Information System (INIS)

    Olsina, Fernando; Garces, Francisco; Haubrich, H.-J.

    2006-01-01

    In the last decade, many countries have restructured their electricity industries by introducing competition in their power generation sectors. Although some restructurings have been regarded as successful, the short experience accumulated with liberalized power markets does not allow making any well-founded assertion about their long-term behavior. Long-term prices and long-term supply reliability are now at the center of interest. This concerns firms considering investments in generation capacity as well as regulatory authorities interested in assuring long-term supply adequacy and the stability of power markets. In order to gain significant insight into the long-term behavior of liberalized power markets, a simulation model based on system dynamics is proposed in this paper and the underlying mathematical formulations are extensively discussed. Unlike classical market models based on the assumption that market outcomes replicate the results of a centrally made optimization, the approach presented here focuses on replicating the system structure of power markets and the logic of relationships among system components in order to derive their dynamical response. The simulations suggest that there might be serious problems in adjusting generation capacity early enough to maintain stable reserve margins, and consequently stable long-term price levels. Because of feedback loops embedded in the structure of power markets and the existence of some time lags, the long-term market development might exhibit quite volatile behavior. By varying some exogenous inputs, a sensitivity analysis is carried out to assess the influence of these factors on the long-run market dynamics.
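
    The boom-and-bust mechanism the abstract describes (scarcity pricing, an investment rule, and a construction lag forming a feedback loop) can be caricatured in a few lines. Every number below is an illustrative assumption, not part of the authors' model.

    ```python
    # Toy system-dynamics loop: tight reserve margin -> high price ->
    # new investment -> capacity arrives after a construction lag.
    def simulate(years=40, lag=4):
        demand, capacity = 100.0, 120.0
        pipeline = [0.0] * lag              # projects under construction
        margins = []
        for _ in range(years):
            margin = (capacity - demand) / demand
            price = max(20.0, 60.0 - 300.0 * margin)   # scarcity pricing
            build = max(0.0, 0.4 * (price - 35.0))     # investment rule
            pipeline.append(build)                     # starts construction
            capacity += pipeline.pop(0) - 0.02 * capacity  # 2%/yr retirement
            demand *= 1.02                             # 2%/yr demand growth
            margins.append(margin)
        return margins

    m = simulate()
    print(f"reserve margin range: {min(m):.2f} to {max(m):.2f}")
    ```

    Even this caricature shows the point made above: because investment responds to price only after the lag, the reserve margin overshoots in both directions instead of settling at a stable level.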

  7. Low-level radioactive waste source term model development and testing: Topical report

    International Nuclear Information System (INIS)

    Sullivan, T.M.; Kempf, C.R.; Suen, C.J.; Mughabghab, S.M.

    1988-08-01

    The Low-Level Waste Source Term Evaluation Project has the objective to develop a system model capable of predicting radionuclide release rates from a shallow land burial facility. The previous topical report for this project discussed the framework and methodology for developing a system model and divided the problem into four compartments: water flow, container degradation, waste form leaching, and radionuclide transport. Each of these compartments is described by submodels which will be coupled into the system model. From February 1987 to March 1988, computer models have been selected to predict water flow (FEMWATER) and radionuclide transport (FEMWASTE) and separate models have been developed to predict pitting corrosion of steel containers and leaching from porous waste forms contained in corrodible containers. This report discusses each of the models in detail and presents results obtained from applying the models to shallow land burial trenches over a range of expected conditions. 68 refs., 34 figs., 14 tabs
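
    As a rough sketch of the kind of container-degradation submodel described above (the report's actual pitting-corrosion model is not reproduced here), one might represent the deepest pit as a power law of time and invert it for the wall-penetration time. The constants are invented.

    ```python
    # Illustrative pitting-corrosion submodel: deepest pit depth
    # d = k * t**n; the container is breached when d reaches the
    # wall thickness. k, n and the wall thickness are assumptions.
    def pit_depth(t_years, k=0.08, n=0.45):
        """Deepest pit depth (cm) after t years."""
        return k * t_years ** n

    def breach_time(wall_cm=0.6, k=0.08, n=0.45):
        """Invert d = k * t**n for the time the wall is penetrated."""
        return (wall_cm / k) ** (1.0 / n)

    print(f"first penetration after ~{breach_time():.0f} years")
    ```

    In the coupled system model, such a breach time would gate when the waste-form leaching submodel starts releasing radionuclides to the transport calculation.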

  8. Models and Rules of Evaluation in International Accounting

    Directory of Open Access Journals (Sweden)

    Niculae Feleaga

    2006-04-01

    The accounting procedures cannot be analyzed without a previous evaluation. Value is in general a very subjective issue, usually the result of a monetary evaluation made of a specific asset, group of assets or entities, or of some rendered services. Within the economic sciences, value has its own deep history. In accounting, the concept of value had a late and fragile start. The term value must not be confused with cost, even though value is frequently measured through costs. At the origin of the international accounting standards lies the framework for preparing, presenting and disclosing the financial statements. The framework serves as a reference matrix, as a standard of standards, as a constitution of financial accounting. According to the international framework, the financial statements use different evaluation bases: the historical cost, the current cost, the realisable (settlement) value, and the present value (the present value of cash flows). Choosing the evaluation basis and the capital maintenance concept will eventually determine the accounting evaluation model used in preparing the financial statements of a company. The multitude of accounting evaluation models differ from one another in the degrees of relevance and reliability of the accounting information, and therefore accountants (the preparers of financial statements) must try to balance these two main qualitative characteristics of financial information.

  9. Models and Rules of Evaluation in International Accounting

    Directory of Open Access Journals (Sweden)

    Liliana Feleaga

    2006-06-01

    The accounting procedures cannot be analyzed without a previous evaluation. Value is in general a very subjective issue, usually the result of a monetary evaluation made of a specific asset, group of assets or entities, or of some rendered services. Within the economic sciences, value has its own deep history. In accounting, the concept of value had a late and fragile start. The term value must not be confused with cost, even though value is frequently measured through costs. At the origin of the international accounting standards lies the framework for preparing, presenting and disclosing the financial statements. The framework serves as a reference matrix, as a standard of standards, as a constitution of financial accounting. According to the international framework, the financial statements use different evaluation bases: the historical cost, the current cost, the realisable (settlement) value, and the present value (the present value of cash flows). Choosing the evaluation basis and the capital maintenance concept will eventually determine the accounting evaluation model used in preparing the financial statements of a company. The multitude of accounting evaluation models differ from one another in the degrees of relevance and reliability of the accounting information, and therefore accountants (the preparers of financial statements) must try to balance these two main qualitative characteristics of financial information.

  10. The cointegrated vector autoregressive model with general deterministic terms

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    2017-01-01

    In the cointegrated vector autoregression (CVAR) literature, deterministic terms have until now been analyzed on a case-by-case, or as-needed, basis. We give a comprehensive unified treatment of deterministic terms in the additive model X(t) = Z(t) + Y(t), where Z(t) belongs to a large class of deterministic regressors and Y(t) is a zero-mean CVAR. We suggest an extended model that can be estimated by reduced rank regression, give a condition for when the additive and extended models are asymptotically equivalent, and provide an algorithm for deriving the additive model parameters from the extended model parameters. We derive asymptotic properties of the maximum likelihood estimators and discuss tests for rank and tests on the deterministic terms. In particular, we give conditions under which the estimators are asymptotically (mixed) Gaussian, such that associated tests are χ²-distributed.
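
    Written out in standard CVAR notation (a sketch; the lag length k, loadings α and cointegration vectors β are the usual objects of this literature, not symbols defined in the abstract), the additive model pairs the observed series with a zero-mean error-correction system:

    ```latex
    % Additive model: observables = deterministic term + zero-mean CVAR
    X_t = Z_t + Y_t, \qquad
    \Delta Y_t = \alpha \beta' Y_{t-1}
               + \sum_{i=1}^{k-1} \Gamma_i \Delta Y_{t-i}
               + \varepsilon_t .
    % Substituting Y_t = X_t - Z_t yields the extended model in X_t,
    % whose extra regressors are built from Z_t; that form is the one
    % estimable by reduced rank regression.
    ```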

  11. The cointegrated vector autoregressive model with general deterministic terms

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Morten Ørregaard

    In the cointegrated vector autoregression (CVAR) literature, deterministic terms have until now been analyzed on a case-by-case, or as-needed, basis. We give a comprehensive unified treatment of deterministic terms in the additive model X(t) = Z(t) + Y(t), where Z(t) belongs to a large class of deterministic regressors and Y(t) is a zero-mean CVAR. We suggest an extended model that can be estimated by reduced rank regression, give a condition for when the additive and extended models are asymptotically equivalent, and provide an algorithm for deriving the additive model parameters from the extended model parameters. We derive asymptotic properties of the maximum likelihood estimators and discuss tests for rank and tests on the deterministic terms. In particular, we give conditions under which the estimators are asymptotically (mixed) Gaussian, such that associated tests are χ²-distributed.

  12. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P.

    2012-09-01

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
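
    The core BBN idea, mapping uncertain plant observations to a probability distribution over pre-calculated source-term categories, can be illustrated with a toy network evaluated by enumeration. The structure and all probabilities below are invented for illustration; the real RASTEP network is far larger and plant-specific.

    ```python
    # Toy Bayesian-network sketch: two binary plant states feeding a
    # "source-term category" node, queried given an observation.
    p_damage = {True: 0.3, False: 0.7}           # P(core damage)
    p_release = {                                # P(category | damage, sprays)
        (True, True):   {"small": 0.8, "large": 0.2},
        (True, False):  {"small": 0.3, "large": 0.7},
        (False, True):  {"small": 1.0, "large": 0.0},
        (False, False): {"small": 1.0, "large": 0.0},
    }

    def posterior_release(observed_sprays):
        """P(source-term category | sprays observation), by enumerating
        over the unobserved core-damage state and renormalizing."""
        weights = {"small": 0.0, "large": 0.0}
        for damage in (True, False):
            for cat, p in p_release[(damage, observed_sprays)].items():
                weights[cat] += p_damage[damage] * p
        total = sum(weights.values())
        return {cat: w / total for cat, w in weights.items()}

    print("sprays working:", posterior_release(True))
    print("sprays failed: ", posterior_release(False))
    ```

    RASTEP's advantage over such hand enumeration is that the BBN machinery scales this updating to many interdependent observations arriving in real time during an accident.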

  13. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)

    2012-09-15

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)

  14. Models with oscillator terms in noncommutative quantum field theory

    International Nuclear Information System (INIS)

    Kronberger, E.

    2010-01-01

    The main focus of this Ph.D. thesis is noncommutative models involving oscillator terms in the action. Historically the first is the successful Grosse-Wulkenhaar (G.W.) model, which has already been proven to be renormalizable to all orders of perturbation theory. Remarkably, it is furthermore capable of solving the Landau ghost problem. In a first step, we generalized the G.W. model to gauge theories in a very straightforward way, where the action is BRS invariant and exhibits the good damping properties of the scalar theory by using the same propagator, the so-called Mehler kernel. To be able to handle some more involved one-loop graphs, we programmed a powerful Mathematica package capable of analytically computing Feynman graphs with many terms. The result of those investigations is that new terms, originally not present in the action, arise; this led us to the conclusion that we should instead start from a theory where those terms are already built in. Fortunately, there is an action containing this complete set of terms. It can be obtained by coupling a gauge field to the scalar field of the G.W. model, integrating out the latter, and thus 'inducing' a gauge theory. Hence the model is called Induced Gauge Theory. Despite the advantage that it is by construction completely gauge invariant, it also contains some unphysical terms linear in the gauge field. We could get rid of these terms using a special gauge dedicated to this purpose. Within this gauge we could again establish the Mehler kernel as the gauge field propagator. Furthermore, we were able to calculate the ghost propagator, which turned out to be very involved. We were thus able to carry out the first few loop computations, which show the expected behavior. The next step is to show renormalizability of the model, toward which some hints are also given. (author)

  15. Model Test Bed for Evaluating Wave Models and Best Practices for Resource Assessment and Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Neary, Vincent Sinclair [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Yang, Zhaoqing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Wang, Taiping [Pacific Northwest National Lab. (PNNL), Richland, WA (United States). Coastal Sciences Division; Gunawan, Budi [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies; Dallman, Ann Renee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Water Power Technologies

    2016-03-01

    A wave model test bed is established to benchmark, test and evaluate spectral wave models and modeling methodologies (i.e., best practices) for predicting the wave energy resource parameters recommended by the International Electrotechnical Commission, IEC TS 62600-101 Ed. 1.0 © 2015. Among other benefits, the model test bed can be used to investigate the suitability of different models, specifically what source terms should be included in spectral wave models under different wave climate conditions and for different classes of resource assessment. The overarching goal is to use these investigations to provide industry guidance for model selection and modeling best practices depending on the wave site conditions and desired class of resource assessment. Modeling best practices are reviewed, and limitations and knowledge gaps in predicting wave energy resource parameters are identified.

  16. Discrete choice models with multiplicative error terms

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Bierlaire, Michel

    2009-01-01

    The conditional indirect utility of many random utility maximization (RUM) discrete choice models is specified as a sum of an index V depending on observables and an independent random term ε. In general, the universe of RUM consistent models is much larger, even fixing some specification of V due...
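
    The additive RUM structure mentioned above yields, for i.i.d. Gumbel error terms, the familiar multinomial logit choice probabilities. A multiplicative specification, U_i = V_i · ε_i with V_i > 0, can be handled by taking logs so that the same machinery applies to ln V; this is only a sketch of the paper's idea, with no claim to reproduce its estimator.

    ```python
    import math

    def logit_probs(V):
        """Additive RUM, U_i = V_i + eps_i with i.i.d. Gumbel eps:
        multinomial logit choice probabilities."""
        e = [math.exp(v) for v in V]
        s = sum(e)
        return [x / s for x in e]

    def multiplicative_probs(V):
        """Multiplicative errors, U_i = V_i * eps_i with V_i > 0:
        taking logs makes the model additive in ln V_i, so the
        logit formula is applied to ln V (illustrative sketch)."""
        return logit_probs([math.log(v) for v in V])

    print(logit_probs([1.0, 2.0]))
    print(multiplicative_probs([1.0, 2.0]))
    ```

    Note the behavioural difference: under the multiplicative sketch the choice probabilities are proportional to V itself, so doubling both utilities leaves them unchanged, unlike the additive case.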

  17. Long-term predictive capability of erosion models

    Science.gov (United States)

    Veerabhadra, P.; Buckley, D. H.

    1983-01-01

    A brief overview is presented of long-term cavitation and liquid-impingement erosion and of the modeling methods proposed by different investigators, including the curve-fit approach. A table was prepared to highlight the number of variables each model requires in order to compute the erosion-versus-time curves. A power-law relation based on the average erosion rate is suggested, which may solve several modeling problems.
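
    A power-law relation of the kind suggested above, E = a·t^b, is typically fitted by least squares in log-log space. The data below are synthetic and the fitting recipe is a generic sketch, not the authors' procedure.

    ```python
    import math

    # Synthetic cumulative-erosion data (assumed to follow E ~ a * t**b)
    t = [1.0, 2.0, 4.0, 8.0, 16.0]   # exposure time
    E = [0.9, 2.1, 4.2, 8.3, 17.0]   # cumulative erosion

    # Ordinary least squares on (ln t, ln E) gives slope b, intercept ln a
    lx = [math.log(v) for v in t]
    ly = [math.log(v) for v in E]
    n = len(t)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    a = math.exp(my - b * mx)
    print(f"E(t) ~ {a:.2f} * t^{b:.2f}")
    ```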

  18. A systematic review of modelling approaches in economic evaluations of health interventions for drug and alcohol problems.

    Science.gov (United States)

    Hoang, Van Phuong; Shanahan, Marian; Shukla, Nagesh; Perez, Pascal; Farrell, Michael; Ritter, Alison

    2016-04-13

    The overarching goal of health policies is to maximize health and societal benefits. Economic evaluations can play a vital role in assessing whether or not such benefits occur. This paper reviews the application of modelling techniques in economic evaluations of drug and alcohol interventions with regard to (i) the modelling paradigms themselves; (ii) the perspectives on costs and benefits and (iii) the time frame. Papers that use modelling approaches for economic evaluations of drug and alcohol interventions were identified by carrying out searches of major databases. Thirty-eight papers met the inclusion criteria. Overall, cohort Markov models remain the most popular approach, followed by decision trees, individual-based models and system dynamics (SD) models. Most of the papers adopted a long-term time frame to reflect the long-term costs and benefits of health interventions. However, it was fairly common among the reviewed papers to adopt a narrow perspective that only takes into account costs and benefits borne by the health care sector. This review paper informs policy makers about the availability of modelling techniques that can be used to enhance the quality of economic evaluations for drug and alcohol treatment interventions.
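    The cohort Markov approach that dominates this literature can be sketched in a few lines: a cohort is pushed through a transition matrix cycle by cycle, accumulating discounted costs and QALYs. The states, transition probabilities, costs and utilities below are invented for illustration and are not taken from any of the reviewed evaluations.

```python
import numpy as np

# Hypothetical 3-state cohort Markov model (using / abstinent / dead).
# All numbers are illustrative assumptions, not values from the review.
P = np.array([
    [0.80, 0.15, 0.05],   # from "using"
    [0.10, 0.88, 0.02],   # from "abstinent"
    [0.00, 0.00, 1.00],   # "dead" is absorbing
])
cost = np.array([2000.0, 500.0, 0.0])     # annual cost per state
utility = np.array([0.70, 0.90, 0.0])     # annual QALY weight per state

state = np.array([1.0, 0.0, 0.0])         # whole cohort starts in "using"
discount = 0.03
total_cost = total_qaly = 0.0
for year in range(20):                    # 20-year (long-term) horizon
    df = 1.0 / (1.0 + discount) ** year   # discount factor for this cycle
    total_cost += df * state @ cost
    total_qaly += df * state @ utility
    state = state @ P                     # advance the cohort one cycle

print(total_cost, total_qaly)
```

    Comparing `total_cost` and `total_qaly` between an intervention arm and a comparator arm (each with its own transition matrix) yields the incremental cost-effectiveness ratio the reviewed papers report.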

  19. Evaluating Translational Research: A Process Marker Model

    Science.gov (United States)

    Trochim, William; Kane, Cathleen; Graham, Mark J.; Pincus, Harold A.

    2011-01-01

    Abstract Objective: We examine the concept of translational research from the perspective of evaluators charged with assessing translational efforts. One of the major tasks for evaluators involved in translational research is to help assess efforts that aim to reduce the time it takes to move research to practice and health impacts. Another is to assess efforts that are intended to increase the rate and volume of translation. Methods: We offer an alternative to the dominant contemporary tendency to define translational research in terms of a series of discrete “phases.” Results: We contend that this phased approach has been confusing and that it is insufficient as a basis for evaluation. Instead, we argue for the identification of key operational and measurable markers along a generalized process pathway from research to practice. Conclusions: This model provides a foundation for the evaluation of interventions designed to improve translational research and the integration of these findings into a field of translational studies. Clin Trans Sci 2011; Volume 4: 153–162 PMID:21707944

  20. Variable Renewable Energy in Long-Term Planning Models: A Multi-Model Perspective

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Frew, Bethany A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mai, Trieu T. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sun, Yinong [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Bistline, John [Electric Power Research Inst., Palo Alto, CA (United States); Blanford, Geoffrey [Electric Power Research Inst., Palo Alto, CA (United States); Young, David [Electric Power Research Inst., Palo Alto, CA (United States); Marcy, Cara [Energy Information Administration, Washington, DC (United States); Namovicz, Chris [Energy Information Administration, Washington, DC (United States); Edelman, Risa [Environmental Protection Agency, Washington, DC (United States); Meroney, Bill [Environmental Protection Agency; Sims, Ryan [Environmental Protection Agency; Stenhouse, Jeb [Environmental Protection Agency; Donohoo-Vallett, Paul [U.S. Department of Energy

    2017-11-03

    Long-term capacity expansion models of the U.S. electricity sector have long been used to inform electric sector stakeholders and decision makers. With the recent surge in variable renewable energy (VRE) generators - primarily wind and solar photovoltaics - the need to appropriately represent VRE generators in these long-term models has increased. VRE generators are especially difficult to represent for a variety of reasons, including their variability, uncertainty, and spatial diversity. To assess current best practices, share methods and data, and identify future research needs for VRE representation in capacity expansion models, four capacity expansion modeling teams from the Electric Power Research Institute, the U.S. Energy Information Administration, the U.S. Environmental Protection Agency, and the National Renewable Energy Laboratory conducted two workshops on VRE modeling for national-scale capacity expansion models. The workshops covered a wide range of VRE topics, including transmission and VRE resource data, VRE capacity value, dispatch and operational modeling, distributed generation, and temporal and spatial resolution. The objectives of the workshops were both to better understand these topics and to improve the representation of VRE across the suite of models. Given these goals, each team incorporated model updates and performed additional analyses between the first and second workshops. This report summarizes the analyses and model 'experiments' that were conducted as part of these workshops as well as the various methods for treating VRE among the four modeling teams. The report also reviews the findings and lessons learned from the two workshops. We emphasize the areas where there is still need for additional research and development on analysis tools to incorporate VRE into long-term planning and decision-making.

  1. Evaluating fugacity models for trace components in landfill gas

    Energy Technology Data Exchange (ETDEWEB)

    Shafi, Sophie [Integrated Waste Management Centre, Sustainable Systems Department, Building 61, School of Industrial and Manufacturing Science, Cranfield University, Cranfield, Bedfordshire MK43 0AL (United Kingdom); Sweetman, Andrew [Department of Environmental Science, Lancaster University, Lancaster LA1 4YQ (United Kingdom); Hough, Rupert L. [Integrated Waste Management Centre, Sustainable Systems Department, Building 61, School of Industrial and Manufacturing Science, Cranfield University, Cranfield, Bedfordshire MK43 0AL (United Kingdom); Smith, Richard [Integrated Waste Management Centre, Sustainable Systems Department, Building 61, School of Industrial and Manufacturing Science, Cranfield University, Cranfield, Bedfordshire MK43 0AL (United Kingdom); Rosevear, Alan [Science Group - Waste and Remediation, Environment Agency, Reading RG1 8DQ (United Kingdom); Pollard, Simon J.T. [Integrated Waste Management Centre, Sustainable Systems Department, Building 61, School of Industrial and Manufacturing Science, Cranfield University, Cranfield, Bedfordshire MK43 0AL (United Kingdom)]. E-mail: s.pollard@cranfield.ac.uk

    2006-12-15

    A fugacity approach was evaluated to reconcile loadings of vinyl chloride (chloroethene), benzene, 1,3-butadiene and trichloroethylene in waste with concentrations observed in landfill gas monitoring studies. An evaluative environment derived from fictitious but realistic properties such as volume, composition, and temperature, constructed with data from the Brogborough landfill (UK) test cells, was used to test a fugacity approach to generating the source term for use in landfill gas risk assessment models (e.g. GasSim). SOILVE, a dynamic Level II model adapted here for landfills, showed greatest utility for benzene and 1,3-butadiene, modelled under anaerobic conditions over a 10 year simulation. Modelled concentrations of these components (95 300 µg m-3; 43 µg m-3) fell within measured ranges observed in gas from landfills (24 300-180 000 µg m-3; 20-70 µg m-3). This study highlights the need (i) for representative and time-referenced biotransformation data; (ii) to evaluate the partitioning characteristics of organic matter within waste systems and (iii) for a better understanding of the role that gas extraction rate (flux) plays in producing trace component concentrations in landfill gas. - Fugacity for trace components in landfill gas.
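    The equilibrium core of such a fugacity model is easy to sketch: at equilibrium every compartment shares one fugacity f = M / Σ(V·Z), from which per-compartment amounts and concentrations follow. The compartments, volumes and fugacity capacities (Z values) below are invented illustrative numbers, not the study's Brogborough parameterization, and a static Level I balance is shown rather than the dynamic Level II SOILVE model.

```python
# Minimal Level I fugacity sketch (equilibrium partitioning) in the spirit
# of the Mackay-type models the study builds on. All values are assumed,
# loosely representing gas, leachate and waste solids in a landfill cell.
compartments = {
    #            V (m3)   Z (mol m-3 Pa-1)
    "gas":      (1.0e5,   4.1e-4),
    "leachate": (2.0e4,   1.0e-2),
    "solids":   (5.0e4,   5.0e-1),
}
M_total = 100.0  # total mol of trace component in the cell (assumed)

# At equilibrium, all compartments share one fugacity f = M / sum(V*Z)
f = M_total / sum(V * Z for V, Z in compartments.values())

# Amount in each compartment (mol) and the gas-phase concentration (mol m-3)
amounts = {name: V * Z * f for name, (V, Z) in compartments.items()}
conc_gas = compartments["gas"][1] * f

print(f, amounts, conc_gas)
```

    The mass balance closes by construction: the compartment amounts sum back to `M_total`, and `conc_gas` is the quantity compared against landfill gas monitoring data.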

  2. Evaluation of some infiltration models and hydraulic parameters

    International Nuclear Information System (INIS)

    Haghighi, F.; Gorji, M.; Shorafa, M.; Sarmadian, F.; Mohammadi, M. H.

    2010-01-01

    The evaluation of infiltration characteristics and some parameters of infiltration models such as sorptivity and final steady infiltration rate in soils are important in agriculture. The aim of this study was to evaluate some of the most common models used to estimate final soil infiltration rate. The equality of final infiltration rate with saturated hydraulic conductivity (Ks) was also tested. Moreover, values of the sorptivity estimated from Philip's model were compared to estimates by selected pedotransfer functions (PTFs). The infiltration experiments used the double-ring method on soils with two different land uses in the Taleghan watershed of Tehran province, Iran, from September to October, 2007. The infiltration models of Kostiakov-Lewis, Philip two-term and Horton were fitted to observed infiltration data. Some parameters of the models and the goodness-of-fit coefficient of determination were estimated using MATLAB software. The results showed that, based on comparing measured and model-estimated infiltration rates using root mean squared error (RMSE), Horton's model gave the best prediction of final infiltration rate in the experimental area. Laboratory-measured Ks values were significantly higher than the final infiltration rates estimated from the selected models. The estimated final infiltration rate was not equal to laboratory-measured Ks values in the study area. Moreover, the sorptivity factor estimated by Philip's model was significantly different from those estimated by selected PTFs. It is suggested that the applicability of PTFs is limited to specific, similar conditions. (Author) 37 refs.
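    As an illustration of the fitting procedure described, Horton's equation f(t) = fc + (f0 - fc)·exp(-k·t) can be fitted to infiltration data and scored by RMSE. The "observed" data here are synthetic, generated from assumed parameter values, so this is a sketch of the method rather than the study's MATLAB analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Horton's infiltration equation: f0 = initial rate, fc = final steady
# rate, k = decay constant. Parameter values below are assumptions.
def horton(t, f0, fc, k):
    return fc + (f0 - fc) * np.exp(-k * t)

t = np.linspace(0.1, 3.0, 15)                       # time (h)
rng = np.random.default_rng(0)
f_obs = horton(t, 60.0, 12.0, 2.0) + rng.normal(0, 0.5, t.size)  # synthetic data

popt, _ = curve_fit(horton, t, f_obs, p0=(50.0, 10.0, 1.0))
rmse = np.sqrt(np.mean((f_obs - horton(t, *popt)) ** 2))
f_final = popt[1]    # fitted final infiltration rate fc, to compare with Ks
print(popt, rmse)
```

    Fitting each candidate model this way and comparing RMSE values is the model-selection step the study performs; the fitted `fc` is the quantity compared against laboratory-measured Ks.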

  3. Examination of constitutive model for evaluating long-term mechanical behavior of buffer. 3

    International Nuclear Information System (INIS)

    Takaji, Kazuhiko; Shigeno, Yoshimasa; Shimogouchi, Takafumi; Shiratake, Toshikazu; Tamura, Hirokuni

    2004-02-01

    In the R and D of the high-level radioactive waste repository, it is essential that the Engineered Barrier System (EBS) remain mechanically stable over a long period of time in order to maintain each of the abilities required of the EBS. After closure of the repository, various external forces will act on the buffer in complex ways over a long period of time. Clarifying the mechanical deformation behavior of the buffer under these external forces is therefore important for carrying out an accurate safety assessment of the EBS. In this report, several sets of parameters are chosen for the two previously selected constitutive models, the Sekiguchi-Ohta model and the Adachi-Oka model, and element tests and mock-up tests are simulated using these parameters. Through these simulations, the applicability of the constitutive models and parameters is examined. Simulation analyses of the EBS using these parameters were then carried out, and its mechanical behavior over a long period of time was evaluated. The analyses estimated the amount of settlement of the overpack, the stress state of the buffer material, the reaction force on the host rock, etc., and showed that the EBS is mechanically stable over a long period of time. Next, in order to corroborate the analysis results, a literature survey was conducted on the geological age and mechanical history of a smectite layer. An outline plan was drawn up for the natural analogue verification method, and a preliminary examination was performed of the applicability of 'freezing sampling'. (author)

  4. Island-Model Genomic Selection for Long-Term Genetic Improvement of Autogamous Crops.

    Science.gov (United States)

    Yabe, Shiori; Yamasaki, Masanori; Ebana, Kaworu; Hayashi, Takeshi; Iwata, Hiroyoshi

    2016-01-01

    Acceleration of genetic improvement of autogamous crops such as wheat and rice is necessary to increase cereal production in response to the global food crisis. Population and pedigree methods of breeding, which are based on inbred line selection, are used commonly in the genetic improvement of autogamous crops. These methods, however, produce a few novel combinations of genes in a breeding population. Recurrent selection promotes recombination among genes and produces novel combinations of genes in a breeding population, but it requires inaccurate single-plant evaluation for selection. Genomic selection (GS), which can predict genetic potential of individuals based on their marker genotype, might have high reliability of single-plant evaluation and might be effective in recurrent selection. To evaluate the efficiency of recurrent selection with GS, we conducted simulations using real marker genotype data of rice cultivars. Additionally, we introduced the concept of an "island model" inspired by evolutionary algorithms that might be useful to maintain genetic variation through the breeding process. We conducted GS simulations using real marker genotype data of rice cultivars to evaluate the efficiency of recurrent selection and the island model in an autogamous species. Results demonstrated the importance of producing novel combinations of genes through recurrent selection. An initial population derived from admixture of multiple bi-parental crosses showed larger genetic gains than a population derived from a single bi-parental cross in whole cycles, suggesting the importance of genetic variation in an initial population. The island-model GS better maintained genetic improvement in later generations than the other GS methods, suggesting that the island-model GS can utilize genetic variation in breeding and can retain alleles with small effects in the breeding population. The island-model GS will become a new breeding method that enhances the potential of genomic

  5. Island-Model Genomic Selection for Long-Term Genetic Improvement of Autogamous Crops.

    Directory of Open Access Journals (Sweden)

    Shiori Yabe

    Full Text Available Acceleration of genetic improvement of autogamous crops such as wheat and rice is necessary to increase cereal production in response to the global food crisis. Population and pedigree methods of breeding, which are based on inbred line selection, are used commonly in the genetic improvement of autogamous crops. These methods, however, produce a few novel combinations of genes in a breeding population. Recurrent selection promotes recombination among genes and produces novel combinations of genes in a breeding population, but it requires inaccurate single-plant evaluation for selection. Genomic selection (GS), which can predict genetic potential of individuals based on their marker genotype, might have high reliability of single-plant evaluation and might be effective in recurrent selection. To evaluate the efficiency of recurrent selection with GS, we conducted simulations using real marker genotype data of rice cultivars. Additionally, we introduced the concept of an "island model" inspired by evolutionary algorithms that might be useful to maintain genetic variation through the breeding process. We conducted GS simulations using real marker genotype data of rice cultivars to evaluate the efficiency of recurrent selection and the island model in an autogamous species. Results demonstrated the importance of producing novel combinations of genes through recurrent selection. An initial population derived from admixture of multiple bi-parental crosses showed larger genetic gains than a population derived from a single bi-parental cross in whole cycles, suggesting the importance of genetic variation in an initial population. The island-model GS better maintained genetic improvement in later generations than the other GS methods, suggesting that the island-model GS can utilize genetic variation in breeding and can retain alleles with small effects in the breeding population. The island-model GS will become a new breeding method that enhances the

  6. Medium-term evaluation of total knee arthroplasty without patellar replacement

    Directory of Open Access Journals (Sweden)

    José Wanderley Vasconcelos

    2013-06-01

    Full Text Available OBJECTIVE: To evaluate, at mid-term, patients who underwent total knee arthroplasty without patellar resurfacing. METHODS: A retrospective cross-sectional study was carried out of patients who underwent total knee arthroplasty without patellar resurfacing. In all patients a clinical examination was performed, based on the protocol of the Knee Society Scoring System, which assessed pain, range of motion, stability, contraction, knee alignment and function, together with a radiological evaluation. RESULTS: A total of 36 patients were evaluated. Of these, 07 were operated only on the left knee, 12 only on the right knee and 17 were operated bilaterally, totaling 53 knees. Ages ranged from 26 to 84 years. Of the 53 knees evaluated, 33 (62.26%) had no pain. The maximum flexion range of motion averaged 104.7°. No knee had difficulty in active extension. As to alignment on the anatomical axis, twelve knees (22.64%) showed deviation between 0° and 4° varus. Thirty-nine (75.49%) knees showed unrestricted gait, and the femorotibial angle ranged between 3° varus and 13° valgus with an average of 5° valgus. The patellar index ranged from 0.2 to 1.1. CONCLUSION: Total knee arthroplasty without patellar resurfacing provides good results at mid-term evaluation.

  7. Evaluation of the St. Lucia geothermal resource: macroeconomic models

    Energy Technology Data Exchange (ETDEWEB)

    Burris, A.E.; Trocki, L.K.; Yeamans, M.K.; Kolstad, C.D.

    1984-08-01

    A macroeconometric model describing the St. Lucian economy was developed using 1970 to 1982 economic data. Results of macroeconometric forecasts for the period 1983 through 1985 show an increase in gross domestic product (GDP) for 1983 and 1984 with a decline in 1985. The rate of population growth is expected to exceed GDP growth so that a small decline in per capita GDP will occur. We forecast that garment exports will increase, providing needed employment and foreign exchange. To obtain a longer-term but more general outlook on St. Lucia's economy, and to evaluate the benefit of geothermal energy development, we applied a nonlinear programming model. The model maximizes discounted cumulative consumption.

  8. Large scale Bayesian nuclear data evaluation with consistent model defects

    International Nuclear Information System (INIS)

    Schnabel, G

    2015-01-01

    The aim of nuclear data evaluation is the reliable determination of cross sections and related quantities of the atomic nuclei. To this end, evaluation methods are applied which combine the information of experiments with the results of model calculations. The evaluated observables with their associated uncertainties and correlations are assembled into data sets, which are required for the development of novel nuclear facilities, such as fusion reactors for energy supply, and accelerator driven systems for nuclear waste incineration. The efficiency and safety of such future facilities is dependent on the quality of these data sets and thus also on the reliability of the applied evaluation methods. This work investigated the performance of the majority of available evaluation methods in two scenarios. The study indicated the importance of an essential component in these methods, which is the frequently ignored deficiency of nuclear models. Usually, nuclear models are based on approximations and thus their predictions may deviate from reliable experimental data. As demonstrated in this thesis, the neglect of this possibility in evaluation methods can lead to estimates of observables which are inconsistent with experimental data. Due to this finding, an extension of Bayesian evaluation methods is proposed to take into account the deficiency of the nuclear models. The deficiency is modeled as a random function in terms of a Gaussian process and combined with the model prediction. This novel formulation conserves sum rules and allows to explicitly estimate the magnitude of model deficiency. Both features are missing in available evaluation methods so far. Furthermore, two improvements of existing methods have been developed in the course of this thesis. The first improvement concerns methods relying on Monte Carlo sampling. 
A Metropolis-Hastings scheme with a specific proposal distribution is suggested, which proved to be more efficient in the studied scenarios than the

  9. Mid-term evaluation of ten National Research schools

    DEFF Research Database (Denmark)

    Gustafsson, Göran; Dahl, Hanne Marlene; Gustafsson, Christina

    the scheme was launched, the Research Council has issued three calls for proposals and allocated grants to a total of 22 national research schools. Five were started up in 2009, ten in 2013 and seven in 2015. A Nordic scientific programme committee was appointed in 2013, with responsibility for assessing...... grant applications, monitoring the progress of the FORSKERSKOLER scheme and serving as the evaluation panel for the mid-term evaluation in 2013 and in 2016/2017. The task of the evaluation panel has been to: 1) evaluate the quality of and progress achieved by the ten research schools which were awarded...... funding in 2012 and launched in 2013; and 2) to provide recommendations as to whether funding should be continued to cover the full eight-year period or terminated after five years. Continued funding is recommended for all ten schools to cover the full eight-year period, according to the proposed budget...

  10. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species.

    Science.gov (United States)

    Adams, Matthew P; Collier, Catherine J; Uthicke, Sven; Ow, Yan X; Langlois, Lucas; O'Brien, Katherine R

    2017-01-04

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.
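    A minimal example of such a model, parameterised directly in terms of Pmax and Topt, is a Gaussian thermal response curve; fitting it yields the biologically meaningful parameters without any post-processing. The data points and "true" parameter values below are synthetic assumptions, not the study's seagrass measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Gaussian temperature response: photosynthetic rate peaks at Pmax when
# T = Topt, with thermal breadth w. One simple member of the model family
# the study compares; not one of its twelve published equations per se.
def p_gauss(T, Pmax, Topt, w):
    return Pmax * np.exp(-((T - Topt) / w) ** 2)

T = np.linspace(15.0, 40.0, 12)                       # temperature (deg C)
rng = np.random.default_rng(0)
P = p_gauss(T, 7.5, 30.0, 8.0) + rng.normal(0, 0.2, T.size)  # synthetic rates

popt, _ = curve_fit(p_gauss, T, P, p0=(6.0, 28.0, 6.0))
Pmax, Topt, w = popt
print(Pmax, Topt, w)
```

    Because `Topt` and `Pmax` are fitted parameters rather than quantities derived after fitting, their uncertainties come straight from the covariance matrix `curve_fit` also returns, which is exactly the interpretability property the study argues for.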

  11. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    Science.gov (United States)

    Adams, Matthew P.; Collier, Catherine J.; Uthicke, Sven; Ow, Yan X.; Langlois, Lucas; O'Brien, Katherine R.

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.

  12. Evaluation of long term creep-fatigue life for type 304 stainless steel

    International Nuclear Information System (INIS)

    Kawasaki, Hirotsugu; Ueno, Fumiyoshi; Aoto, Kazumi; Ichimiya, Masakazu; Wada, Yusaku

    1992-01-01

    The long-term creep-fatigue life of type 304 stainless steel was evaluated by a creep-fatigue life prediction method based on a linear damage fraction rule. Displacement-controlled creep-fatigue tests were carried out, and times to failure of longer than 10000 hours were obtained. The creep damage in long-term creep-fatigue was evaluated by taking into account the stress relaxation behavior with elastic follow-up during the hold period. The relationship between creep-fatigue life reduction and fracture mode was explained by creep cavity growth. The results of this study are summarized as follows: (1) The long-term creep-fatigue data can be reasonably evaluated by the present method. The predicted lives were within a factor of 3 of the observed ones. (2) The present method provides the capability to predict long-term creep-fatigue life at lower temperatures as well as at the creep-dominant temperature. (3) The value of creep damage for the long-term creep-fatigue data increased with elastic follow-up. A creep-fatigue damage diagram with intercepts between 0.3 and 1 can represent the observed creep-fatigue damage. (4) Cavity growth depends on the hold time. The fracture in long-term creep-fatigue is caused by intergranular cavity growth. The intergranular fracture in creep-fatigue is initiated by cavity growth and followed by microcrack propagation along grain boundaries starting from creep cavities. (author)
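    The linear damage fraction rule underlying this kind of evaluation can be sketched as follows: fatigue damage n/Nf and creep damage t/tr accumulate linearly, and failure is predicted when the damage point crosses a bilinear envelope in the (Df, Dc) plane. The cycle counts, allowable life Nf, rupture time tr and the (0.3, 0.3) envelope knee below are illustrative assumptions, not the paper's measured values.

```python
# Linear damage fraction rule sketch. Df = fatigue damage, Dc = creep
# damage; failure when (Df, Dc) reaches a bilinear damage envelope that
# runs from (1, 0) through an assumed knee to (0, 1).
def creep_fatigue_damage(n_cycles, Nf, hold_time_h, tr_h):
    Df = n_cycles / Nf                 # sum of cycle fractions n/Nf
    Dc = n_cycles * hold_time_h / tr_h # sum of time fractions t/tr
    return Df, Dc

def fails(Df, Dc, knee=(0.3, 0.3)):
    kx, ky = knee
    if Df >= kx:   # lower-right branch: from knee to (1, 0)
        limit = ky * (1.0 - Df) / (1.0 - kx)
    else:          # upper-left branch: from (0, 1) to knee
        limit = 1.0 - (1.0 - ky) * Df / kx
    return Dc >= limit

Df, Dc = creep_fatigue_damage(n_cycles=500, Nf=2000, hold_time_h=1.0, tr_h=4000.0)
print(Df, Dc, fails(Df, Dc))
```

    The study's finding that observed damage sums fall between 0.3 and 1 corresponds to choosing such a knee for the envelope; elastic follow-up enters through the relaxation curve used to compute the time fractions t/tr.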

  13. REMI and ROUSE: Quantitative Models for Long-Term and Short-Term Priming in Perceptual Identification

    NARCIS (Netherlands)

    E.J. Wagenmakers (Eric-Jan); R. Zeelenberg (René); D.E. Huber (David); J.G.W. Raaijmakers (Jeroen)

    2003-01-01

    The REM model originally developed for recognition memory (Shiffrin & Steyvers, 1997) has recently been extended to implicit memory phenomena observed during threshold identification of words. We discuss two REM models based on Bayesian principles: a model for long-term priming (REMI;

  14. Automated analysis of phantom images for the evaluation of long-term reproducibility in digital mammography

    International Nuclear Information System (INIS)

    Gennaro, G; Ferro, F; Contento, G; Fornasin, F; Di Maggio, C

    2007-01-01

    The performance of an automatic software package was evaluated with phantom images acquired by a full-field digital mammography unit. After the validation, the software was used, together with a Leeds TORMAS test object, to model the image acquisition process. Process modelling results were used to evaluate the sensitivity of the method in detecting changes of exposure parameters from routine image quality measurements in digital mammography, which is the ultimate purpose of long-term reproducibility tests. Image quality indices measured by the software included the mean pixel value and standard deviation of circular details and surrounding background, contrast-to-noise ratio and relative contrast; detail counts were also collected. The validation procedure demonstrated that the software localizes the phantom details correctly and the difference between automatic and manual measurements was within a few grey levels. Quantitative analysis showed sufficient sensitivity to relate fluctuations in exposure parameters (kVp or mAs) to variations in image quality indices. In comparison, detail counts were found less sensitive in detecting image quality changes, even when limitations due to observer subjectivity were overcome by automatic analysis. In conclusion, long-term reproducibility tests provided by the Leeds TORMAS phantom with quantitative analysis of multiple IQ indices have been demonstrated to be effective in predicting causes of deviation from standard operating conditions and can be used to monitor stability in full-field digital mammography

  15. Introducing Program Evaluation Models

    Directory of Open Access Journals (Sweden)

    Raluca GÂRBOAN

    2008-02-01

    Full Text Available Program and project evaluation models can be extremely useful in project planning and management. The aim is to set the right questions as soon as possible in order to see in time and deal with the unwanted program effects, as well as to encourage the positive elements of the project impact. In short, different evaluation models are used in order to minimize losses and maximize the benefits of the interventions upon small or large social groups. This article introduces some of the most recently used evaluation models.

  16. Development of a GoldSim Biosphere Model, Its Evaluation, and Verification

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Hwang, Yong Soo

    2009-12-01

    For the purpose of evaluating the dose rate to an individual due to long-term release of nuclides from an HLW or pyroprocessing repository, a biosphere assessment model and an associated program based on the BIOMASS methodology have been developed utilizing GoldSim, a general model-development tool. To show its practicability and usability, as well as to assess the sensitivity of the annual exposure to parametric and scenario variations, some probabilistic calculations are made and investigated. For cases in which the exposure groups and associated GBIs are changed and selected input values are varied, all of which seem important for the biosphere evaluation, the dose rate per nuclide release rate is probabilistically calculated and analyzed. A series of comparison studies with JAEA, Japan, has also been carried out to verify the model

  17. Evaluation and Improvement of Cloud and Convective Parameterizations from Analyses of ARM Observations and Models

    Energy Technology Data Exchange (ETDEWEB)

    Del Genio, Anthony D. [NASA Goddard Inst. for Space Studies (GISS), New York, NY (United States)

    2016-03-11

    Over this period the PI and his group performed a broad range of data analysis, model evaluation, and model improvement studies using ARM data. These included cloud regimes in the TWP and their evolution over the MJO; M-PACE IOP SCM-CRM intercomparisons; simulations of convective updraft strength and depth during TWP-ICE; evaluation of convective entrainment parameterizations using TWP-ICE simulations; evaluation of GISS GCM cloud behavior vs. long-term SGP cloud statistics; classification of aerosol semi-direct effects on cloud cover; depolarization lidar constraints on cloud phase; preferred states of the winter Arctic atmosphere, surface, and sub-surface; sensitivity of convection to tropospheric humidity; constraints on the parameterization of mesoscale organization from TWP-ICE WRF simulations; updraft and downdraft properties in TWP-ICE simulated convection; insights from long-term ARM records at Manus and Nauru.

  18. Essays on financial econometrics : modeling the term structure of interest rates

    NARCIS (Netherlands)

    Bouwman, Kees Evert

    2008-01-01

    This dissertation bundles five studies in financial econometrics that are related to the theme of modeling the term structure of interest rates. The main contribution of this dissertation is a new arbitrage-free term structure model that is applied in an empirical analysis of the US term structure.

  19. Evaluation of Short Term Memory Span Function In Children

    Directory of Open Access Journals (Sweden)

    Barış ERGÜL

    2016-12-01

    Full Text Available Information is encoded in short-term memory, where it is stored temporarily before being recorded in working memory at the next stage. Repeating information mentally makes it remain in memory for a longer time. Studies investigating the relationship between short-term memory and reading skills examine the link between short-term memory processes and reading comprehension. In this study, the information arriving in short-term memory and the factors affecting the operation of short-term memory are investigated with a regression model. The aim of the research is to examine, through regression analysis, the factors (age, IQ and reading skills) that are expected to have an effect on short-term memory in children. One of the assumptions of regression analysis is that the error term has constant variance and a normal distribution. In this study, because the error term was not normally distributed, robust regression techniques were applied, and the coefficient of determination was computed for each technique. According to the findings, increases in age, IQ and reading skills were associated with increases in short-term memory span in children. Among the robust regression techniques applied, the Winsorized Least Squares (WLS) technique gave the highest coefficient of determination.
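The Winsorized Least Squares idea named in the abstract can be sketched generically: clip the response at chosen quantiles to blunt outliers, then fit ordinary least squares. This is an illustration of the technique, not the authors' exact procedure or data:

```python
import numpy as np

def winsorize(x, p=0.05):
    """Clip values below the p-th and above the (1-p)-th quantile."""
    lo, hi = np.quantile(x, [p, 1 - p])
    return np.clip(x, lo, hi)

def winsorized_least_squares(X, y, p=0.05):
    """OLS on a winsorized response: a simple robust-regression sketch."""
    yw = winsorize(y, p)
    Xd = np.column_stack([np.ones(len(X)), X])   # add intercept column
    beta, *_ = np.linalg.lstsq(Xd, yw, rcond=None)
    yhat = Xd @ beta
    ss_res = np.sum((yw - yhat) ** 2)
    ss_tot = np.sum((yw - yw.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot                     # coefficient of determination
    return beta, r2
```

With a few gross outliers in the response, the winsorized fit recovers the underlying slope far better than a plain OLS fit would.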

  20. Short-Term Solar Irradiance Forecasts Using Sky Images and Radiative Transfer Model

    Directory of Open Access Journals (Sweden)

    Juan Du

    2018-05-01

    Full Text Available In this paper, we propose a novel forecast method which addresses the difficulty in short-term solar irradiance forecasting that arises due to rapidly evolving environmental factors over short time periods. This involves forecasting Global Horizontal Irradiance (GHI) by combining predicted sky images with a Radiative Transfer Model (RTM). The predicted images (up to 10 min ahead) are produced by a non-local optical flow method, which is used to calculate the cloud motion for each pixel, from consecutive sky images at 1 min intervals. The Direct Normal Irradiance (DNI) and the diffuse radiation intensity field under clear-sky and overcast conditions obtained from the RTM are then mapped to the sky images. By combining the cloud locations on the predicted image with the corresponding image-based DNI and diffuse radiation intensity fields, the GHI can be quantitatively forecasted for time horizons of 1–10 min ahead. The solar forecasts are evaluated in terms of root mean square error (RMSE) and mean absolute error (MAE) against in-situ measurements and compared to the performance of the persistence model. The results of our experiment show that GHI forecasts using the proposed method perform better than the persistence model.
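The evaluation protocol described here, RMSE and MAE measured against a persistence baseline, is standard and can be sketched generically:

```python
import numpy as np

def rmse(pred, obs):
    """Root mean square error between forecast and observation."""
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

def mae(pred, obs):
    """Mean absolute error between forecast and observation."""
    return float(np.mean(np.abs(pred - obs)))

def persistence_forecast(obs, horizon):
    """Persistence baseline: the forecast for t+h is the value observed at t."""
    return obs[:-horizon]

def skill_vs_persistence(forecast, obs, horizon):
    """Skill score > 0 means the forecast beats the persistence model."""
    pers = persistence_forecast(obs, horizon)
    target = obs[horizon:]
    return 1.0 - rmse(forecast, target) / rmse(pers, target)
```

A perfect forecast scores 1, the persistence model itself scores 0, and anything worse than persistence goes negative.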

  1. Evaluation and improvement of micro-surfacing mix design method and modelling of asphalt emulsion mastic in terms of filler-emulsion interaction

    Science.gov (United States)

    Robati, Masoud

    This doctoral program focuses on evaluating and improving the rutting resistance of micro-surfacing mixtures. Many research problems related to the rutting resistance of micro-surfacing mixtures still require further work. The main objective of this Ph.D. program is to study and improve the rutting resistance of micro-surfacing mixtures experimentally and analytically. The major aspects investigated are as follows: 1) evaluation and modification of current micro-surfacing mix design procedures: on this basis, a new mix design procedure is proposed for type III micro-surfacing mixtures as rut-fill materials on the road surface; unlike the current mix design guidelines and specifications, the new mix design is capable of selecting the optimum mix proportions for micro-surfacing mixtures; 2) evaluation of test methods and selection of aggregate grading for type III application of micro-surfacing: within this study, a new specification for the selection of aggregate grading for type III application of micro-surfacing is proposed; 3) evaluation of repeatability and reproducibility of micro-surfacing mixture design tests: limits for the repeatability and reproducibility of micro-surfacing mix design tests are presented; 4) a new conceptual model for the filler stiffening effect on the asphalt mastic of micro-surfacing: a new model is proposed which is able to establish limits for minimum and maximum filler concentrations in the micro-surfacing mixture based only on the filler's key physical and chemical properties; 5) incorporation of reclaimed asphalt pavement (RAP) and post-fabrication asphalt shingles (RAS) in micro-surfacing mixtures: the effectiveness of the newly developed mix design procedure is further validated using recycled materials.
    The results present the limits for the use of RAP and RAS.

  2. Global off-line evaluation of the ISBA-TRIP flood model

    Energy Technology Data Exchange (ETDEWEB)

    Decharme, B.; Alkama, R.; Faroux, S.; Douville, H. [GAME-CNRM/CNRS - Meteo-France, Toulouse (France); Papa, F. [NOAA-CREST, City College of New York, New York, NY (United States); Institut de Recherche pour le Developpement IRD-LEGOS, Toulouse (France); Prigent, C. [CNRS/Laboratoire d' Etudes du Rayonnement et de la Matiere en Astrophysique, Observatoire de Paris, Paris (France)

    2012-04-15

    This study presents an off-line global evaluation of the ISBA-TRIP hydrological model including a two-way flood scheme. The flood dynamics are described through the daily coupling between the ISBA land surface model and the TRIP river routing model, which includes a prognostic flood reservoir. This reservoir fills when the river height exceeds the critical bankfull height and drains when it falls below it. The flood interacts with the soil hydrology through infiltration and with the overlying atmosphere through precipitation interception and free-water-surface evaporation. The model is evaluated over a relatively long period (1986-2006) at 1° resolution using the Princeton University 3-hourly atmospheric forcing. Four simulations are performed in order to assess the model sensitivity to the river bankfull height. The evaluation is made against satellite-derived global inundation estimates as well as in situ river discharge observations at 122 gauging stations. First, the results show a reasonable simulation of the global distribution of floodplains when compared to satellite-derived estimates. At basin scale, the comparison reveals some discrepancies, both in terms of climatology and interannual variability, but the results remain acceptable for a simple large-scale model. In addition, the simulated river discharges are improved in terms of efficiency scores for more than 50% of the 122 stations and deteriorated for only 4%. Two mechanisms mainly explain this positive impact: an increase in evapotranspiration that limits the annual discharge overestimation found when flooding is not taken into account, and a smoothed river peak flow when the floodplain storage is significant. Finally, the sensitivity experiments suggest that the river bankfull depth could be tuned against the river discharge scores to control the accuracy of the simulated flooded areas and the related increase in land surface evaporation.
Such a tuning could be relevant at least for climate
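The two-way river–floodplain exchange can be caricatured as a two-bucket update. TRIP's actual scheme works with river height, bankfull depth and flow velocities, so the storage variables and the linear-rate constant `k` here are illustrative assumptions only:

```python
def flood_exchange(river_storage, flood_storage, bankfull, k=0.5):
    """One daily exchange step between river and floodplain reservoirs.

    If river storage exceeds the bankfull capacity, a fraction k of the
    excess spills to the floodplain; otherwise floodplain water returns
    to the river at the same linear rate. Illustrative sketch only."""
    if river_storage > bankfull:
        flux = k * (river_storage - bankfull)                       # river -> floodplain
    else:
        flux = -k * min(flood_storage, bankfull - river_storage)    # return flow
    return river_storage - flux, flood_storage + flux
```

Iterating this update damps river peaks and delays the return flow, which is the mechanism the abstract invokes to explain the smoothed simulated peak discharge.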

  3. Development and Evaluation of Amino Acid Molecular Models

    Directory of Open Access Journals (Sweden)

    Aparecido R. Silva

    2007-05-01

    Full Text Available The comprehension of the structure and function of proteins has a tight relationship with the development of structural biology. However, biochemistry students usually find it difficult to visualize the structures when they use only the schematic drawings of didactic books. The representation of three-dimensional structures of some biomolecules with ludic models, built with representative units, has given students and teachers a successful experience in better visualizing and correlating the structures with the real molecules. The present work shows the developed models and the process to produce the representative units of the main amino acids at industrial scale. The design and applicability of the representative units were discussed with many teachers and some suggestions were implemented in the models. The preliminary evaluation and the perspective of utilization by researchers show that the work is in the right direction. At the current stage, the models are defined, prototypes have been made and will be presented at this meeting. The moulds for the units are at the final stage of construction and trial in specialized tool facilities. The last term will consist of an effective evaluation of the didactic tool for the teaching/learning process in Structural Molecular Biology. The evaluation protocol is being elaborated, containing simple and objective questions similar to those used in research on science teaching.

  4. Creating a Long-Term Diabetic Rabbit Model

    Directory of Open Access Journals (Sweden)

    Jianpu Wang

    2010-01-01

    Full Text Available This study aimed to create a long-term rabbit model of diabetes mellitus for medical studies of one year or longer and to evaluate the effects of chronic hyperglycemia on damage to major organs. A single dose of alloxan monohydrate (100 mg/kg) was given intravenously to 20 young New Zealand White rabbits. Another 12 age-matched normal rabbits were used as controls. Hyperglycemia developed within 48 hours after treatment with alloxan. Insulin was given daily after diabetes developed. All animals gained some body weight, but the gain was much less than in the age-matched nondiabetic rabbits. Hyperlipidemia and higher blood urea nitrogen and creatinine were found in the diabetic animals. Histologically, the pancreas showed marked beta cell damage. The kidneys showed significantly thickened afferent glomerular arterioles with narrowed lumens along with glomerular atrophy. Lipid accumulation in the cytoplasm of hepatocytes appeared as vacuoles. Full-thickness skin wound healing was delayed. In summary, with careful management, alloxan-induced diabetic rabbits can be maintained for one year or longer in reasonably good health for diabetic studies.

  5. Road network safety evaluation using Bayesian hierarchical joint model.

    Science.gov (United States)

    Wang, Jie; Huang, Helai

    2016-05-01

    Safety and efficiency are commonly regarded as two significant performance indicators of transportation systems. In practice, road network planning has focused on road capacity and transport efficiency, whereas the safety level of a road network has received little attention in the planning stage. This study develops a Bayesian hierarchical joint model for road network safety evaluation to help planners take traffic safety into account when planning a road network. The proposed model establishes relationships between road network risk and micro-level variables related to road entities and traffic volume, as well as socioeconomic, trip generation and network density variables at the macro level, which are generally used for long-term transportation plans. In addition, network spatial correlation between intersections and their connected road segments is also considered in the model. A road network is carefully selected in order to compare the proposed hierarchical joint model with a previous joint model and a negative binomial model. According to the results of the model comparison, the hierarchical joint model outperforms the joint model and the negative binomial model in terms of goodness-of-fit and predictive performance, which indicates the reasonableness of considering the hierarchical data structure in crash prediction and analysis. Moreover, both random effects at the TAZ level and the spatial correlation between intersections and their adjacent segments are found to be significant, supporting the employment of the hierarchical joint model as an alternative in road-network-level safety modeling as well. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Protective effects of long-term lithium administration in a slowly progressive SMA mouse model.

    Science.gov (United States)

    Biagioni, Francesca; Ferrucci, Michela; Ryskalin, Larisa; Fulceri, Federica; Lazzeri, Gloria; Calierno, Maria Teresa; Busceti, Carla L; Ruffoli, Riccardo; Fornai, Francesco

    2017-12-01

    In the present study we evaluated the long-term effects of lithium administration in a knock-out double transgenic mouse model (Smn-/-; SMN1A2G+/-; SMN2+/+) of Spinal Muscular Atrophy type III (SMA-III). This model is characterized by very low levels of the survival motor neuron protein, slow disease progression and motor neuron loss, which makes it possible to detect disease-modifying effects at delayed time intervals. Lithium administration attenuates the decrease in motor activity and provides full protection from the motor neuron loss occurring in SMA-III mice throughout the disease course. In addition, lithium prevents motor neuron enlargement and motor neuron heterotopy and suppresses the occurrence of radial-like glial fibrillary acidic protein immunostaining in the ventral white matter of SMA-III mice. In SMA-III mice, long-term lithium administration produces a dramatic increase in survival motor neuron protein levels in the spinal cord. These data demonstrate that long-term lithium administration during a long-lasting motor neuron disorder attenuates behavioural deficits and neuropathology. Since low levels of survival motor neuron protein are linked to disease severity in SMA, the robust increase in protein level produced by lithium provides solid evidence calling for further investigation of lithium in the long-term treatment of spinal muscular atrophy.

  7. D-term Spectroscopy in Realistic Heterotic-String Models

    CERN Document Server

    Dedes, Athanasios

    2000-01-01

    The emergence of free fermionic string models with solely the MSSM charged spectrum below the string scale provides further evidence for the assertion that the true string vacuum is connected to the Z_2 x Z_2 orbifold in the vicinity of the free fermionic point in the Narain moduli space. An important property of the Z_2 x Z_2 orbifold is the cyclic permutation symmetry between the three twisted sectors. If preserved in the three-generation models, the cyclic permutation symmetry results in a family-universal anomalous U(1)_A, which is instrumental in explaining squark degeneracy, provided that the dominant component of supersymmetry breaking arises from the U(1)_A D-term. Interestingly, the contribution of the family-universal D_A-term to the squark masses may be intra-family non-universal, and may differ from the usual (universal) boundary conditions assumed in the MSSM. We contemplate how D_A-term spectroscopy may be instrumental in studying superstring models irrespective of our ignorance of the details ...

  8. Nuclear models relevant to evaluation

    International Nuclear Information System (INIS)

    Arthur, E.D.; Chadwick, M.B.; Hale, G.M.; Young, P.G.

    1991-01-01

    The widespread use of nuclear models continues in the creation of data evaluations. The reasons include the extension of data evaluations to higher energies, the creation of data libraries for isotopic components of natural materials, and the production of evaluations for radioactive target species. In these cases, experimental data are often sparse or nonexistent. As this trend continues, the nuclear models employed in evaluation work move towards more microscopically-based theoretical methods, prompted in part by the availability of increasingly powerful computational resources. Advances in nuclear models applicable to evaluation will be reviewed. These include advances in optical model theory, microscopic and phenomenological state and level density theory, unified models that consistently describe both equilibrium and nonequilibrium reaction mechanisms, and improved methodologies for the calculation of prompt radiation from fission. 84 refs., 8 figs

  9. Study on the system development for evaluating long-term alteration of hydraulic field in Near Field 2

    International Nuclear Information System (INIS)

    Okutu, Kazuo; Morikawa, Seiji; Takamura, Hisashi

    2003-02-01

    For a reliable performance assessment of a TRU waste repository, a system is required for evaluating the long-term alteration of the hydraulic field in the near field, taking into account the changing behavior of the barrier materials. In this research, the development of such a system was examined. Models evaluating each phenomenon and a prototype system for chemical/mechanical analysis were developed, and a method of coupling the chemical and mechanical analyses was examined. To improve the accuracy and validity of this analysis system in the future, the necessary development elements were identified. The results of this year's work are as follows. 1) Knowledge concerning the chemical phenomena in near-field evolution was organized. Experimental approaches and analysis methods were applied to phenomena for which knowledge can be obtained; approaches to focus the model were applied to phenomena for which knowledge is essentially difficult to obtain. The analysis model was improved using knowledge from natural analogues and computational analyses. An analysis system was developed and the validity of the model was demonstrated. 2) A model of the bentonite material was developed focusing on nonlinear swelling behavior, and a model of the cement material was developed focusing on deformation behavior influenced by the leaching of calcium, which reduces rigidity and strength. With regard to the bentonite model, to verify its validity, the trial analysis results were compared with consolidation properties test data. Furthermore, a mechanical alteration analysis system comprising the bentonite and cement models was developed, and a trial analysis was performed. In this trial analysis, parameters for the cation exchange ratio of Na-bentonite for Ca ions and the leaching ratio of Ca from the cement material were considered. On the one hand, as concerns rock, to include the

  10. visCOS: An R-package to evaluate model performance of hydrological models

    Science.gov (United States)

    Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten

    2016-04-01

    The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks. (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be it strengths or deficiencies. Visualizations are usually devised for specific projects and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R-package for evaluating model performance by visualizing and exploring different aspects of hydrological time-series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages, such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be implemented in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean the data, process, visualize, explore and finally save the results in a consistent way. Together with an interactive zoom function of the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods like hydrological years or seasonal sections. Many hydrologists use long-term water-balances as a
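The two efficiency scores the package visualizes, Nash–Sutcliffe and Kling–Gupta, are standard and easy to state. A generic sketch follows (in Python rather than R, and independent of visCOS itself):

```python
import numpy as np

def nse(sim, obs):
    """Nash–Sutcliffe Efficiency: 1 is perfect, 0 is no better than mean(obs)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(sim, obs):
    """Kling–Gupta Efficiency: combines correlation, bias and variability terms."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]       # linear correlation
    beta = sim.mean() / obs.mean()        # bias ratio
    alpha = sim.std() / obs.std()         # variability ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```

Evaluating these scores over sliding sub-periods (hydrological years, seasons) is exactly the kind of windowed diagnostic the package is described as visualizing.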

  11. Discretization of the Joule heating term for plasma discharge fluid models in unstructured meshes

    International Nuclear Information System (INIS)

    Deconinck, T.; Mahadevan, S.; Raja, L.L.

    2009-01-01

    The fluid (continuum) approach is commonly used for simulation of plasma phenomena in electrical discharges at moderate to high pressures (>10's mTorr). The description comprises governing equations for charged and neutral species transport and energy equations for electrons and the heavy species, coupled to equations for the electromagnetic fields. The coupling of energy from the electrostatic field to the plasma species is modeled by the Joule heating term which appears in the electron and heavy species (ion) energy equations. Proper numerical discretization of this term is necessary for accurate description of discharge energetics; however, discretization of this term poses a special problem in the case of unstructured meshes owing to the arbitrary orientation of the faces enclosing each cell. We propose a method for the numerical discretization of the Joule heating term using a cell-centered finite volume approach on unstructured meshes with closed convex cells. The Joule heating term is computed by evaluating both the electric field and the species flux at the cell center. The dot product of these two vector quantities is computed to obtain the Joule heating source term. We compare two methods to evaluate the species flux at the cell center. One is based on reconstructing the fluxes at the cell centers from the fluxes at the face centers. The other recomputes the flux at the cell center using the common drift-diffusion approximation. The reconstructed flux scheme is the most stable method and yields reasonably accurate results on coarse meshes.
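The second, drift-diffusion variant of the cell-centered evaluation can be sketched as follows. The flux expression and sign conventions here are generic textbook forms, not the paper's exact discretization:

```python
import numpy as np

def joule_heating(E_cell, mu, n, grad_n, D, charge=1.602e-19, z=1):
    """Joule heating source at a cell center, Q = q * Z * (Gamma . E),
    with the species flux from the drift-diffusion approximation:
        Gamma = sign(Z) * mu * n * E - D * grad(n)
    All vectors are evaluated at the cell center. Illustrative sketch only."""
    sign = 1.0 if z > 0 else -1.0
    gamma = sign * mu * n * np.asarray(E_cell) - D * np.asarray(grad_n)
    # Dot product of the two cell-centered vectors gives the source term
    return z * charge * float(np.dot(gamma, E_cell))
```

Note that for electrons (z = -1) the drift term still yields a positive heating contribution, since both the flux and the charge flip sign.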

  12. Quadratic Term Structure Models in Discrete Time

    OpenAIRE

    Marco Realdon

    2006-01-01

    This paper extends the results on quadratic term structure models in continuous time to the discrete time setting. The continuous time setting can be seen as a special case of the discrete time one. Recursive closed form solutions for zero coupon bonds are provided even in the presence of multiple correlated underlying factors. Pricing bond options requires simple integration. Model parameters may well be time dependent without scuppering such tractability. Model estimation does not require a r...

  13. Development, description and validation of a Tritium Environmental Release Model (TERM).

    Science.gov (United States)

    Jeffers, Rebecca S; Parker, Geoffrey T

    2014-01-01

    Tritium is a radioisotope of hydrogen that exists naturally in the environment and may also be released through anthropogenic activities. It bonds readily with hydrogen and oxygen atoms to form tritiated water, which then cycles through the hydrosphere. This paper seeks to model the migration of tritiated species throughout the environment - including atmospheric, river and coastal systems - more comprehensively and more consistently across release scenarios than is currently in the literature. A review of the features and underlying conceptual models of some existing tritium release models was conducted, and an underlying aggregated conceptual process model defined, which is presented. The new model, dubbed 'Tritium Environmental Release Model' (TERM), was then tested against multiple validation sets from literature, including experimental data and reference tests for tritium models. TERM has been shown to be capable of providing reasonable results which are broadly comparable with atmospheric HTO release models from the literature, spanning both continuous and discrete release conditions. TERM also performed well when compared with atmospheric data. TERM is believed to be a useful tool for examining discrete and continuous atmospheric releases or combinations thereof. TERM also includes further capabilities (e.g. river and coastal release scenarios) that may be applicable to certain scenarios that atmospheric models alone may not handle well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Evaluated experimental database on critical heat flux in WWER FA models

    International Nuclear Information System (INIS)

    Artamonov, S.; Sergeev, V.; Volkov, S.

    2015-01-01

    The paper presents the description of the evaluated experimental database on critical heat flux in WWER FA models of new designs. This database was developed on the basis of the experimental data obtained in the years 2009-2012. In the course of its development, the database was reviewed in terms of completeness of the information about the experiments and its compliance with the requirements of Rostekhnadzor regulatory documents. The description of the experimental FA model characteristics and experimental conditions was specified. Besides, the experimental data were statistically processed with the aim of rejecting incorrect ones, and the sets of experimental data on critical heat fluxes (CHF) were compared for different FA models. As a result, for the first time, an evaluated database on CHF in FA models of new designs was developed, complemented with analysis functions; its main purpose is to be used in the development, verification and upgrading of calculation techniques. The developed database incorporates the data of 4183 experimental conditions obtained in 53 WWER FA models of various designs. Keywords: WWER reactor, fuel assembly, CHF, evaluated experimental data, database, statistical analysis. (author)

  15. Modeling Long-Term Fluvial Incision: Shall We Care for the Details of Short-Term Fluvial Dynamics?

    Science.gov (United States)

    Lague, D.; Davy, P.

    2008-12-01

    Fluvial incision laws used in numerical models of coupled climate, erosion and tectonics systems are mainly based on the family of stream power laws, for which the rate of local erosion E is a power function of the topographic slope S and the local mean discharge Q: E = K Q^m S^n. The exponents m and n are generally taken as (0.35, 0.7) or (0.5, 1), and K is chosen such that the predicted topographic elevation given the prevailing rates of precipitation and tectonics stays within realistic values. The resulting topographies are reasonably realistic, and the coupled system behaves somewhat as expected: more precipitation induces increased erosion and localization of the deformation. Yet, if we now focus on smaller-scale fluvial dynamics (the reach scale), recent advances have suggested that discharge variability, channel width dynamics and sediment flux effects may play a significant role in controlling incision rates. These are not factored into the simple stream power law model. In this work, we study how these short-term details propagate into long-term incision dynamics within the framework of coupled surface/tectonics numerical models. To upscale the short-term dynamics to geological timescales, we use a numerical model of a trapezoidal river in which vertical and lateral incision processes are computed from fluid shear stress at a daily timescale; sediment transport and protection effects are factored in, as well as a variable discharge. We show that the stream power law model might still be a valid model, but that as soon as realistic effects are included, such as a threshold for sediment transport, variable discharge and dynamic width, the resulting exponents m and n can be as high as 2 and 4. This high non-linearity has a profound consequence for the sensitivity of fluvial relief to incision rate. We also show that additional complexity does not systematically translate into more non-linear behaviour. For instance, considering only a dynamical width
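The stream power law quoted above is compact enough to state directly; the exponent comparison below is a generic illustration of the nonlinearity the authors discuss, not their calibrated model:

```python
def stream_power_erosion(K, Q, S, m=0.5, n=1.0):
    """Stream-power incision rate E = K * Q**m * S**n.

    K: erodibility constant; Q: mean discharge; S: topographic slope;
    m, n: discharge and slope exponents."""
    return K * Q**m * S**n
```

With the conventional exponents (m, n) = (0.5, 1), doubling the discharge raises the incision rate by a factor of sqrt(2); with the upscaled exponents (2, 4) reported here, the same doubling raises it fourfold, which is why the long-term relief becomes so much more sensitive to incision rate.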

  16. Statistical Models for Tornado Climatology: Long and Short-Term Views.

    Science.gov (United States)

    Elsner, James B; Jagger, Thomas H; Fricker, Tyler

    2016-01-01

    This paper estimates regional tornado risk from records of past events using statistical models. First, a spatial model is fit to the tornado counts aggregated in counties, with terms that control for changes in observational practices over time. Results provide a long-term view of risk that delineates the main tornado corridors in the United States, where the expected annual rate exceeds two tornadoes per 10,000 square km. A few counties in the Texas Panhandle and central Kansas have annual rates that exceed four tornadoes per 10,000 square km. Refitting the model after removing the least damaging tornadoes (EF0) from the data produces a similar map but with the greatest tornado risk shifted south and eastward. Second, a space-time model is fit to the counts aggregated in raster cells, with terms that control for changes in climate factors. Results provide a short-term view of risk, identifying a shift of tornado activity away from the Ohio Valley under El Niño conditions and away from the Southeast under positive North Atlantic oscillation conditions. The combined predictor effects on the local rates are quantified by fitting the model after leaving out the year to be predicted from the data. The models provide state-of-the-art views of tornado risk that can be used by government agencies, the insurance industry, and the general public.
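The long-term rates quoted above (tornadoes per year per 10,000 square km) amount to a normalization of counts by record length and area. A minimal sketch of that unit, not the authors' spatial model, which additionally smooths counts and controls for reporting practices:

```python
def annual_rate_per_10k_km2(counts, years, area_km2):
    """Mean annual tornado rate scaled to 10,000 square km,
    the unit used for the long-term risk maps described above.

    counts: per-year tornado counts for a region; years: record length;
    area_km2: region area in square km."""
    return sum(counts) / years / area_km2 * 1e4
```

For example, 30 tornadoes over 3 years in a 50,000 km² region corresponds to a rate of 2 per 10,000 km² per year.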

  17. Evaluating Flight Crew Performance by a Bayesian Network Model

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2018-03-01

    Full Text Available Flight crew performance is of great significance in keeping flights safe and sound. When evaluating crew performance, quantitative detailed behavior information may not be available. The present paper introduces a Bayesian Network for flight crew performance evaluation, which permits the utilization of multidisciplinary sources of objective and subjective information despite sparse behavioral data. In this paper, the causal factors are selected based on the analysis of 484 aviation accidents caused by human factors. Then, a network termed the Flight Crew Performance Model is constructed. The Delphi technique helps to gather subjective data as a supplement to objective data from accident reports. The conditional probabilities are elicited by the leaky noisy MAX model. Two ways of BN inference, probability prediction and probabilistic diagnosis, are used, and some interesting conclusions are drawn, which could provide data support for interventions in human error management in aviation safety.
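The leaky noisy MAX model used here for CPT elicitation reduces, for binary variables, to the leaky noisy-OR. A minimal sketch with hypothetical link probabilities (the actual model's variables and parameters are not given in the abstract):

```python
def leaky_noisy_or(link_probs, active, leak=0.01):
    """P(effect | active causes) under the leaky noisy-OR model,
    the binary special case of the leaky noisy-MAX.

    link_probs[i]: probability that cause i alone produces the effect;
    active[i]: whether cause i is present; leak: background probability."""
    q = 1.0 - leak                  # probability the effect is absent so far
    for p, a in zip(link_probs, active):
        if a:
            q *= (1.0 - p)          # each active cause independently fails
    return 1.0 - q
```

The appeal for elicitation is that experts supply one probability per parent link (plus a leak) instead of a full conditional probability table that grows exponentially with the number of parents.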

  18. Selection of models to calculate the LLW source term

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1991-10-01

    Performance assessment of a LLW disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). In turn, many of these physical processes are influenced by the design of the disposal facility (e.g., infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This document provides a brief overview of disposal practices and reviews existing source term models as background for selecting appropriate models for estimating the source term. The selection rationale and the mathematical details of the models are presented. Finally, guidance is presented for combining the inventory data with appropriate mechanisms describing release from the disposal facility. 44 refs., 6 figs., 1 tab
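
    As a hedged illustration of the kind of simplified release mechanism such source-term models combine (not one of the specific models selected in the report), a first-order leach model with radioactive decay has a closed-form release rate:

```python
import math

# Hedged sketch, not one of the report's specific models: the wasteform
# inventory decays radioactively (lam_decay) and leaches at fractional rate
# k_leach, so the release rate is R(t) = k * A0 * exp(-(lam_decay + k) * t).

def release_rate(a0, lam_decay, k_leach, t):
    return k_leach * a0 * math.exp(-(lam_decay + k_leach) * t)

def cumulative_release(a0, lam_decay, k_leach, t):
    """Closed-form integral of the release rate from 0 to t."""
    lam = lam_decay + k_leach
    return k_leach * a0 / lam * (1.0 - math.exp(-lam * t))

r0 = release_rate(1.0, 0.01, 0.05, 0.0)         # initial rate = k * A0
frac = cumulative_release(1.0, 0.0, 0.05, 1e9)  # no decay: everything leaches eventually
```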

  19. Short-term wind power prediction based on LSSVM–GSA model

    International Nuclear Information System (INIS)

    Yuan, Xiaohui; Chen, Chen; Yuan, Yanbin; Huang, Yuehua; Tan, Qingxiong

    2015-01-01

    Highlights: • A hybrid model is developed for short-term wind power prediction. • The model is based on LSSVM and gravitational search algorithm. • Gravitational search algorithm is used to optimize parameters of LSSVM. • Effect of different kernel function of LSSVM on wind power prediction is discussed. • Comparative studies show that prediction accuracy of wind power is improved. - Abstract: Wind power forecasting can improve the economic and technical integration of wind energy into the existing electricity grid. Due to its intermittency and randomness, wind power is hard to forecast accurately. For the purpose of utilizing wind power to the utmost extent, it is very important to make an accurate prediction of the output power of a wind farm under the premise of guaranteeing the security and the stability of the operation of the power system. In this paper, a hybrid model (LSSVM–GSA) based on the least squares support vector machine (LSSVM) and gravitational search algorithm (GSA) is proposed to forecast short-term wind power. As the kernel function and the related parameters of the LSSVM have a great influence on the performance of the prediction model, the paper establishes LSSVM models based on different kernel functions for short-term wind power prediction. An optimal kernel function is then determined and the parameters of the LSSVM model are optimized by using GSA. Compared with the Back Propagation (BP) neural network and support vector machine (SVM) model, the simulation results show that the hybrid LSSVM–GSA model based on the exponential radial basis kernel function and GSA has higher accuracy for short-term wind power prediction. Therefore, the proposed LSSVM–GSA is a better model for short-term wind power prediction.
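
    The core of LSSVM regression is a single linear system in the dual variables; the GSA search over kernel parameters is omitted here and replaced by hand-picked values. A minimal pure-Python sketch on toy data, with a Gaussian RBF kernel standing in for the kernels compared in the paper:

```python
import math

# Minimal LSSVM regression sketch: solve the dual linear system
# [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
# gamma and sigma are hand-picked here; the paper tunes them with GSA.

def rbf(x, z, sigma):
    return math.exp(-(x - z) ** 2 / (2.0 * sigma ** 2))

def solve(A, b):
    """Gaussian elimination with partial pivoting, for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def lssvm_fit(xs, ys, gamma, sigma):
    n = len(xs)
    A = [[0.0] + [1.0] * n]  # first row encodes the constraint sum(alpha) = 0
    for i in range(n):
        A.append([1.0] + [rbf(xs[i], xs[j], sigma) + (1.0 / gamma if i == j else 0.0)
                          for j in range(n)])
    sol = solve(A, [0.0] + list(ys))
    return sol[0], sol[1:]  # bias b, coefficients alpha

def lssvm_predict(x, xs, b, alpha, sigma):
    return b + sum(a * rbf(x, xi, sigma) for a, xi in zip(alpha, xs))

xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 4.0]  # toy data, y = x^2
b, alpha = lssvm_fit(xs, ys, gamma=100.0, sigma=1.0)
pred = lssvm_predict(1.0, xs, b, alpha, sigma=1.0)  # close to the training target 1.0
```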

  20. PSA modeling of long-term accident sequences

    International Nuclear Information System (INIS)

    Georgescu, Gabriel; Corenwinder, Francois; Lanore, Jeanne-Marie

    2014-01-01

    In the context of the extension of PSA scope to include external hazards, in France both the operator (EDF) and IRSN work on improving methods to better take into account in the PSA the accident sequences induced by initiators which affect a whole site containing several nuclear units (reactors, fuel pools, ...). These methodological improvements represent an essential prerequisite for the development of external hazards PSA. However, it has to be noted that in French PSA, even before Fukushima, long-term accident sequences were taken into account: many insights were therefore used, as complementary information, to enhance the safety level of the plants. IRSN proposed an external events PSA development program. One of the first steps of the program is the development of methods to model in the PSA the long-term accident sequences, based on the experience gained. In the short term, IRSN intends to enhance the modeling of the 'long term' accident sequences induced by the loss of the heat sink and/or the loss of external power supply. The experience gained by IRSN and EDF from the development of several probabilistic studies treating long-term accident sequences shows that simply extending the mission time of the mitigation systems from 24 hours to longer times is not sufficient to realistically quantify the risk and to obtain a correct ranking of the risk contributions, and that treatment of recoveries is also necessary. IRSN intends to develop a generic study which can be used as a general methodology for the assessment of long-term accident sequences, mainly generated by external hazards and their combinations. A first attempt to develop this generic study allowed the identification of some aspects, whether hazards (or combinations of hazards) or initial and boundary conditions, that should be taken into account in further developments. (authors)

  1. A website evaluation model by integration of previous evaluation models using a quantitative approach

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-01-01

    Full Text Available Given the growth of e-commerce, websites play an essential role in business success. Many authors have therefore offered website evaluation models since 1995. However, the multiplicity and diversity of evaluation models make it difficult to integrate them into a single comprehensive model. In this paper a quantitative method has been used to integrate previous models into a comprehensive model that is compatible with them. In this approach the researcher's judgment has no role in the integration of models, and the new model takes its validity from 93 previous models and the systematic quantitative approach.

  2. Evaluation of different nitrous oxide production models with four continuous long-term wastewater treatment process data series.

    Science.gov (United States)

    Spérandio, Mathieu; Pocquet, Mathieu; Guo, Lisha; Ni, Bing-Jie; Vanrolleghem, Peter A; Yuan, Zhiguo

    2016-03-01

    Five activated sludge models describing N2O production by ammonium oxidising bacteria (AOB) were compared to four different long-term process data sets. Each model considers one of the two known N2O production pathways by AOB, namely the AOB denitrification pathway and the hydroxylamine oxidation pathway, with specific kinetic expressions. Satisfactory calibration could be obtained in most cases, but none of the models was able to describe all the N2O data obtained in the different systems with a similar parameter set. Variability of the parameters can be related to difficulties related to undescribed local concentration heterogeneities, physiological adaptation of micro-organisms, a microbial population switch, or regulation between multiple AOB pathways. This variability could be due to a dependence of the N2O production pathways on the nitrite (or free nitrous acid-FNA) concentrations and other operational conditions in different systems. This work gives an overview of the potentialities and limits of single AOB pathway models. Indicating in which condition each single pathway model is likely to explain the experimental observations, this work will also facilitate future work on models in which the two main N2O pathways active in AOB are represented together.

  3. Modeling and Evaluating Pilot Performance in NextGen: Review of and Recommendations Regarding Pilot Modeling Efforts, Architectures, and Validation Studies

    Science.gov (United States)

    Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.

    2013-01-01

    NextGen operations are associated with a variety of changes to the national airspace system (NAS) including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, (3) they do not require experimental participants, and thus can offer cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes), and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research, that could guide future research opportunities. This research effort is intended to help the FAA

  4. PROCEDURE FOR THE EVALUATION OF MEASURED DATA IN TERMS OF VIBRATION DIAGNOSTICS BY APPLICATION OF A MULTIDIMENSIONAL STATISTICAL MODEL

    Directory of Open Access Journals (Sweden)

    Tomas TOMKO

    2016-06-01

    Full Text Available The evaluation process of measured data in terms of vibration diagnosis is problematic for timeline constructors. The complexity of such an evaluation is compounded by the fact that it is a process involving a large amount of disparate measurement data. One of the most effective analytical approaches when dealing with large amounts of data is to engage in a process using multidimensional statistical methods, which can provide a picture of the current status of the flexibility of the machinery. The more methods that are used, the more precise the statistical analysis of measurement data, making it possible to obtain a better picture of the current condition of the machinery.

  5. Evaluation of integrated assessment model hindcast experiments: a case study of the GCAM 3.0 land use module

    Science.gov (United States)

    Snyder, Abigail C.; Link, Robert P.; Calvin, Katherine V.

    2017-11-01

    Hindcasting experiments (conducting a model forecast for a time period in which observational data are available) are being undertaken increasingly often by the integrated assessment model (IAM) community, across many scales of models. When they are undertaken, the results are often evaluated using global aggregates or otherwise highly aggregated skill scores that mask deficiencies. We select a set of deviation-based measures that can be applied on different spatial scales (regional versus global) to make evaluating the large number of variable-region combinations in IAMs more tractable. We also identify performance benchmarks for these measures, based on the statistics of the observational dataset, that allow a model to be evaluated in absolute terms rather than relative to the performance of other models at similar tasks. An ideal evaluation method for hindcast experiments in IAMs would feature both absolute measures for evaluation of a single experiment for a single model and relative measures to compare the results of multiple experiments for a single model or the same experiment repeated across multiple models, such as in community intercomparison studies. The performance benchmarks highlight the use of this scheme for model evaluation in absolute terms, providing information about the reasons a model may perform poorly on a given measure and therefore identifying opportunities for improvement. To demonstrate the use of and types of results possible with the evaluation method, the measures are applied to the results of a past hindcast experiment focusing on land allocation in the Global Change Assessment Model (GCAM) version 3.0. The question of how to more holistically evaluate models as complex as IAMs is an area for future research. We find quantitative evidence that global aggregates alone are not sufficient for evaluating IAMs that require global supply to equal global demand at each time period, such as GCAM. The results of this work indicate it is
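
    One simple absolute benchmark in the spirit described above compares a model's RMSE for a variable-region combination against the standard deviation of the observations, i.e. the error of a trivial "always predict the observed mean" forecaster. This is an illustrative stand-in, not the paper's exact deviation measures:

```python
import math

# Illustrative absolute benchmark derived from the statistics of the
# observational dataset: a model "beats the baseline" if its RMSE is below
# the standard deviation of the observations (the RMSE of a mean-only forecast).

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def std(xs):
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

obs  = [10.0, 12.0, 9.0, 14.0, 11.0]   # hypothetical observed values
pred = [10.5, 11.5, 9.5, 13.0, 11.0]   # hypothetical hindcast values

benchmark = std(obs)                   # error of the mean-only baseline
model_err = rmse(pred, obs)
beats_baseline = model_err < benchmark
```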

  6. Studying Term Structure of SHIBOR with the Two-Factor Vasicek Model

    Directory of Open Access Journals (Sweden)

    Chaoqun Ma

    2014-01-01

    Full Text Available With the development of the Chinese interest rate market, SHIBOR is playing an increasingly important role. Based on a principal component analysis of SHIBOR, a two-factor Vasicek model is established to portray the change in SHIBOR with different terms, and its parameters are estimated using the Kalman filter. The model is also used to fit and forecast SHIBOR with different terms. The results show that the two-factor Vasicek model fits SHIBOR well, especially for terms of three months or more.
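
    A minimal sketch of the model class: the short rate is the sum of two Ornstein-Uhlenbeck factors, r_t = x1_t + x2_t with dx_i = kappa_i (theta_i - x_i) dt + sigma_i dW_i. The Euler simulation below uses illustrative parameters, not the Kalman-filter estimates from the paper:

```python
import math
import random

# Euler simulation of a two-factor Vasicek short rate r = x1 + x2.
# Parameter values are illustrative only.

def simulate_r(kappas, thetas, sigmas, x0, T=5.0, n_steps=500, seed=0):
    rng = random.Random(seed)
    dt = T / n_steps
    x = list(x0)
    for _ in range(n_steps):
        for i in range(2):
            dw = rng.gauss(0.0, math.sqrt(dt))
            x[i] += kappas[i] * (thetas[i] - x[i]) * dt + sigmas[i] * dw
    return x[0] + x[1]

params = dict(kappas=(0.5, 1.5), thetas=(0.02, 0.01), sigmas=(0.01, 0.01),
              x0=(0.02, 0.01))  # start at the long-run means
terminal = [simulate_r(seed=s, **params) for s in range(1000)]
mean_r = sum(terminal) / len(terminal)  # should be near theta1 + theta2 = 0.03
```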

  7. Evaluation of oxidative status in short-term exercises of adolescent athletes

    Directory of Open Access Journals (Sweden)

    K Karacabey

    2010-09-01

    Full Text Available The aim of the study was to evaluate the effects of short-term exercise on total antioxidant status (TAS), lipid hydroperoxides (LOOHs), total oxidative status (TOS) and oxidative stress index (OSI) in adolescent athletes. A total of 62 adolescents participated in the study. Athletes trained regularly 3 days a week for 2 hours. All subjects followed a circuit exercise program. Blood samples were collected just before and immediately after the exercise program. Antioxidant status was evaluated by measuring the TAS level in the plasma. Oxidative status was evaluated by measuring the total peroxide level. The percentage ratio of TAS to total peroxide level was accepted as the OSI. Plasma triglyceride, total cholesterol, LDL, HDL and VLDL were measured by an automated chemical analyzer using commercially available kits. There was a significant increase in TOS (p<0.05) and OSI (p<0.01) levels and a significant decrease in TAS levels (p<0.01) compared to the resting state. There were no significant changes in LOOHs levels before and after the short-term exercise. After short-term exercise, the balance between oxidative stress and antioxidant status moves towards oxidative stress as a result of increasing oxidants and decreasing antioxidants.

  8. Short-term forecasting model for aggregated regional hydropower generation

    International Nuclear Information System (INIS)

    Monteiro, Claudio; Ramirez-Rosado, Ignacio J.; Fernandez-Jimenez, L. Alfredo

    2014-01-01

    Highlights: • Original short-term forecasting model for the hourly hydropower generation. • The use of NWP forecasts allows horizons of several days. • New variable to represent the capacity level for generating hydroelectric energy. • The proposed model significantly outperforms the persistence model. - Abstract: This paper presents an original short-term forecasting model of the hourly electric power production for aggregated regional hydropower generation. The inputs of the model are previously recorded values of the aggregated hourly production of hydropower plants and hourly water precipitation forecasts using Numerical Weather Prediction tools, as well as other hourly data (load demand and wind generation). This model is composed of three modules: the first one gives the prediction of the “monthly” hourly power production of the hydropower plants; the second module gives the prediction of hourly power deviation values, which are added to that obtained by the first module to achieve the final forecast of the hourly hydropower generation; the third module allows a periodic adjustment of the prediction of the first module to improve its BIAS error. The model has been applied successfully to the real-life case study of the short-term forecasting of the aggregated hydropower generation in Spain and Portugal (Iberian Peninsula Power System), achieving satisfactory results for the next-day forecasts. The model can be valuable for agents involved in electricity markets and useful for power system operations

  9. Evaluation of long-term geological and climatic changes in the Spanish programme

    International Nuclear Information System (INIS)

    Torres, T.; Ortiz, J.E.; Cortes, A.; Delgado, A.

    2004-01-01

    The Bio-molecular Stratigraphy Laboratory of the Madrid School of Mines has been largely involved in the analysis of long-term paleo-environmental changes in the Iberian Peninsula during the Quaternary. Some of the research projects were EU-funded: Paleo-climatological Revision of Climate Evolution in Western Mediterranean Region. Evaluation of Altered Scenarios, Evidence from Quaternary Infill Paleo-hydrogeology, Sequential Biosphere modelling function of Climate evolution models; Paleo-hydrogeological Data Analysis and Model Testing. Other projects were funded by the National Company for Radioactive Waste Management (ENRESA) and the Spanish Nuclear Safety Council (CSN): 'Paleo-climate reconstruction from Middle Pleistocene times through dating and isotopic analysis of tufa deposits'; 'Paleo-environmental evolution of the southern part of the Iberian Peninsula'; 'Paleo-climate'. On a minor scale the laboratory was also involved in the study of some argillaceous media: 'Organic Geochemistry of some deep Spanish argillaceous formations' and 'Effects of climatic change on the argillaceous series of the Duero and Ebro basins'. Here we will present some of the results obtained from tufa deposits analysis and paleo-environmental information from the Guadix-Baza Basin composite-stratigraphical-type-section study. (authors)

  10. An Exemplar-Familiarity Model Predicts Short-Term and Long-Term Probe Recognition across Diverse Forms of Memory Search

    Science.gov (United States)

    Nosofsky, Robert M.; Cox, Gregory E.; Cao, Rui; Shiffrin, Richard M.

    2014-01-01

    Experiments were conducted to test a modern exemplar-familiarity model on its ability to account for both short-term and long-term probe recognition within the same memory-search paradigm. Also, making connections to the literature on attention and visual search, the model was used to interpret differences in probe-recognition performance across…
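
    A generic summed-similarity rule in the spirit of exemplar-familiarity models (the paper's model differs in detail): a probe's familiarity is the summed, exponentially decaying similarity to stored exemplars, and an "old" response is given when familiarity exceeds a criterion. All values below are hypothetical:

```python
import math

# Generic exemplar-familiarity sketch: familiarity(probe) is the sum of
# exp(-c * distance) over stored exemplars; recognition is a criterion test.
# Illustration only, not the specific model tested in the paper.

def familiarity(probe, memory, c=1.0):
    return sum(math.exp(-c * abs(probe - x)) for x in memory)

memory = [1.0, 2.0, 3.0, 4.0]   # hypothetical studied items on one dimension
old_probe, new_probe = 2.0, 9.0

f_old = familiarity(old_probe, memory)
f_new = familiarity(new_probe, memory)
criterion = 1.5
respond_old = f_old > criterion  # studied probe recognized, novel probe rejected
```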

  11. An economic model of long-term use of celecoxib in patients with osteoarthritis

    Directory of Open Access Journals (Sweden)

    Rublee Dale

    2007-07-01

    Full Text Available Abstract Background Previous evaluations of the cost-effectiveness of the cyclooxygenase-2 selective inhibitor celecoxib (Celebrex, Pfizer Inc, USA) have produced conflicting results. The recent controversy over the cardiovascular (CV) risks of rofecoxib and other coxibs has renewed interest in the economic profile of celecoxib, the only coxib now available in the United States. The objective of our study was to evaluate the long-term cost-effectiveness of celecoxib compared with nonselective nonsteroidal anti-inflammatory drugs (nsNSAIDs) in a population of 60-year-old osteoarthritis (OA) patients with average risks of upper gastrointestinal (UGI) complications who require chronic daily NSAID therapy. Methods We used decision analysis based on data from the literature to evaluate cost-effectiveness from a modified societal perspective over patients' lifetimes, with outcomes expressed as incremental costs per quality-adjusted life-year (QALY) gained. Sensitivity tests were performed to evaluate the impacts of advancing age, CV thromboembolic event risk, different analytic horizons and alternate treatment strategies after UGI adverse events. Results Our main findings were: (1) the base model incremental cost-effectiveness ratio (ICER) for celecoxib versus nsNSAIDs was $31,097 per QALY; (2) the ICER per QALY was $19,309 for a model in which UGI ulcer and ulcer complication event risks increased with advancing age; (3) the ICER per QALY was $17,120 in sensitivity analyses combining serious CV thromboembolic event (myocardial infarction, stroke, CV death) risks with base model assumptions. Conclusion Our model suggests that chronic celecoxib is cost-effective versus nsNSAIDs in a population of 60-year-old OA patients with average risks of UGI events.
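
    The ICERs quoted in the Results are incremental cost divided by incremental QALYs. A trivial sketch with hypothetical inputs, not figures from the study:

```python
# ICER = (cost_new - cost_old) / (QALY_new - QALY_old); hypothetical numbers.

def icer(cost_new, cost_old, qaly_new, qaly_old):
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical: new therapy costs $3,000 more and yields 0.1 extra QALY.
ratio = icer(13000.0, 10000.0, 10.1, 10.0)  # about $30,000 per QALY gained
```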

  12. Realistic minimum accident source terms - Evaluation, application, and risk acceptance

    International Nuclear Information System (INIS)

    Angelo, P. L.

    2009-01-01

    The evaluation, application, and risk acceptance for realistic minimum accident source terms can represent a complex and arduous undertaking. This effort poses a very high impact to design, construction cost, operations and maintenance, and integrated safety over the expected facility lifetime. At the 2005 Nuclear Criticality Safety Division (NCSD) Meeting in Knoxville, Tenn., two papers were presented that summarized the Y-12 effort that reduced the number of criticality accident alarm system (CAAS) detectors originally designed for the new Highly Enriched Uranium Materials Facility (HEUMF) from 258 to an eventual as-built number of 60. Part of that effort relied on determining a realistic minimum accident source term specific to the facility. Since that time, the rationale for an alternate minimum accident has been strengthened by an evaluation process that incorporates realism. A recent update to the HEUMF CAAS technical basis highlights the concepts presented here. (authors)

  13. A Multiscale Model Evaluates Screening for Neoplasia in Barrett's Esophagus.

    Directory of Open Access Journals (Sweden)

    Kit Curtius

    2015-05-01

    Full Text Available Barrett's esophagus (BE) patients are routinely screened for high grade dysplasia (HGD) and esophageal adenocarcinoma (EAC) through endoscopic screening, during which multiple esophageal tissue samples are removed for histological analysis. We propose a computational method called the multistage clonal expansion for EAC (MSCE-EAC) screening model that is used for screening BE patients in silico to evaluate the effects of biopsy sampling, diagnostic sensitivity, and treatment on disease burden. Our framework seamlessly integrates relevant cell-level processes during EAC development with a spatial screening process to provide a clinically relevant model for detecting dysplastic and malignant clones within the crypt-structured BE tissue. With this computational approach, we retain spatio-temporal information about small, unobserved tissue lesions in BE that may remain undetected during biopsy-based screening but could be detected with high-resolution imaging. This allows evaluation of the efficacy and sensitivity of current screening protocols to detect neoplasia (dysplasia and early preclinical EAC) in the esophageal lining. We demonstrate the clinical utility of this model by predicting three important clinical outcomes: (1) the probability that small cancers are missed during biopsy-based screening, (2) the potential gains in neoplasia detection probabilities if screening occurred via high-resolution tomographic imaging, and (3) the efficacy of ablative treatments that result in the curative depletion of metaplastic and neoplastic cell populations in BE in terms of the long-term impact on reducing EAC incidence.

  14. Evaluating the quality of scenarios of short-term wind power generation

    International Nuclear Information System (INIS)

    Pinson, P.; Girard, R.

    2012-01-01

    Highlights: ► Presentation of the desirable properties of wind power generation scenarios. ► Description of various evaluation frameworks (univariate, multivariate, diagnostic). ► Highlighting of the properties of current approaches to scenario generation. ► Guidelines for future evaluation/benchmark exercises. -- Abstract: Scenarios of short-term wind power generation are becoming increasingly popular as input to multistage decision-making problems e.g. multivariate stochastic optimization and stochastic programming. The quality of these scenarios is intuitively expected to substantially impact the benefits from their use in decision-making. So far however, their verification is almost always focused on their marginal distributions for each individual lead time only, thus overlooking their temporal interdependence structure. The shortcomings of such an approach are discussed. Multivariate verification tools, as well as diagnostic approaches based on event-based verification are then presented. Their application to the evaluation of various sets of scenarios of short-term wind power generation demonstrates them as valuable discrimination tools.
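
    One multivariate verification tool of the kind advocated above is the energy score, which evaluates a scenario set jointly across all lead times rather than marginally. A toy two-lead-time example with made-up numbers, not data from the paper:

```python
import math

# Energy score for a set of multivariate scenarios X_i against observation y:
# ES = mean ||X_i - y|| - 0.5 * mean ||X_i - X_j||  (lower is better).
# Toy example with two lead times; all numbers are hypothetical.

def norm(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def energy_score(scenarios, obs):
    m = len(scenarios)
    term1 = sum(norm(x, obs) for x in scenarios) / m
    term2 = sum(norm(xi, xj) for xi in scenarios for xj in scenarios) / (m * m)
    return term1 - 0.5 * term2

obs = (0.0, 0.0)
centred = [(0.1, 0.0), (-0.1, 0.0), (0.0, 0.1), (0.0, -0.1)]
shifted = [(x + 1.0, y + 1.0) for x, y in centred]

es_good = energy_score(centred, obs)
es_bad = energy_score(shifted, obs)  # biased scenario set scores worse (higher)
```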

  15. Evaluation of articulation simulation system using artificial maxillectomy models.

    Science.gov (United States)

    Elbashti, M E; Hattori, M; Sumita, Y I; Taniguchi, H

    2015-09-01

    Acoustic evaluation is valuable for guiding the treatment of maxillofacial defects and determining the effectiveness of rehabilitation with an obturator prosthesis. Model simulations are important in terms of pre-surgical planning and pre- and post-operative speech function. This study aimed to evaluate the acoustic characteristics of voice generated by an articulation simulation system using a vocal tract model with or without artificial maxillectomy defects. More specifically, we aimed to establish a speech simulation system for maxillectomy defect models that both surgeons and maxillofacial prosthodontists can use in guiding treatment planning. Artificially simulated maxillectomy defects were prepared according to Aramany's classification (Classes I-VI) in a three-dimensional vocal tract plaster model of a subject uttering the vowel /a/. Formant and nasalance acoustic data were analysed using the Computerized Speech Lab and the Nasometer, respectively. Formants and nasalance of simulated /a/ sounds were successfully detected and analysed. Values of Formants 1 and 2 for the non-defect model were 675.43 and 976.64 Hz, respectively. Median values of Formants 1 and 2 for the defect models were 634.36 and 1026.84 Hz, respectively. Nasalance was 11% in the non-defect model, whereas median nasalance was 28% in the defect models. The results suggest that an articulation simulation system can help surgeons and maxillofacial prosthodontists plan post-surgical defects in a way that facilitates maxillofacial rehabilitation. © 2015 John Wiley & Sons Ltd.
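
    For reference, the Nasometer's nasalance score is nasal acoustic energy expressed as a percentage of total (nasal plus oral) energy. A one-line sketch with hypothetical energies:

```python
# Nasalance = nasal energy / (nasal + oral energy) * 100; hypothetical inputs.

def nasalance(nasal_energy, oral_energy):
    return 100.0 * nasal_energy / (nasal_energy + oral_energy)

score = nasalance(28.0, 72.0)  # 28.0, comparable to the defect-model median of 28%
```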

  16. Eliminating cubic terms in the pseudopotential lattice Boltzmann model for multiphase flow

    Science.gov (United States)

    Huang, Rongzong; Wu, Huiying; Adams, Nikolaus A.

    2018-05-01

    It is well recognized that there exist additional cubic terms of velocity in the lattice Boltzmann (LB) model based on the standard lattice. In this work, elimination of these cubic terms in the pseudopotential LB model for multiphase flow is investigated, where the force term and density gradient are considered. By retaining high-order (≥3) Hermite terms in the equilibrium distribution function and the discrete force term, as well as introducing correction terms in the LB equation, the additional cubic terms of velocity are entirely eliminated. With this technique, the computational simplicity of the pseudopotential LB model is well maintained. Numerical tests, including stationary and moving flat and circular interface problems, are carried out to show the effects of such cubic terms on the simulation of multiphase flow. It is found that the elimination of additional cubic terms is beneficial to reduce the numerical error, especially when the velocity is relatively large. Numerical results also suggest that these cubic terms mainly take effect in the interfacial region and that the density-gradient-related cubic terms are more important than the other cubic terms for multiphase flow.

  17. Models of evaluation of public joint-stock property management

    Science.gov (United States)

    Yakupova, N. M.; Levachkova, S.; Absalyamova, S. G.; Kvon, G.

    2017-12-01

    The paper deals with models for evaluating the performance of both the management company and the individual subsidiaries on the basis of a combination of elements of multi-parameter and target approaches. The article shows that, because of the multi-dimensional and multi-directional nature of indicators of financial and economic activity, it is necessary to assess the degree of achievement of objectives with a multivariate ordinal model: a set of indicators ordered by growth such that maintaining this order over a long interval of time ensures the effective functioning of the enterprise in the long term. It is shown that these models can be regarded as monitoring tools for the implementation of strategies and can guide the justification of the effectiveness of management decisions.

  18. Evaluation of nuclear power development scenarios in romania envisaging the long-term national energy sustainability

    International Nuclear Information System (INIS)

    Margeanu, C.; Apostol, M.; Visan, I.; Prodea, I.

    2015-01-01

    The paper summarizes the results of the activities of RATEN ICN Pitesti experts in the IAEA's Collaborative Project INPRO-SYNERGIES. The Romanian study proposes to evaluate and analyze the development of nuclear capacity and the increase of its share in the national energy sector, envisaging long-term national and regional energy sustainability by keeping options open for the future while bringing solutions to short- and medium-term challenges. The study focused on the modelling of national NES (Nuclear Energy System) development in the short and medium term (time horizon 2050), considering the existing NFC (Nuclear Fuel Cycle) infrastructure and legislation, the provisions of strategic documents in force, and also the possibility of regional collaboration regarding U/fresh fuel supply and SF (Spent Fuel) storage, as services provided at international market prices. The energy system modelling was realized using the IAEA's MESSAGE program. The study results offer a clear picture and possible answers to several key questions regarding: the potential of nuclear energy to participate with an important share in the national energy mix, under conditions of cost competitiveness, safety and security of supply; the impact on the national energy mix portfolio of capacities and electricity production; the impact on domestic uranium resources; economic projections/investments needed for new nuclear capacity additions; fresh fuel requirements for nuclear capacities; SF annually discharged and transferred to interim wet storage for cooling; SF volume in interim dry storage, etc. (authors)

  19. A new ensemble model for short term wind power prediction

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albu, Razvan-Daniel; Felea, Ioan

    2012-01-01

    As the objective of this study, a non-linear ensemble system is used to develop a new model for predicting wind speed on a short-term time scale. Short-term wind power prediction has become an extremely important field of research for the energy sector. Regardless of the recent advancements in research on prediction models, it has been observed that different models have different capabilities and also that no single model is suitable under all situations. The idea behind EPS (ensemble prediction systems) is to take advantage of the unique features of each subsystem to capture diverse patterns that exist in the dataset...
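
    As a hedged sketch of the ensemble idea (far simpler than the paper's non-linear combiner), subsystem forecasts can be weighted by the inverse of their recent errors so that better members dominate; all numbers below are hypothetical:

```python
# Inverse-error weighting of ensemble members: a simple linear stand-in for
# the paper's non-linear ensemble combination. Hypothetical error figures.

def inverse_error_weights(errors):
    inv = [1.0 / e for e in errors]
    s = sum(inv)
    return [w / s for w in inv]

def combine(forecasts, weights):
    return sum(f * w for f, w in zip(forecasts, weights))

recent_rmse = [1.0, 2.0, 4.0]          # hypothetical per-member errors (m/s)
weights = inverse_error_weights(recent_rmse)
combined = combine([8.0, 9.0, 12.0], weights)  # weighted toward the best member
```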

  20. Long-term Simulation of Photo-oxidants and Particulate Matter Over Europe With The Eurad Modeling System

    Science.gov (United States)

    Memmesheimer, M.; Friese, E.; Jakobs, H. J.; Feldmann, H.; Ebel, A.; Kerschgens, M. J.

During recent years the interest in long-term applications of air quality modeling systems (AQMS) has strongly increased. Most of these models were developed for application to photo-oxidant episodes during the last decade. In this contribution a long-term application of the EURAD modeling system to the year 1997 is presented. Atmospheric particles are included using the Modal Aerosol Dynamics Model for Europe (MADE). Meteorological fields are simulated by the mesoscale meteorological model MM5, and gas-phase chemistry has been treated with the RACM mechanism. The nesting option is used to zoom in on areas of specific interest. Horizontal grid sizes are 125 km for the regional scale and 5 km for the local scale covering the area of North Rhine-Westphalia (NRW). The results have been compared to observations of the air quality network of the environmental agency of NRW for the year 1997. The model results have been evaluated using the data quality objectives of EU directive 99/30. Further improvement for the application of regional-scale air quality models is needed with respect to emission databases, coupling to global models to improve the boundary values, interaction between aerosols and clouds, and multiphase modeling.

  1. Model and economic uncertainties in balancing short-term and long-term objectives in water-flooding optimization.

    NARCIS (Netherlands)

    Siraj, M.M.; Hof, Van den P.M.J.; Jansen, J.D.

    2015-01-01

    Model-based optimization of oil production has a significant scope to increase ultimate recovery or financial life-cycle performance. The Net Present Value (NPV) objective in such an optimization framework, because of its nature, focuses on the long-term gains while the short-term production is not
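The tension the abstract describes comes from the discounting inside the NPV objective: later cash flows count for less, so the optimizer trades short-term production against life-cycle value. A minimal sketch, with all rates, prices and the discount factor invented for illustration:

```python
# Minimal NPV sketch for a production schedule (all figures hypothetical):
# yearly net cash flow = oil revenue minus water-handling cost, discounted
# back to present value. Discounting is what biases the objective toward
# whichever periods carry the largest cash flows.
oil_rate = [100.0, 80.0, 60.0, 40.0]      # barrels/year of oil produced
water_rate = [10.0, 30.0, 50.0, 70.0]     # barrels/year of co-produced water
oil_price, water_cost, discount = 70.0, 5.0, 0.10

npv = sum(
    (q_o * oil_price - q_w * water_cost) / (1.0 + discount) ** (t + 1)
    for t, (q_o, q_w) in enumerate(zip(oil_rate, water_rate))
)
```

With a 10% discount rate, year 1 contributes about four times as much per dollar of cash flow as year 4, which is exactly why short-term production can be under-weighted.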

Aspects of stochastic models for short-term hydropower scheduling and bidding

    Energy Technology Data Exchange (ETDEWEB)

    Belsnes, Michael Martin [Sintef Energy, Trondheim (Norway); Follestad, Turid [Sintef Energy, Trondheim (Norway); Wolfgang, Ove [Sintef Energy, Trondheim (Norway); Fosso, Olav B. [Dep. of electric power engineering NTNU, Trondheim (Norway)

    2012-07-01

This report discusses challenges met when turning from deterministic to stochastic decision support models for short-term hydropower scheduling and bidding. The report describes characteristics of the short-term scheduling and bidding problem, different market and bidding strategies, and how a stochastic optimization model can be formulated. A review of approaches for stochastic short-term modelling and stochastic modelling for the input variables inflow and market prices is given. The report discusses methods for approximating the predictive distribution of uncertain variables by scenario trees. Benefits of using a stochastic over a deterministic model are illustrated by a case study, where increased profit is obtained to a varying degree depending on the reservoir filling and price structure. Finally, an approach for assessing the effect of using a size-restricted scenario tree to approximate the predictive distribution for stochastic input variables is described. The report is a summary of the findings of Work Package 1 of the research project "Optimal short-term scheduling of wind and hydro resources". The project aims at developing a prototype for an operational stochastic short-term scheduling model. Based on the investigations summarized in the report, it is concluded that using a deterministic equivalent formulation of the stochastic optimization problem is convenient and sufficient for obtaining a working prototype. (author)
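The deterministic-equivalent formulation the report recommends can be illustrated on a toy two-stage problem: a single first-stage commitment is chosen against a small scenario tree by maximizing probability-weighted profit in one shot. All scenario probabilities, prices and penalty figures below are invented:

```python
# Toy deterministic equivalent of a two-stage problem: pick a first-stage
# generation commitment x (MWh, fixed before prices are known); each
# price scenario then contributes its probability-weighted profit.
scenarios = [(0.3, 20.0), (0.5, 35.0), (0.2, 60.0)]  # (probability, price)
reservoir = 100.0          # energy available for delivery (MWh)
spot_penalty = 10.0        # cost per MWh of committed energy not covered

def expected_profit(x):
    profit = 0.0
    for p, price in scenarios:
        delivered = min(x, reservoir)
        profit += p * (price * delivered
                       - spot_penalty * max(0.0, x - reservoir))
    return profit

# The deterministic equivalent simply optimizes the probability-weighted
# sum; a coarse grid search stands in for the LP/MIP solver here.
best_x = max(range(0, 151, 10), key=expected_profit)
```

In a real scheduling model the scenario tree would branch over inflow and price paths and the search would be a mathematical program, but the objective has this same probability-weighted structure.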

  3. The IEA Model of Short-term Energy Security

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Ensuring energy security has been at the centre of the IEA mission since its inception, following the oil crises of the early 1970s. While the security of oil supplies remains important, contemporary energy security policies must address all energy sources and cover a comprehensive range of natural, economic and political risks that affect energy sources, infrastructures and services. In response to this challenge, the IEA is currently developing a Model Of Short-term Energy Security (MOSES) to evaluate the energy security risks and resilience capacities of its member countries. The current version of MOSES covers short-term security of supply for primary energy sources and secondary fuels among IEA countries. It also lays the foundation for analysis of vulnerabilities of electricity and end-use energy sectors. MOSES contains a novel approach to analysing energy security, which can be used to identify energy security priorities, as a starting point for national energy security assessments and to track the evolution of a country's energy security profile. By grouping together countries with similar 'energy security profiles', MOSES depicts the energy security landscape of IEA countries. By extending the MOSES methodology to electricity security and energy services in the future, the IEA aims to develop a comprehensive policy-relevant perspective on global energy security. This Working Paper is intended for readers who wish to explore the MOSES methodology in depth; there is also a brochure which provides an overview of the analysis and results.

  4. Application of Multiple Evaluation Models in Brazil

    Directory of Open Access Journals (Sweden)

    Rafael Victal Saliba

    2008-07-01

Based on two different samples, this article tests the performance of a number of Value Drivers commonly used for evaluating companies by finance practitioners, through simple regression models of cross-section type which estimate the parameters associated with each Value Driver, denominated Market Multiples. We are able to diagnose the behavior of several multiples in the period 1994-2004, with an outlook also on the particularities of the economic activities performed by the sample companies (and their impacts on performance) through a subsequent analysis with segregation of companies in the sample by sectors. Extrapolating simple multiples evaluation standards from analysts of the main financial institutions in Brazil, we find that adjusting the ratio formulation to allow for an intercept does not provide satisfactory results in terms of pricing error reduction. Results found, in spite of evidencing certain relative and absolute superiority among the multiples, may not be generically representative, given sample limitations.

  5. Modelling techniques for predicting the long term consequences of radiation on natural aquatic populations

    International Nuclear Information System (INIS)

    Wallis, I.G.

    1978-01-01

The purpose of this working paper is to describe modelling techniques for predicting the long term consequences of radiation on natural aquatic populations. Ideally, it would be possible to use aquatic population models: (1) to predict changes in the health and well-being of all aquatic populations as a result of changing the composition, amount and location of radionuclide discharges; (2) to compare the effects of steady, fluctuating and accidental releases of radionuclides; and (3) to evaluate the combined impact of the discharge of radionuclides and other wastes, and natural environmental stresses on aquatic populations. At the outset it should be stated that there is no existing model which can achieve this ideal performance. However, modelling skills and techniques are available to develop useful aquatic population models. This paper discusses the considerations involved in developing these models and briefly describes the various types of population models which have been developed to date.

  6. Modelling cadmium contamination in paddy soils under long-term remediation measures: Model development and stochastic simulations.

    Science.gov (United States)

    Peng, Chi; Wang, Meie; Chen, Weiping

    2016-09-01

A pollutant accumulation model (PAM) based on the mass balance theory was developed to simulate long-term changes of heavy metal concentrations in soil. When combined with Monte Carlo simulation, the model can predict the probability distributions of heavy metals in a soil-water-plant system with fluctuating environmental parameters and inputs from multiple pathways. The model was used for evaluating different remediation measures to deal with Cd contamination of paddy soils in Youxian county (Hunan province), China, under five scenarios, namely the default scenario (A), not returning paddy straw to the soil (B), reducing the deposition of Cd (C), liming (D), and integrating several remediation measures (E). The model predicted that the Cd contents of soil can be lowered significantly by (B) and those of the plants by (D). However, in the long run, (D) will increase soil Cd. The concentrations of Cd in both soils and rice grains can be effectively reduced by (E), although it will take decades of effort. The history of Cd pollution and the major causes of Cd accumulation in soil were studied by means of sensitivity analysis and retrospective simulation. Copyright © 2016 Elsevier Ltd. All rights reserved.
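A mass-balance accumulation model of the kind described, wrapped in a Monte Carlo loop, can be sketched as follows. All input and loss rates are invented for illustration and are not the calibrated Youxian values:

```python
import random

random.seed(1)

def simulate_cd(years=50, c0=0.5):
    """Mass-balance sketch: soil Cd concentration (mg/kg) updated yearly.
    Inputs (deposition, irrigation, straw return) and a first-order loss
    term (leaching plus plant offtake) are expressed directly as mg/kg
    per year; all rates are hypothetical."""
    c = c0
    for _ in range(years):
        inputs = random.uniform(0.004, 0.010)  # fluctuating yearly input
        losses = 0.01 * c                       # first-order removal
        c += inputs - losses
    return c

# Monte Carlo: repeat with fresh random inputs to obtain a distribution
# of end-of-horizon concentrations rather than a single trajectory.
finals = [simulate_cd() for _ in range(1000)]
mean_final = sum(finals) / len(finals)
```

A remediation scenario such as (B) or (C) would enter this sketch as a reduction of the yearly `inputs` term, and its long-term effect would show up as a shift of the whole distribution of `finals`.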

  7. A systematic and critical review of model-based economic evaluations of pharmacotherapeutics in patients with bipolar disorder.

    Science.gov (United States)

    Mohiuddin, Syed

    2014-08-01

Bipolar disorder (BD) is a chronic and relapsing mental illness with a considerable health-related and economic burden. The primary goal of pharmacotherapeutics for BD is to improve patients' well-being. The use of decision-analytic models is key in assessing the added value of the pharmacotherapeutics aimed at treating the illness, but concerns have been expressed about the appropriateness of different modelling techniques and about the transparency in the reporting of economic evaluations. This paper aimed to identify and critically appraise published model-based economic evaluations of pharmacotherapeutics in BD patients. A systematic review combining common terms for BD and economic evaluation was conducted in MEDLINE, EMBASE, PSYCINFO and ECONLIT. Studies identified were summarised and critically appraised in terms of the use of modelling technique, model structure and data sources. Considering the prognosis and management of BD, the possible benefits and limitations of each modelling technique are discussed. Fourteen model-based economic evaluations of pharmacotherapeutics in BD patients were identified. Of these 14 studies, nine used Markov, three used discrete-event simulation (DES) and two used decision-tree models. Most of the studies (n = 11) did not include the rationale for the choice of modelling technique undertaken. Half of the studies did not include the risk of mortality. Surprisingly, no study considered the risk of having a mixed bipolar episode. This review identified various modelling issues that could potentially reduce the comparability of one pharmacotherapeutic intervention with another. Better use and reporting of the modelling techniques in future studies are essential. DES modelling appears to be a flexible and comprehensive technique for evaluating the comparability of BD treatment options because of its greater flexibility in depicting disease progression over time. However, depending on the research question
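A Markov cohort model of the kind most of the reviewed studies used can be sketched in a few lines. The states, transition probabilities, costs and utilities below are invented for illustration, not taken from any published BD evaluation:

```python
import numpy as np

# Three-state Markov cohort sketch (states: remission, episode, dead).
# Each row of P gives the transition probabilities out of one state
# per yearly cycle; rows sum to 1.
P = np.array([[0.85, 0.13, 0.02],
              [0.40, 0.55, 0.05],
              [0.00, 0.00, 1.00]])
cost = np.array([100.0, 900.0, 0.0])      # per-cycle cost by state
utility = np.array([0.85, 0.45, 0.0])     # per-cycle QALY weight by state

cohort = np.array([1.0, 0.0, 0.0])        # everyone starts in remission
total_cost = total_qaly = 0.0
for _ in range(20):                        # 20 yearly cycles
    cohort = cohort @ P                    # redistribute the cohort
    total_cost += float(cohort @ cost)     # accumulate expected cost
    total_qaly += float(cohort @ utility)  # accumulate expected QALYs
```

Comparing two treatments amounts to running this with treatment-specific transition matrices and costs; a DES model would instead track individual patients with event times, which is what gives it the flexibility the review highlights.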

  8. Knowledge management: Postgraduate Alternative Evaluation Model (MAPA in Brazil

    Directory of Open Access Journals (Sweden)

    Deisy Cristina Corrêa Igarashi

    2013-07-01

The Brazilian stricto sensu postgraduate programs that include master's and/or doctorate courses are evaluated by the Coordination for the Improvement of Higher Education Personnel (CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior). The evaluation method used by CAPES is recognized in the national and international context. However, several elements of the evaluation method can be improved. For example: to consider programs' diversity, heterogeneity and specificities; to reduce subjectivity; and to explain how indicators are grouped into different dimensions to generate a final result, which is the scoring level reached by a program. This study aims to analyze the evaluation process by CAPES, presenting questions, difficulties and objections raised by researchers. From the analysis, the study proposes an alternative evaluation model for postgraduate programs (MAPA - Modelo de Avaliação para Pós-graduação Alternativo) which incorporates fuzzy logic in result analysis to minimize the limitations identified. The MAPA was applied in three postgraduate programs, allowing: (1) better understanding of the procedures used for the evaluation, (2) identification of elements that need regulation, (3) characterization of indicators that generate local evaluation, and (4) support in medium- and long-term planning.

  9. The Spiral-Interactive Program Evaluation Model.

    Science.gov (United States)

    Khaleel, Ibrahim Adamu

    1988-01-01

    Describes the spiral interactive program evaluation model, which is designed to evaluate vocational-technical education programs in secondary schools in Nigeria. Program evaluation is defined; utility oriented and process oriented models for evaluation are described; and internal and external evaluative factors and variables that define each…

  10. Horizontal Residual Mean Circulation: Evaluation of Spatial Correlations in Coarse Resolution Ocean Models

    Science.gov (United States)

    Li, Y.; McDougall, T. J.

    2016-02-01

Coarse resolution ocean models lack knowledge of spatial correlations between variables on scales smaller than the grid scale. Some researchers have shown that these spatial correlations play a role in the poleward heat flux. In order to evaluate the poleward transport induced by the spatial correlations at a fixed horizontal position, an equation is obtained to calculate the approximate transport from velocity gradients. The equation involves two terms that can be added to the quasi-Stokes streamfunction (based on temporal correlations) to incorporate the contribution of spatial correlations. Moreover, these new terms do not need to be parameterized and are ready to be evaluated by using model data directly. In this study, data from a high resolution ocean model have been used to estimate the accuracy of this HRM approach for improving the horizontal property fluxes in coarse-resolution ocean models. A coarse grid is formed by sub-sampling and box-car averaging the fine grid scale. The transport calculated on the coarse grid is then compared to the transport on the original high resolution grid scale accumulated over a corresponding number of grid boxes. The preliminary results have shown that the estimates on coarse resolution grids roughly match the corresponding transports on high resolution grids.
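The missing sub-grid contribution is, at its core, a covariance term: box-car averaging gives the coarse model the product of means, mean(v)·mean(T), while the true transport is the mean of the product, mean(v·T). A minimal sketch with synthetic fine-grid data (the correlated fluctuations are an invented stand-in for sub-grid eddies):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fine-grid velocity and temperature over one coarse cell (16 points).
# A shared fluctuation makes v and T correlated on sub-grid scales.
eddy = rng.standard_normal(16)
v = 0.10 + 0.05 * eddy                          # m/s
T = 15.0 + 2.0 * eddy + 0.1 * rng.standard_normal(16)

# Transport the coarse model sees: product of box-car means.
coarse_flux = v.mean() * T.mean()
# True transport: mean of the product, which keeps the correlation.
true_flux = (v * T).mean()
# The difference is exactly the sub-grid covariance cov(v, T).
missing = true_flux - coarse_flux
```

In this construction `missing` equals the (population) covariance of v and T over the cell, which is the quantity the paper's extra streamfunction terms aim to supply without parameterization.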

  11. Creep model of unsaturated sliding zone soils and long-term deformation analysis of landslides

    Science.gov (United States)

    Zou, Liangchao; Wang, Shimei; Zhang, Yeming

    2015-04-01

Sliding zone soil is a special soil layer formed in the development of a landslide. Its creep behavior plays a significant role in the long-term deformation of landslides. Due to rainfall infiltration and reservoir water level fluctuation, the soils in the slide zone are often in an unsaturated state. Therefore, the investigation of the creep behavior of unsaturated sliding zone soils is of great importance for understanding the mechanism of the long-term deformation of a landslide in reservoir areas. In this study, the full-process creep curves of the unsaturated soils in the sliding zone under different net confining pressures, matric suctions and stress levels were obtained from a large number of laboratory triaxial creep tests. A nonlinear creep model for unsaturated soils and its three-dimensional form was then deduced based on the component model theory and unsaturated soil mechanics. This creep model was validated with laboratory creep data. The results show that this creep model can effectively and accurately describe the nonlinear creep behaviors of the unsaturated sliding zone soils. In order to apply this creep model to predict the long-term deformation process of landslides, a numerical model for simulating the coupled seepage and creep deformation of unsaturated sliding zone soils was developed based on this creep model through the finite element method (FEM). By using this numerical model, we simulated the deformation process of the Shuping landslide, located in the Three Gorges reservoir area, under the cyclic reservoir water level fluctuation during one year. The simulation results of creep displacement were then compared with the field deformation monitoring data, showing a good agreement in trend. The results show that the creep deformations of landslides have strong connections with the changes of reservoir water level. The creep model of unsaturated sliding zone soils and the findings obtained by numerical simulations in this study are conducive to

  12. Human Thermal Model Evaluation Using the JSC Human Thermal Database

    Science.gov (United States)

    Bue, Grant; Makinen, Janice; Cognata, Thomas

    2012-01-01

Human thermal modeling has considerable long-term utility to human space flight. Such models provide a tool to predict crew survivability in support of vehicle design and to evaluate crew response in untested space environments. It is to the benefit of any such model not only to collect relevant experimental data to correlate it against, but also to maintain an experimental standard or benchmark for future development in a readily and rapidly searchable and software-accessible format. The human thermal database project is intended to do just that: to collect relevant data from literature and experimentation and to store the data in a database structure for immediate and future use as a benchmark to judge human thermal models against, in identifying model strengths and weaknesses, to support model development and improve correlation, and to statistically quantify a model's predictive quality. The human thermal database developed at the Johnson Space Center (JSC) is intended to evaluate a set of widely used human thermal models. This set includes the Wissler human thermal model, a model that has been widely used to predict the human thermoregulatory response to a variety of cold and hot environments. These models are statistically compared to the current database, which contains experiments on human subjects primarily in air from a literature survey ranging between 1953 and 2004 and from a suited experiment recently performed by the authors, for a quantitative study of the relative strength and predictive quality of the models.

  13. Modeling and mining term association for improving biomedical information retrieval performance.

    Science.gov (United States)

    Hu, Qinmin; Huang, Jimmy Xiangji; Hu, Xiaohua

    2012-06-11

The growth of biomedical information requires most information retrieval systems to provide short and specific answers in response to complex user queries. Semantic information in the form of free text is structured in a way that makes it straightforward for humans to read but more difficult for computers to interpret automatically and search efficiently. One of the reasons is that most traditional information retrieval models assume terms are conditionally independent given a document/passage. Therefore, we are motivated to consider term associations within different contexts to help the models understand semantic information and use it for improving biomedical information retrieval performance. We propose a term association approach to discover term associations among the keywords from a query. The experiments are conducted on the TREC 2004-2007 Genomics data sets and the TREC 2004 HARD data set. The proposed approach is promising and achieves superiority over the baselines and the GSP results. Parameter settings and different indices are investigated: the sentence-based index produces the best results in terms of the document level, the word-based index the best results in terms of the passage level, and the paragraph-based index the best results in terms of the passage2 level. Furthermore, the best term association results always come from the best baseline. The tuning number k in the proposed recursive re-ranking algorithm is discussed and locally optimized to be 10. First, modelling term association for improving biomedical information retrieval using factor analysis is one of the major contributions of our work. Second, the experiments confirm that term association considering co-occurrence and dependency among the keywords can produce better results than the baselines treating the keywords independently. Third, the baselines are re-ranked according to the importance and reliance of latent factors behind term associations. These latent
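A co-occurrence-based association score of the general kind described can be sketched on a toy sentence-level index. The corpus and the pointwise co-occurrence ratio below are illustrative stand-ins, not the paper's factor-analysis model:

```python
from itertools import combinations

# Toy sentence-level "index": each sentence is a set of terms. The
# abstract reports sentence-based indexing worked best at the document
# level, so co-occurrence is counted within sentences here.
sentences = [
    {"gene", "expression", "cancer"},
    {"gene", "mutation", "cancer"},
    {"protein", "expression"},
    {"gene", "expression", "regulation"},
]
query = ["gene", "expression", "cancer"]

def association(t1, t2):
    """Pointwise co-occurrence ratio: P(t1, t2) / (P(t1) * P(t2)).
    Values above 1 indicate the terms co-occur more than independence
    would predict; the conditional-independence assumption would pin
    this at exactly 1."""
    n = len(sentences)
    p1 = sum(t1 in s for s in sentences) / n
    p2 = sum(t2 in s for s in sentences) / n
    p12 = sum(t1 in s and t2 in s for s in sentences) / n
    return p12 / (p1 * p2) if p1 and p2 else 0.0

# Score every keyword pair in the query.
scores = {pair: association(*pair) for pair in combinations(query, 2)}
```

In a re-ranking setup like the one described, such pair scores would then adjust the baseline ranking of the top-k (here k = 10) retrieved passages.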

  14. Near-Surface Meteorology During the Arctic Summer Cloud Ocean Study (ASCOS): Evaluation of Reanalyses and Global Climate Models.

    Science.gov (United States)

    De Boer, G.; Shupe, M.D.; Caldwell, P.M.; Bauer, Susanne E.; Persson, O.; Boyle, J.S.; Kelley, M.; Klein, S.A.; Tjernstrom, M.

    2014-01-01

Atmospheric measurements from the Arctic Summer Cloud Ocean Study (ASCOS) are used to evaluate the performance of three atmospheric reanalyses (the European Centre for Medium Range Weather Forecasting (ECMWF) Interim reanalysis, the National Center for Environmental Prediction (NCEP)-National Center for Atmospheric Research (NCAR) reanalysis, and the NCEP-DOE (Department of Energy) reanalysis) and two global climate models (CAM5 (Community Atmosphere Model 5) and NASA GISS (Goddard Institute for Space Studies) ModelE2) in simulation of the high Arctic environment. Quantities analyzed include near-surface meteorological variables such as temperature, pressure, humidity and winds, surface-based estimates of cloud and precipitation properties, the surface energy budget, and lower atmospheric temperature structure. In general, the models perform well in simulating large-scale dynamical quantities such as pressure and winds. Near-surface temperature and lower atmospheric stability, along with surface energy budget terms, are not as well represented, due largely to errors in simulation of cloud occurrence, phase and altitude. Additionally, a development version of CAM5, which features improved handling of cloud macrophysics, has been demonstrated to improve simulation of cloud properties and liquid water amount. The ASCOS period additionally provides an excellent example of the benefits gained by evaluating individual budget terms, rather than simply evaluating the net end product, with large compensating errors between individual surface energy budget terms that can nonetheless result in a seemingly accurate net energy budget.

  15. Judging risk behaviour and risk preference: the role of the evaluative connotation of risk terms.

    NARCIS (Netherlands)

    van Schie, E.C.M.; van der Pligt, J.; van Baaren, K.

    1993-01-01

    Two experiments investigated the impact of the evaluative connotation of risk terms on the judgment of risk behavior and on risk preference. Exp 1 focused on the evaluation congruence of the risk terms with a general risk norm and with Ss' individual risk preference, and its effects on the extremity

  16. World Integrated Nuclear Evaluation System: Model documentation

    International Nuclear Information System (INIS)

    1991-12-01

The World Integrated Nuclear Evaluation System (WINES) is an aggregate demand-based partial equilibrium model used by the Energy Information Administration (EIA) to project long-term domestic and international nuclear energy requirements. WINES follows a top-down approach in which economic growth rates, delivered energy demand growth rates, and electricity demand are projected successively to ultimately forecast total nuclear generation and nuclear capacity. WINES could potentially be used to produce forecasts for any country or region in the world. Presently, WINES is being used to generate long-term forecasts for the United States, and for all countries with commercial nuclear programs in the world, excluding countries located in centrally planned economic areas. Projections for the United States are developed for the period from 2010 through 2030, and for other countries for the period starting in 2000 or 2005 (depending on the country) through 2010. EIA uses a pipeline approach to project nuclear capacity for the period between 1990 and the starting year for which the WINES model is used. This approach involves a detailed accounting of existing nuclear generating units and units under construction, their capacities, their actual or estimated time of completion, and the estimated date of retirements. Further detail on this approach can be found in Appendix B of Commercial Nuclear Power 1991: Prospects for the United States and the World
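The top-down chain can be sketched as a few chained multiplications: an economic growth rate drives electricity demand, a nuclear share converts demand to nuclear generation, and a capacity factor converts generation to capacity. All growth rates, shares and factors below are invented for illustration, not WINES inputs:

```python
# Top-down sketch of a WINES-style chain (all figures hypothetical).
gdp_growth = 0.025            # assumed annual economic growth rate
energy_elasticity = 0.6       # demand growth per unit of GDP growth
years = 20

demand = 1000.0               # TWh of electricity demand today
for _ in range(years):
    demand *= 1.0 + gdp_growth * energy_elasticity

nuclear_share = 0.20          # assumed share of electricity from nuclear
capacity_factor = 0.85        # assumed average capacity factor
nuclear_twh = demand * nuclear_share
# Convert TWh/yr of generation to installed GWe of capacity.
capacity_gw = nuclear_twh * 1000.0 / (8760.0 * capacity_factor)
```

Each stage is a point where WINES substitutes a projected rate or share; the pipeline approach mentioned in the abstract replaces this arithmetic with unit-by-unit accounting for the near-term years.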

  17. The EU model evaluation group

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1999-01-01

The model evaluation group (MEG) was launched in 1992, growing out of the Major Technological Hazards Programme with EU/DG XII. The goal of MEG was to improve the culture in which models were developed, particularly by encouraging voluntary model evaluation procedures based on a formalised and consensus protocol. The evaluation was intended to assess the fitness-for-purpose of the models being used as a measure of their quality. The approach adopted focused on developing a generic model evaluation protocol and subsequently targeting it at specific areas of application. Five such developments have been initiated, on heavy gas dispersion, liquid pool fires, gas explosions, human factors and momentum fires. The quality of models is an important element when complying with the 'Seveso Directive' requiring that the safety reports submitted to the authorities comprise an assessment of the extent and severity of the consequences of identified major accidents. Further, the quality of models becomes important in the land use planning process, where the proximity of industrial sites to vulnerable areas may be critical. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)

  18. Long-term outcome of eosinophilic fasciitis : A cross-sectional evaluation of 35 patients

    NARCIS (Netherlands)

    Mertens, JS; Thurlings, Rogier M; Kievit, Wietske; Seyger, Marieke M B; Radstake, Timothy R D; de Jong, Elke M G J

    BACKGROUND: Eosinophilic fasciitis (EF) is a connective tissue disease with an unknown long-term course. OBJECTIVE: To evaluate presence and determinants of residual disease damage in patients with EF after long-term follow-up. METHODS: Patients with biopsy-proven EF were included for this

  20. Observational uncertainty and regional climate model evaluation: A pan-European perspective

    Science.gov (United States)

Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark A.; Lussana, Cristian; Szepszo, Gabriella

    2017-04-01

Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs), observational data can influence model development itself and, later on, model evaluation, parameter calibration and added value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration, with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results in a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational reference grids, namely (1) the well-established E-OBS dataset, (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that recently became available. In terms of climate models, five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region and the specific performance metric considered. Over most parts of the continent, the influence of the choice of the reference dataset for temperature is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty. For

  1. Mark I containment, short term program. Safety evaluation report

    International Nuclear Information System (INIS)

    1977-12-01

Presented is a Safety Evaluation Report (SER) prepared by the Office of Nuclear Reactor Regulation addressing the Short Term Program (STP) reassessment of the containment systems of operating Boiling Water Reactor (BWR) facilities with the Mark I containment system design. The information presented in this SER establishes the basis for the NRC staff's conclusion that licensed Mark I BWR facilities can continue to operate safely, without undue risk to the health and safety of the public, during an interim period of approximately two years while a methodical, comprehensive Long Term Program (LTP) is conducted. This SER also provides one of the basic foundations for the NRC staff review of the Mark I containment systems for facilities not yet licensed for operation.

  2. Evidence used in model-based economic evaluations for evaluating pharmacogenetic and pharmacogenomic tests: a systematic review protocol.

    Science.gov (United States)

    Peters, Jaime L; Cooper, Chris; Buchanan, James

    2015-11-11

Decision models can be used to conduct economic evaluations of new pharmacogenetic and pharmacogenomic tests to ensure they offer value for money to healthcare systems. These models require a great deal of evidence, yet research suggests the evidence used is diverse and of uncertain quality. By conducting a systematic review, we aim to investigate the test-related evidence used to inform decision models developed for the economic evaluation of genetic tests. We will search electronic databases including MEDLINE, EMBASE and NHS EED to identify model-based economic evaluations of pharmacogenetic and pharmacogenomic tests. The search will not be limited by language or date. Title and abstract screening will be conducted independently by 2 reviewers, with screening of full texts and data extraction conducted by 1 reviewer, and checked by another. Characteristics of the decision problem, the decision model and the test evidence used to inform the model will be extracted. Specifically, we will identify the reported evidence sources for the test-related evidence used, describe the study design and how the evidence was identified. A checklist developed specifically for decision analytic models will be used to critically appraise the models described in these studies. Variations in the test evidence used in the decision models will be explored across the included studies, and we will identify gaps in the evidence in terms of both quantity and quality. The findings of this work will be disseminated via a peer-reviewed journal publication and at national and international conferences. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  3. Source Term Model for Fine Particle Resuspension from Indoor Surfaces

    National Research Council Canada - National Science Library

    Kim, Yoojeong; Gidwani, Ashok; Sippola, Mark; Sohn, Chang W

    2008-01-01

    This Phase I effort developed a source term model for particle resuspension from indoor surfaces to be used as a source term boundary condition for CFD simulation of particle transport and dispersion in a building...

  4. A Multi-Stage Maturity Model for Long-Term IT Outsourcing Relationship Success

    Science.gov (United States)

    Luong, Ming; Stevens, Jeff

    2015-01-01

    The Multi-Stage Maturity Model for Long-Term IT Outsourcing Relationship Success, a theoretical stages-of-growth model, explains long-term success in IT outsourcing relationships. Research showed the IT outsourcing relationship life cycle consists of four distinct, sequential stages: contract, transition, support, and partnership. The model was…

  5. The EMEFS model evaluation. An interim report

    Energy Technology Data Exchange (ETDEWEB)

    Barchet, W.R. [Pacific Northwest Lab., Richland, WA (United States); Dennis, R.L. [Environmental Protection Agency, Research Triangle Park, NC (United States); Seilkop, S.K. [Analytical Sciences, Inc., Durham, NC (United States); Banic, C.M.; Davies, D.; Hoff, R.M.; Macdonald, A.M.; Mickle, R.E.; Padro, J.; Puckett, K. [Atmospheric Environment Service, Downsview, ON (Canada); Byun, D.; McHenry, J.N. [Computer Sciences Corp., Research Triangle Park, NC (United States); Karamchandani, P.; Venkatram, A. [ENSR Consulting and Engineering, Camarillo, CA (United States); Fung, C.; Misra, P.K. [Ontario Ministry of the Environment, Toronto, ON (Canada); Hansen, D.A. [Electric Power Research Inst., Palo Alto, CA (United States); Chang, J.S. [State Univ. of New York, Albany, NY (United States). Atmospheric Sciences Research Center

    1991-12-01

    The binational Eulerian Model Evaluation Field Study (EMEFS) consisted of several coordinated data gathering and model evaluation activities. In the EMEFS, data were collected by five air and precipitation monitoring networks between June 1988 and June 1990. Model evaluation is continuing. This interim report summarizes the progress made in the evaluation of the Regional Acid Deposition Model (RADM) and the Acid Deposition and Oxidant Model (ADOM) through the December 1990 completion of a State of Science and Technology report on model evaluation for the National Acid Precipitation Assessment Program (NAPAP). Because various assessment applications of RADM had to be evaluated for NAPAP, the report emphasizes the RADM component of the evaluation. A protocol for the evaluation was developed by the model evaluation team and defined the observed and predicted values to be used and the methods by which the observed and predicted values were to be compared. Scatter plots and time series of predicted and observed values were used to present the comparisons graphically. Difference statistics and correlations were used to quantify model performance. 64 refs., 34 figs., 6 tabs.
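Difference statistics of the kind named above (mean bias, RMSE and correlation between predicted and observed values) are straightforward to compute. The sketch below illustrates them on invented deposition values, not EMEFS data:

```python
import numpy as np

def difference_statistics(observed, predicted):
    """Simple model-performance measures of the kind used in such
    evaluations: mean bias, RMSE, and Pearson correlation.
    (Illustrative only; the EMEFS protocol defines its own metrics.)"""
    obs = np.asarray(observed, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    bias = np.mean(pred - obs)                  # mean difference
    rmse = np.sqrt(np.mean((pred - obs) ** 2))  # root-mean-square error
    corr = np.corrcoef(obs, pred)[0, 1]         # Pearson correlation
    return {"bias": bias, "rmse": rmse, "corr": corr}

# Hypothetical weekly wet-deposition values (units arbitrary)
obs = [1.2, 0.8, 2.5, 1.9, 0.4]
pred = [1.0, 1.1, 2.2, 2.4, 0.6]
stats = difference_statistics(obs, pred)
print(stats)
```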

  6. Spectral model for long-term computation of thermodynamics and potential evaporation in shallow wetlands

    Science.gov (United States)

    de la Fuente, Alberto; Meruane, Carolina

    2017-09-01

    Altiplanic wetlands are unique ecosystems located in the elevated plateaus of Chile, Argentina, Peru, and Bolivia. These ecosystems are under threat due to changes in land use, groundwater extractions, and climate change that will modify the water balance through changes in precipitation and evaporation rates. Long-term prediction of the fate of aquatic ecosystems imposes computational constraints that make finding a solution impossible in some cases. In this article, we present a spectral model for long-term simulations of the thermodynamics of shallow wetlands in the limit case when the water depth tends to zero. This spectral model solves for water and sediment temperature, as well as heat, momentum, and mass exchanged with the atmosphere. The parameters of the model (water depth, thermal properties of the sediments, and surface albedo) and the atmospheric downscaling were calibrated using the MODIS product of the land surface temperature. Moreover, the performance of the daily evaporation rates predicted by the model was evaluated against daily pan evaporation data measured between 1964 and 2012. The spectral model was able to correctly represent both seasonal fluctuation and climatic trends observed in daily evaporation rates. It is concluded that the spectral model presented in this article is a suitable tool for assessing the global climate change effects on shallow wetlands whose thermodynamics is forced by heat exchanges with the atmosphere and modulated by the heat-reservoir role of the sediments.

  7. EVALUATING SHORT-TERM CLIMATE VARIABILITY IN THE LATE HOLOCENE OF THE NORTHERN GREAT PLAINS

    Energy Technology Data Exchange (ETDEWEB)

    Joseph H. Hartman

    1999-09-01

    Great Plains, northern hemisphere, and elsewhere. Finally, these data can be integrated into a history of climate change and predictive climate models. This is not a small undertaking. The goals of researchers and the methods used vary considerably. The primary task of this project was literature research to (1) evaluate existing methodologies used in geologic climate-change studies and the evidence for short-term cycles produced by these methodologies, and (2) evaluate late Holocene climate patterns and their interpretations.

  8. Short-Term Memory and Its Biophysical Model

    Science.gov (United States)

    Wang, Wei; Zhang, Kai; Tang, Xiao-wei

    1996-12-01

    The capacity of short-term memory has been studied using an integrate-and-fire neuronal network model. It is found that the storage of events depends on the manner in which the events are correlated, and that the capacity is dominated by the value of the after-depolarization potential: there is a monotonically increasing relationship between the after-depolarization potential and the number of stored memories. The biophysical relevance of the network model is discussed and different kinds of information processing are also studied.
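As a rough illustration of the mechanism described, the toy sketch below simulates a single leaky integrate-and-fire neuron whose post-spike reset sits above the resting potential, mimicking an after-depolarization (ADP). All parameter values are invented and this is not the paper's network model; it only shows why a larger ADP makes re-firing easier:

```python
# Minimal leaky integrate-and-fire neuron (toy sketch). After a spike
# the membrane is reset to v_rest + v_adp, i.e. above rest, mimicking
# an after-depolarization that shortens the interspike interval.
def simulate_lif(i_ext=1.6, v_rest=0.0, v_thresh=1.0, v_adp=0.3,
                 tau=10.0, dt=0.1, t_max=200.0):
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        # Euler step of dv/dt = (-(v - v_rest) + i_ext) / tau
        v += dt * (-(v - v_rest) + i_ext) / tau
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_rest + v_adp   # reset above rest: the ADP term
    return spikes

with_adp = simulate_lif(v_adp=0.3)
without_adp = simulate_lif(v_adp=0.0)
print(len(with_adp), len(without_adp))
```

With the ADP reset the neuron reaches threshold sooner after each spike, so it fires more often over the same interval.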

  9. An Adapted Porter Diamond Model for the Evaluation of Transnational Education Host Countries

    Science.gov (United States)

    Tsiligiris, Vangelis

    2018-01-01

    Purpose: The purpose of this paper is to propose an adapted Porter Diamond Model (PDM) that can be used by transnational education (TNE) countries and institutions as an analytical framework for the strategic evaluation of TNE host countries in terms of attractiveness for exporting higher education. Design/methodology/approach: The study uses a…

  10. Radiological consequence evaluation of DBAs with alternative source term method for a Chinese PWR

    International Nuclear Information System (INIS)

    Li, J.X.; Cao, X.W.; Tong, L.L.; Huang, G.F.

    2012-01-01

    Highlights: ► Radiological consequence evaluation of DBAs with the alternative source term method for a Chinese 900 MWe PWR has been investigated. ► Six typical DBA sequences are analyzed. ► The doses at the control room, EAB and outer boundary of the LPZ are acceptable. ► The differences between the AST method and the TID-14844 method are investigated. - Abstract: Because a large amount of fission products may be released into the environment during accident progression in nuclear power plants (NPPs), posing a potential hazard to the public, the radiological consequences of such accidents should be evaluated. In most Chinese NPPs the TID-14844 method, in which whole-body and thyroid doses are employed as dose criteria, is currently adopted to evaluate the radiological consequences of design-basis accidents (DBAs). Because the alternative radiological source term (AST) method instead employs the total effective dose equivalent as its dose criterion, it is necessary to evaluate the radiological consequences of DBAs with the AST method and to examine the differences between the two methods. Using an integral safety analysis code, an analytical model of the 900 MWe pressurized water reactor (PWR) is built and the radiological consequences of DBAs at the control room (CR), exclusion area boundary (EAB) and low population zone (LPZ) are analyzed following the guidance of RG 1.183. The accidents include LOCA and non-LOCA DBAs such as the fuel handling accident (FHA), rod ejection accident (REA), main steam line break (MSLB), steam generator tube rupture (SGTR) and locked rotor accident (LRA). The results show that the doses at the CR, EAB and LPZ are acceptable compared with the dose criteria in RG 1.183, and the differences between the AST method and the TID-14844 method are also discussed.

  11. An ARM data-oriented diagnostics package to evaluate the climate model simulation

    Science.gov (United States)

    Zhang, C.; Xie, S.

    2016-12-01

    A set of diagnostics that utilize long-term high frequency measurements from the DOE Atmospheric Radiation Measurement (ARM) program is developed for evaluating the regional simulation of clouds, radiation and precipitation in climate models. The diagnostics results are computed and visualized automatically in a python-based package that aims to serve as an easy entry point for evaluating climate simulations using the ARM data, as well as the CMIP5 multi-model simulations. Basic performance metrics are computed to measure the accuracy of mean state and variability of simulated regional climate. The evaluated physical quantities include vertical profiles of clouds, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, radiative fluxes, aerosol and cloud microphysical properties. Process-oriented diagnostics focusing on individual cloud and precipitation-related phenomena are developed for the evaluation and development of specific model physical parameterizations. Application of the ARM diagnostics package will be presented in the AGU session. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, IM release number is: LLNL-ABS-698645.

  12. Short-term carcinogenesis evaluation of Casearia sylvestris

    Directory of Open Access Journals (Sweden)

    Cleide A.S. Tirloni

    Full Text Available Abstract Casearia sylvestris Sw., Salicaceae, is an important medicinal plant widely used in Brazil for the treatment of various cardiovascular disorders. This species was included as of interest by Brazilian Unified Health System. Although preclinical studies described cardiovascular protective effects and apparent absence of toxicity, no studies have evaluated its carcinogenic potential. In this study, we proposed a short-term carcinogenesis evaluation of C. sylvestris in Wistar rats, aiming to check the safety of this species to use it as proposed by Brazilian Unified Health System. C. sylvestris leaves were obtained and the crude extract was prepared by maceration from methanol/water. Wistar rats were orally treated for 12 weeks with 50, 250 or 500 mg kg−1 of crude extract or vehicle. Body weight, daily morbidity and mortality were monitored. Blood and bone marrow samples were collect for micronucleus test, comet assay and tumor markers evaluation. Vital organs were removed to macro and histopathological analyses. The crude extract did not induce mutagenic and genotoxic effects and no alterations were observed in important tumor markers. Finally, no detectable signs of injury through gross pathology or histopathological examinations were observed. Our results certify the absence of the crude extract toxicity, indicating its safety, even at prolonged exposure as proposed by Brazilian Unified Health System.

  13. Testing the Bivalent Fear of Evaluation Model of Social Anxiety: The Relationship between Fear of Positive Evaluation, Social Anxiety, and Perfectionism.

    Science.gov (United States)

    Yap, Keong; Gibbs, Amy L; Francis, Andrew J P; Schuster, Sharynn E

    2016-01-01

    The Bivalent Fear of Evaluation (BFOE) model of social anxiety proposes that fear of negative evaluation (FNE) and fear of positive evaluation (FPE) play distinct roles in social anxiety. Research is, however, lacking on how FPE is related to perfectionism and how these constructs interact to predict social anxiety. Participants were 382 individuals from the general community, including an oversampling of individuals with social anxiety. Measures of FPE, FNE, perfectionism, and social anxiety were administered. Results were mostly consistent with the predictions of the BFOE model and showed that, after accounting for confounding variables, FPE correlated negatively with high standards but positively with maladaptive perfectionism. FNE was also positively correlated with maladaptive perfectionism, but there was no significant relationship between FNE and high standards. Also consistent with the BFOE model, both FNE and FPE significantly moderated the relationship between maladaptive perfectionism and social anxiety, with the relationship strengthened at high levels of FPE and FNE. These findings provide additional support for the BFOE model, and implications are discussed.

  14. Long-Term Evaluation of SSL Field Performance in Select Interior Projects

    Energy Technology Data Exchange (ETDEWEB)

    Perrin, Tess E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Davis, Robert G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wilkerson, Andrea M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-02-28

    This GATEWAY project evaluated four field installations to better understand the long-term performance of a number of LED products, which can hopefully stimulate improvements in designing, manufacturing, specifying, procuring, and installing LED products. Field studies provide the opportunity to discover and investigate issues that cannot be simulated or uncovered in a laboratory, but the installed performance over time of commercially available LED products has not been well documented. Improving long-term performance can provide both direct energy savings by reducing the need to over-light to account for light loss and indirect energy savings through better market penetration due to SSL’s competitive advantages over less-efficient light source technologies. The projects evaluated for this report illustrate that SSL use is often motivated by advantages other than energy savings, including maintenance savings, easier integration with control systems, and improved lighting quality.

  15. Evaluation of Cyber Security and Modelling of Risk Propagation with Petri Nets

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka

    2017-02-01

    Full Text Available This article presents a new method of risk propagation among associated elements. On the basis of coloured Petri nets, a new class called propagation nets is defined. This class provides a formal model of risk propagation. The proposed method allows for modelling relations between nodes forming the network structure. Additionally, it takes into account the bidirectional relations between components as well as relations between isomorphic, symmetrical components in various branches of the network. This method is agnostic in terms of use in various systems and it can be adapted to the propagation model of any system's characteristics; however, it is intentionally proposed to assess the risk of critical infrastructures. In this paper, as a proof-of-concept example, we show the formal model of risk propagation proposed within the project Cyberspace Security Threats Evaluation System of the Republic of Poland. The idea of the method is presented, as well as a use case for the evaluation of risk from cyber threats. With the adaptation of Petri nets, it is possible to evaluate the risk for a particular node and assess the impact of this risk on all related nodes, including hierarchic relations of components as well as isomorphism of elements.
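The propagation idea can be caricatured without the coloured-Petri-net machinery: the sketch below spreads risk along weighted edges of a small dependency graph until a fixed point is reached. Node names, weights and the max-based update rule are all invented for illustration; the paper's propagation nets are far richer:

```python
# Much-simplified illustration of risk propagation over a dependency
# graph (NOT the paper's coloured-Petri-net formalism): iterate until
# node risks stop changing.
def propagate_risk(base_risk, edges, rounds=50):
    """base_risk: {node: intrinsic risk in [0, 1]}
    edges: {(src, dst): weight in [0, 1]} -- may include both directions."""
    risk = dict(base_risk)
    for _ in range(rounds):
        updated = {}
        for node in risk:
            # A node keeps its intrinsic risk, raised by the strongest
            # attenuated risk flowing in from its neighbours, capped at 1.
            inflow = max((risk[src] * w for (src, dst), w in edges.items()
                          if dst == node), default=0.0)
            updated[node] = min(1.0, max(base_risk[node], inflow))
        risk = updated
    return risk

base = {"router": 0.8, "server": 0.1, "db": 0.05}
links = {("router", "server"): 0.5, ("server", "db"): 0.6,
         ("db", "server"): 0.2}   # bidirectional server <-> db relation
result = propagate_risk(base, links)
print(result)
```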

  16. Evaluation of the long-term power generation mix: The case study of South Korea's energy policy

    International Nuclear Information System (INIS)

    Min, Daiki; Chung, Jaewoo

    2013-01-01

    This paper presents a practical portfolio model for the long-term power generation mix problem. The proposed model optimizes the power generation mix by striking a trade-off between the expected cost of power generation and its variability. We use Monte Carlo simulation techniques to consider the uncertainty associated with future electricity demand, fuel prices and their correlations, and the capital costs of power plants. Unlike conventional power generation mix models, we employ CVaR (Conditional Value-at-Risk) as the variability measure in order to account for events that are rare but enormously expensive. A comprehensive analysis of South Korea's generation policy using the portfolio model shows that substituting a portion of nuclear energy with other alternatives incurs a large additional annual cost. Nonetheless, if Korea has to reduce its dependency on nuclear energy because of reduced social acceptance following the Fukushima disaster, it turns out that LNG or coal could be a secure candidate from an economic perspective. - Author-Highlights: • We develop a stochastic optimization model for the long-term power generation mix. • A Monte Carlo sampling method and scenario trees are used to solve the model. • The model is verified using data provided by the Korean government. • We evaluate Korea's existing nuclear expansion policy. • We analyze the cost of replacing nuclear energy with others in South Korea
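The cost-versus-tail-risk idea is easy to demonstrate in miniature. The sketch below simulates annual system cost for two generation mixes under uncertain fuel prices and reports expected cost and CVaR; all unit costs and volatilities are invented, and this is not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_costs(mix, n=10_000):
    """Monte Carlo system cost for a (nuclear, lng, coal) share vector.
    Hypothetical unit costs ($/MWh): mean and volatility per source."""
    means = np.array([35.0, 80.0, 50.0])
    vols = np.array([3.0, 25.0, 10.0])      # LNG is the most volatile
    prices = rng.normal(means, vols, size=(n, 3))
    return prices @ np.asarray(mix)

def cvar(costs, alpha=0.95):
    """Average of the worst (1 - alpha) tail of the cost distribution."""
    var = np.quantile(costs, alpha)
    return costs[costs >= var].mean()

c_nuclear_heavy = simulate_costs((0.5, 0.2, 0.3))
c_lng_heavy = simulate_costs((0.2, 0.5, 0.3))
for name, c in [("nuclear-heavy", c_nuclear_heavy), ("lng-heavy", c_lng_heavy)]:
    print(name, round(c.mean(), 1), round(cvar(c), 1))
```

A portfolio model of this kind would then search over mixes, trading off the mean against the CVaR term.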

  17. A BHLS-model-based moment analysis of muon g-2 and its use for lattice QCD evaluations of a_μ^had

    International Nuclear Information System (INIS)

    Benayoun, M.; DelBuono, L.

    2016-05-01

    We present an up-to-date analysis of muon g-2 evaluations in terms of Mellin-Barnes moments as they might be useful for lattice QCD calculations of a_μ. The moments up to 4th order are evaluated directly in terms of e+e−-annihilation data and improved within the Hidden Local Symmetry (HLS) model, supplied with appropriate symmetry breaking mechanisms. The model provides a reliable Effective Lagrangian (BHLS) estimate of the two-body channels plus the πππ channel up to 1.05 GeV, just including the φ resonance. The HLS piece accounts for 80% of the contribution to a_μ. The missing pieces are evaluated in the standard way directly in terms of the data. We find that the moment expansion converges well in terms of a few moments. The two types of moments which show up in the Mellin-Barnes representation are calculated in terms of hadronic cross-section data in the timelike region and in terms of the hadronic vacuum polarization (HVP) function in the spacelike region, which is accessible to lattice QCD (LQCD). In the Euclidean region the first type of moments are the usual Taylor coefficients of the HVP, and we show that the second type of moments may be obtained as integrals over the appropriately Taylor-truncated HVP function. Specific results for the isovector part of a_μ^had are determined by means of HLS model predictions in close relation to τ-decay spectra.
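For orientation, moment analyses of this kind start from the standard dispersive representation of the leading-order hadronic vacuum polarization contribution (a textbook formula, not specific to this paper):

```latex
a_\mu^{\mathrm{had,LO}} \;=\; \frac{\alpha^2}{3\pi^2}
  \int_{s_{\mathrm{thr}}}^{\infty}\frac{\mathrm{d}s}{s}\,K(s)\,R(s),
\qquad
R(s) \;=\; \frac{\sigma(e^+e^-\to\mathrm{hadrons})}{\sigma(e^+e^-\to\mu^+\mu^-)},
```

where K(s) is the known, slowly varying QED kernel (behaving as m_μ²/(3s) at large s); the Mellin-Barnes moments discussed above arise from a moment expansion of this integral.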

  18. Maximal monotone model with delay term of convolution

    Directory of Open Access Journals (Sweden)

    Claude-Henri Lamarque

    2005-01-01

    Full Text Available Mechanical models are governed either by partial differential equations with boundary conditions and initial conditions (e.g., in the frame of continuum mechanics) or by ordinary differential equations with initial conditions (e.g., after discretization via a Galerkin procedure, or directly from the model description). In order to study the dynamical behavior of mechanical systems with a finite number of degrees of freedom including nonsmooth terms (e.g., friction), we consider here problems governed by differential inclusions. To describe the effects of particular constitutive laws, we add a delay term. In contrast to previous papers, we introduce the delay via a Volterra kernel. We provide existence and uniqueness results by using an implicit Euler numerical scheme; convergence, with its order, is then established. A few numerical examples are given.
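A minimal sketch of the numerical idea, on a smooth scalar toy equation rather than the paper's differential inclusion: an implicit Euler step for x'(t) = -x(t) + ∫₀ᵗ k(t-s) x(s) ds with an exponential Volterra kernel, where the stiff local term is treated implicitly and the memory (convolution) term is built from already-computed past values, so each step reduces to solving one linear equation. Kernel and parameters are invented:

```python
import math

# Implicit Euler for x' = -x + \int_0^t k(t - s) x(s) ds with the
# exponential Volterra kernel k(u) = c * exp(-u). The convolution at
# t_{n+1} is approximated with a rectangle rule over known past values.
def solve(c=0.5, x0=1.0, h=0.01, t_max=5.0):
    xs = [x0]
    n_steps = int(t_max / h)
    for n in range(n_steps):
        t_next = (n + 1) * h
        # Delay (memory) term: explicit, from already-computed history
        conv = h * sum(c * math.exp(-(t_next - i * h)) * xs[i]
                       for i in range(n + 1))
        # Implicit Euler: x_{n+1} = x_n + h * (-x_{n+1} + conv)
        x_next = (xs[-1] + h * conv) / (1.0 + h)
        xs.append(x_next)
    return xs

xs = solve()
print(round(xs[-1], 4))
```

The positive memory term slows the decay: with c = 0 the scheme reduces to plain implicit Euler for x' = -x.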

  19. Hemispheric specialisation in selective attention and short-term memory: A fine-coarse model of left and right ear disadvantages

    Directory of Open Access Journals (Sweden)

    John E. Marsh

    2013-12-01

    Full Text Available Serial short-term memory is impaired by irrelevant sound, particularly when the sound changes acoustically. This acoustic effect is larger when the sound is presented to the left compared to the right ear (a left-ear disadvantage. Serial memory appears relatively insensitive to distraction from the semantic properties of a background sound. In contrast, short-term free recall of semantic-category exemplars is impaired by the semantic properties of background speech and relatively insensitive to the sound’s acoustic properties. This semantic effect is larger when the sound is presented to the right compared to the left ear (a right-ear disadvantage. In this paper, we outline a speculative neurocognitive fine-coarse model of these hemispheric differences in relation to short-term memory and selective attention, and explicate empirical directions in which this model can be critically evaluated.

  20. Biological ensemble modeling to evaluate potential futures of living marine resources

    DEFF Research Database (Denmark)

    Gårdmark, Anna; Lindegren, Martin; Neuenfeldt, Stefan

    2013-01-01

    ) as an example. The core of the approach is to expose an ensemble of models with different ecological assumptions to climate forcing, using multiple realizations of each climate scenario. We simulated the long-term response of cod to future fishing and climate change in seven ecological models ranging from...... model assumptions from the statistical uncertainty of future climate, and (3) identified results common for the whole model ensemble. Species interactions greatly influenced the simulated response of cod to fishing and climate, as well as the degree to which the statistical uncertainty of climate...... in all models, intense fishing prevented recovery, and climate change further decreased the cod population. Our study demonstrates how the biological ensemble modeling approach makes it possible to evaluate the relative importance of different sources of uncertainty in future species responses, as well...

  1. Towards a benchmark simulation model for plant-wide control strategy performance evaluation of WWTPs

    DEFF Research Database (Denmark)

    Jeppsson, Ulf; Rosen, Christian; Alex, Jens

    2006-01-01

    The COST/IWA benchmark simulation model has been available for seven years. Its primary purpose has been to create a platform for control strategy benchmarking of activated sludge processes. The fact that the benchmark has resulted in more than 100 publications, not only in Europe but also...... worldwide, demonstrates the interest in such a tool within the research community. In this paper, an extension of the benchmark simulation model no 1 (BSM1) is proposed. This extension aims at facilitating control strategy development and performance evaluation at a plant-wide level and, consequently...... the changes, the evaluation period has been extended to one year. A prolonged evaluation period allows for long-term control strategies to be assessed and enables the use of control handles that cannot be evaluated in a realistic fashion in the one-week BSM1 evaluation period. In the paper, the extended plant...

  2. Numerical modelling of the long-term evolution of EDZ. Development of material models, implementation in finite-element codes, and validation

    International Nuclear Information System (INIS)

    Pudewills, A.

    2005-11-01

    Construction of deep underground structures disturbs the initial stress field in the surrounding rock. This effect can generate microcracks and alter the hydromechanical properties of the rock salt around the excavations. For the long-term performance of an underground repository in rock salt, the evolution of the 'Excavation Disturbed Zone' (EDZ) and the hydromechanical behaviour of this zone represent important issues with respect to the integrity of the geological and technical barriers. Within the framework of the NF-PRO project, WP 4.4, attention focuses on the mathematical modelling of the development and evolution of the EDZ in the rock near a disposal drift, owing to its relevance to the integrity of the geological and technical barriers. To perform this task, finite-element codes containing a set of time- and temperature-dependent constitutive models have been improved. A new viscoplastic constitutive model for rock salt that can describe the damage of the rock has been implemented in the finite-element codes available. The model parameters were evaluated based on experimental results. Additionally, the long-term evolution of the EDZ around a gallery in a salt mine at about 700 m below the surface was analysed and the numerical results were compared with in-situ measurements. The calculated room closure, stress distribution and the increase of rock permeability in the EDZ were compared with in-situ data, thus providing confidence in the model used. (orig.)

  3. Evaluation of an objective plan-evaluation model in the three dimensional treatment of nonsmall cell lung cancer

    International Nuclear Information System (INIS)

    Graham, Mary V.; Jain, Nilesh L.; Kahn, Michael G.; Drzymala, Robert E.; Purdy, James A.

    1996-01-01

    Purpose: Evaluation of three dimensional (3D) radiotherapy plans is difficult because it requires the review of vast amounts of data. Selecting the optimal plan from a set of competing plans involves making trade-offs among the doses delivered to the target volumes and normal tissues. The purpose of this study was to test an objective plan-evaluation model and evaluate its clinical usefulness in 3D treatment planning for nonsmall cell lung cancer. Methods and Materials: Twenty patients with inoperable nonsmall cell lung cancer treated with definitive radiotherapy were studied using full 3D techniques for treatment design and implementation. For each patient, the evaluator (the treating radiation oncologist) initially ranked three plans using room-view dose-surface displays and dose-volume histograms, and identified the issues that needed to be improved. The three plans were then ranked by the objective plan-evaluation model. A figure of merit (FOM) was computed for each plan by combining the numerical score (utility in decision-theoretic terms) for each clinical issue. The utility was computed from a probability of occurrence of the issue and a physician-specific weight indicating its clinical relevance. The FOM was used to rank the competing plans for a patient, and the utility was used to identify issues that needed to be improved. These were compared with the initial evaluations of the physician and discrepancies were analyzed. The issues identified in the best treatment plan were then used to attempt further manual optimization of this plan. Results: For the 20 patients (60 plans) in the study, the final plan ranking produced by the plan-evaluation model had an initial 73% agreement with the ranking provided by the evaluator. After discrepant cases were reviewed by the physician, the model was usually judged more objective or 'correct'. In most cases the model was also able to correctly identify the issues that needed improvement in each plan. Subsequent
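The figure-of-merit idea described above can be sketched in a few lines: each clinical issue contributes a utility given by its probability of occurrence times a physician-assigned weight, and plans are ranked by the combined score. The issue probabilities and weights below are made up for illustration, not taken from the study:

```python
# Illustrative figure-of-merit ranking (invented numbers, not the
# study's data): each potential clinical issue subtracts an expected
# "cost" (probability x physician weight), so higher FOM is better.
def figure_of_merit(issues):
    """issues: list of (probability_of_issue, physician_weight)."""
    return -sum(p * w for p, w in issues)

plans = {
    "plan_A": [(0.10, 5.0), (0.40, 2.0)],   # (prob, weight) per issue
    "plan_B": [(0.05, 5.0), (0.20, 2.0)],
    "plan_C": [(0.30, 5.0), (0.10, 2.0)],
}
ranking = sorted(plans, key=lambda name: figure_of_merit(plans[name]),
                 reverse=True)
print(ranking)  # best plan first
```

The per-issue utilities also show which issue drags a plan down, which is how such a model can point at what to improve.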

  4. Development of remediation/decontamination strategies reflecting local conditions by the EU long-term radiation exposure model for inhabited areas (ERMIN)

    International Nuclear Information System (INIS)

    Sakuma, Kazuyuki; Nanba, Kenji; Terada, Akihiko; Hosomi, Masaaki

    2015-01-01

    The European model for inhabited areas (ERMIN), developed for the prediction of radioactive contamination after the Chernobyl accident, was applied at Tomioka in Fukushima Prefecture as a model region for decontamination in order to investigate its feasibility. The application of ERMIN to eight compartments of 100 x 100 m each in this region, where decontamination was actually performed, confirmed that observed air dose rates fell within the calculated counterparts for two months, irrespective of the presence or absence of an environmental half-life term. With simulation settings capable of reproducing the air dose rate during the decontamination period, decontamination strategies incorporating five evaluation items, i.e. reduction of radiation exposure to inhabitants, cost of decontamination, amount of waste, work effort, and radiation exposure to workers, were proposed and compared. By initiating decontamination 9 months after the accident in Fukushima, radiation exposure to inhabitants who continue to live in the modeled region is reduced by about 24 mSv over the ensuing 15 months. Furthermore, decontamination strategies were compared by prioritizing the five evaluation items. (author)

  5. Modelling of long-term and short-term mechanisms of arterial pressure control in the cardiovascular system: an object-oriented approach.

    Science.gov (United States)

    Fernandez de Canete, J; Luque, J; Barbancho, J; Munoz, V

    2014-04-01

    A mathematical model that provides an overall description of both the short- and long-term mechanisms of arterial pressure regulation is presented. Short-term control is exerted through the baroreceptor reflex while renal elimination plays a role in long-term control. Both mechanisms operate in an integrated way over the compartmental model of the cardiovascular system. The whole system was modelled in MODELICA, which uses a hierarchical object-oriented modelling strategy, under the DYMOLA simulation environment. The performance of the controlled system was analysed by simulation in light of the existing hypothesis and validation tests previously performed with physiological data, demonstrating the effectiveness of both regulation mechanisms under physiological and pathological conditions. Copyright © 2014 Elsevier Ltd. All rights reserved.
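The interplay of the two mechanisms can be caricatured as a fast proportional controller (the baroreflex) plus a slow integral controller (renal elimination) acting on arterial pressure. All numbers below are invented, and this toy ODE is not the authors' MODELICA model; it only shows why the slow loop is needed to remove a sustained offset:

```python
# Toy two-time-scale pressure regulation: a persistent disturbance d
# raises pressure; the fast proportional term (baroreflex) limits the
# rise, and the slow integral term (renal) removes the residual error.
def simulate(p_set=100.0, d=20.0, k_fast=2.0, k_slow=0.05,
             dt=0.01, t_max=400.0):
    p = p_set
    renal = 0.0                      # slowly accumulating correction
    for _ in range(int(t_max / dt)):
        error = p - p_set
        renal += dt * k_slow * error             # long-term (integral)
        p += dt * (d - k_fast * error - renal)   # short-term (proportional)
    return p

print(round(simulate(), 2), round(simulate(k_slow=0.0), 2))
```

With the renal term switched off (k_slow = 0) the pressure settles at a sustained offset of d / k_fast above the set point; with it on, the offset is eventually eliminated.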

  6. A report on evaluation of research and development subjects in fiscal year 2001. Evaluation subject on the 'Middle- and long-term business program'

    International Nuclear Information System (INIS)

    2001-09-01

    The middle- and long-term business program determined by the Japan Nuclear Cycle Development Institute (JNC) elucidates the middle- and long-term targets JNC intends to pursue and forms the basis for promoting its individual R and D activities. The program was to be revised on the occasion of the new long-term plan for the research, development and utilization of nuclear energy established in November 2000 by the Atomic Energy Commission, taking into account changes in circumstances since March 1999. This report summarizes, in the form of opinions, the evaluation results on the present middle- and long-term business program established by JNC, centred on its revised portions. The evaluation results are reported by the two subject evaluation committees: on the fast reactor and fuel cycle, and on waste processing and disposal. (G.K.)

  7. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

    Full Text Available Achieving relatively high-accuracy short-term wind speed forecasts is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected, and random forests (RF are employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that, by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the required training time, and substantially improves the model’s accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.
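The RF-based ranking step can be sketched with scikit-learn (assumed installed) on synthetic data rather than wind-farm measurements: candidate inputs include two truly predictive variables, one redundant near-copy, and pure noise, and the forest's impurity-based importances are used to rank them:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 2000
informative = rng.normal(size=(n, 2))             # truly predictive inputs
redundant = informative[:, :1] + 0.01 * rng.normal(size=(n, 1))
noise = rng.normal(size=(n, 3))                   # uncorrelated candidates
X = np.hstack([informative, redundant, noise])    # 6 candidate variables
y = 2.0 * informative[:, 0] + informative[:, 1] + 0.1 * rng.normal(size=n)

# Rank candidate input variables by random-forest importance
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
order = np.argsort(rf.feature_importances_)[::-1]
print("features ranked by importance:", order)
```

The noise columns land at the bottom of the ranking; a forecasting model would then be trained only on the top-ranked subset.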

  8. Individual model evaluation and probabilistic weighting of models

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-01-01

    This note stresses the importance of trying to assess the accuracy of each model individually. Putting a Bayesian probability distribution on a population of models faces conceptual and practical complications, and apparently can come only after the work of evaluating the individual models. Moreover, the primary issue is "How good is this model?" Therefore, the individual evaluations are first in both chronology and importance. They are not easy, but some ideas are given here on how to perform them.

  9. Modelling the short term herding behaviour of stock markets

    International Nuclear Information System (INIS)

    Shapira, Yoash; Berman, Yonatan; Ben-Jacob, Eshel

    2014-01-01

    Modelling the behaviour of stock markets has been of major interest in the past century. The market can be treated as a network of many investors reacting in accordance with their group behaviour, as manifested by the index and affected by the flow of external information into the system. Here we devise a model that encapsulates the behaviour of stock markets. The model consists of two terms, demonstrating quantitatively the effect of the individual tendency to follow the group and the effect of the individual reaction to the available information. Using the above factors we were able to explain several key features of the stock market: the high correlations between the individual stocks and the index; the Epps effect; and the highly fluctuating nature of the market, which resembles real market behaviour. Furthermore, intricate long-term phenomena are also described by this model, such as bursts of synchronized average correlation and the dominance of the index as demonstrated through partial correlation. (paper)
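The two-term structure described above (a follow-the-group term plus a private-information term) can be illustrated with a toy simulation. The abstract does not give the functional form, so the linear mixing weight `herd` and all parameters below are assumptions for the sketch; the point is only that a stronger group term raises the average inter-stock correlation.

```python
import random

def pearson(a, b):
    """Sample Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def simulate(n_stocks, n_steps, herd, seed=1):
    """Toy two-term returns: herd * common group signal + (1 - herd) * private news."""
    rng = random.Random(seed)
    rets = [[0.0] * n_steps for _ in range(n_stocks)]
    for t in range(n_steps):
        group = rng.gauss(0, 1)          # the shared signal all investors react to
        for i in range(n_stocks):
            rets[i][t] = herd * group + (1 - herd) * rng.gauss(0, 1)
    return rets

def mean_pairwise_corr(rets):
    cs = [pearson(a, b) for i, a in enumerate(rets) for b in rets[i + 1:]]
    return sum(cs) / len(cs)

low = mean_pairwise_corr(simulate(8, 500, herd=0.2))
high = mean_pairwise_corr(simulate(8, 500, herd=0.8))
print(low < high)  # stronger herding -> higher average inter-stock correlation
```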

  10. A Parametric Factor Model of the Term Structure of Mortality

    DEFF Research Database (Denmark)

    Haldrup, Niels; Rosenskjold, Carsten Paysen T.

    The prototypical Lee-Carter mortality model is characterized by a single common time factor that loads differently across age groups. In this paper we propose a factor model for the term structure of mortality where multiple factors are designed to influence the age groups differently via...... on the loading functions, the factors are not designed to be orthogonal but can be dependent and can possibly cointegrate when the factors have unit roots. We suggest two estimation procedures similar to the estimation of the dynamic Nelson-Siegel term structure model. First, a two-step nonlinear least squares...... procedure based on cross-section regressions together with a separate model to estimate the dynamics of the factors. Second, we suggest a fully specified model estimated by maximum likelihood via the Kalman filter recursions after the model is put on state space form. We demonstrate the methodology for US...
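The truncated abstract references the dynamic Nelson-Siegel term-structure design without stating which loading functions it adapts. As a hedged sketch under that assumption, the standard Nelson-Siegel level/slope/curvature loadings can be evaluated at "maturities" reinterpreted as ages; the decay parameter `lam = 0.05` is an arbitrary illustrative choice, not a fitted value.

```python
import math

def ns_loadings(tau, lam):
    """Dynamic Nelson-Siegel loading functions at 'maturity' tau
    (here reinterpreted as age, following the paper's term-structure analogy)."""
    slope = (1 - math.exp(-lam * tau)) / (lam * tau)
    return 1.0, slope, slope - math.exp(-lam * tau)

# level loads equally at all ages; slope decays with age; curvature is humped
for age in (5, 25, 45, 65, 85):
    print(age, [round(x, 3) for x in ns_loadings(age, lam=0.05)])
```

In the two-step procedure the abstract describes, loadings like these would first be fit cross-sectionally per period, and the resulting factor series modelled separately.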

  11. Evaluation on the long-term durability and leachability of cemented waste form

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Hong; Lee, Jae Won; Ryue, Young Gerl [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-08-01

    The long-term durability and leachability of cemented waste forms containing boric acid produced in domestic nuclear power plants were evaluated. The compressive strength of the waste form after durability tests such as thermal stability and water immersion cycling was higher than before testing and consistently increased with test time, in the range of 83 to 286 kgf/cm{sup 2}. Long-term leachability was evaluated by standard test methods, leach-affecting factors, and prediction of long-term leachability from short-term leach test data. In all leach tests, the release of Cs-137 was controlled by diffusion, whereas the release of Co-60 was not. The leach rate of Cs-137 was relatively constant across standard leach test methods such as ANS 16.1, IAEA, ISO-6961, and MCC-1, but that of Co-60 increased with leachant-renewal frequency. The leach rates of both Cs-137 and Co-60 increased as the test temperature was raised. The release of Cs-137 decreased in simulated seawater as leachant, but increased with increasing leachant volume. Predictions of the long-term release of Cs-137 from a large-scale waste form using the results of short-term leach tests on a small-scale waste form were within {+-} 5% of the actual release. The leachability indexes of Cs-137 were between 6.5 and 7.5 and those of Co-60 ranged from 11.6 to 13.3, increasing as cumulative leaching time increased. (author). 22 refs., 14 figs., 15 tabs.
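The leachability index reported above follows from a diffusion-controlled release model. As a hedged sketch (the record gives neither the specimen geometry nor the data, so the surface-to-volume ratio, leached fraction and leach time below are invented), an ANS-16.1-style index log10(beta/De) with beta = 1 cm^2/s can be computed from a single cumulative-fraction-leached observation:

```python
import math

BETA = 1.0  # cm^2/s, reference diffusivity used in the ANS 16.1 leachability index

def effective_diffusivity(cfl, t, s_over_v):
    """De from the semi-infinite diffusion model CFL = 2*(S/V)*sqrt(De*t/pi).
    Assumes a single (CFL, t) observation; a real test fits many intervals."""
    slope = cfl / math.sqrt(t)
    return math.pi * (slope / (2.0 * s_over_v)) ** 2

def leachability_index(de):
    return math.log10(BETA / de)

# hypothetical numbers: 1% of Cs-137 leached after 7 days, specimen S/V = 1 cm^-1
de = effective_diffusivity(cfl=0.01, t=7 * 86400.0, s_over_v=1.0)
print(round(leachability_index(de), 1))  # → 9.9, within the typical 6-14 range
```

A higher index means slower release, which is consistent with the abstract's Co-60 indexes (11.6-13.3) exceeding those of the more mobile Cs-137 (6.5-7.5).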

  12. Index-based groundwater vulnerability mapping models using hydrogeological settings: A critical evaluation

    International Nuclear Information System (INIS)

    Kumar, Prashant; Bansod, Baban K.S.; Debnath, Sanjit K.; Thakur, Praveen Kumar; Ghanshyam, C.

    2015-01-01

    Groundwater vulnerability maps are useful for decision making in land use planning and water resource management. This paper reviews the various groundwater vulnerability assessment models developed across the world. Each model has been evaluated in terms of its pros and cons and the environmental conditions of its application. The paper further discusses the validation techniques used for the generated vulnerability maps by various models. Implicit challenges associated with the development of the groundwater vulnerability assessment models have also been identified with scientific considerations to the parameter relations and their selections. - Highlights: • Various index-based groundwater vulnerability assessment models have been discussed. • A comparative analysis of the models and their applicability in different hydrogeological settings has been discussed. • Research problems of underlying vulnerability assessment models are also reported in this review paper.

  13. Index-based groundwater vulnerability mapping models using hydrogeological settings: A critical evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Prashant, E-mail: prashantkumar@csio.res.in [CSIR-Central Scientific Instruments Organisation, Chandigarh 160030 (India); Academy of Scientific and Innovative Research—CSIO, Chandigarh 160030 (India); Bansod, Baban K.S.; Debnath, Sanjit K. [CSIR-Central Scientific Instruments Organisation, Chandigarh 160030 (India); Academy of Scientific and Innovative Research—CSIO, Chandigarh 160030 (India); Thakur, Praveen Kumar [Indian Institute of Remote Sensing (ISRO), Dehradun 248001 (India); Ghanshyam, C. [CSIR-Central Scientific Instruments Organisation, Chandigarh 160030 (India); Academy of Scientific and Innovative Research—CSIO, Chandigarh 160030 (India)

    2015-02-15

    Groundwater vulnerability maps are useful for decision making in land use planning and water resource management. This paper reviews the various groundwater vulnerability assessment models developed across the world. Each model has been evaluated in terms of its pros and cons and the environmental conditions of its application. The paper further discusses the validation techniques used for the generated vulnerability maps by various models. Implicit challenges associated with the development of the groundwater vulnerability assessment models have also been identified with scientific considerations to the parameter relations and their selections. - Highlights: • Various index-based groundwater vulnerability assessment models have been discussed. • A comparative analysis of the models and their applicability in different hydrogeological settings has been discussed. • Research problems of underlying vulnerability assessment models are also reported in this review paper.

  14. Evaluation of oxidative status in short-term exercises of adolescent athletes

    OpenAIRE

    K Karacabey; A Atas; D Zeyrek; A Cakmak; R Kurkcu; F Yamaner

    2010-01-01

    The aim of the study was to evaluate the effects of short-term exercise on total antioxidant status (TAS), lipid hydroperoxide (LOOHs), total oxidative status (TOS) and oxidative stress index (OSI) in adolescent athletes. A total of 62 adolescents participated in the study. The athletes trained regularly 3 days a week for 2 hours. All subjects followed a circuit exercise program. Blood samples were collected just before and immediately after the exercise program. Antioxidant status was evalu...

  15. Rock mechanics models evaluation report

    International Nuclear Information System (INIS)

    1987-08-01

    This report documents the evaluation of the thermal and thermomechanical models and codes for repository subsurface design and for design constraint analysis. The evaluation was based on a survey of the thermal and thermomechanical codes and models that are applicable to subsurface design, followed by a Kepner-Tregoe (KT) structured decision analysis of the codes and models. The primary recommendations of the analysis are that the DOT code be used for two-dimensional thermal analysis and that the STEALTH and HEATING 5/6 codes be used for three-dimensional and complicated two-dimensional thermal analysis. STEALTH and SPECTROM 32 are recommended for thermomechanical analyses. The other evaluated codes should be considered for use in certain applications. A separate review of salt creep models indicates that the commonly used exponential time law model is appropriate for use in repository design studies. 38 refs., 1 fig., 7 tabs

  16. An evaluation of attention models for use in SLAM

    Science.gov (United States)

    Dodge, Samuel; Karam, Lina

    2013-12-01

    In this paper we study the application of visual saliency models for the simultaneous localization and mapping (SLAM) problem. We consider visual SLAM, where the location of the camera and a map of the environment can be generated using images from a single moving camera. In visual SLAM, the interest point detector is of key importance. This detector must be invariant to certain image transformations so that features can be matched across different frames. Recent work has used a model of human visual attention to detect interest points; however, it is unclear which attention model is best for this purpose. To this end, we compare the performance of interest points from four saliency models (Itti, GBVS, RARE, and AWS) with the performance of four traditional interest point detectors (Harris, Shi-Tomasi, SIFT, and FAST). We evaluate these detectors under several different types of image transformation and find that the Itti saliency model, in general, achieves the best performance in terms of keypoint repeatability.
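Keypoint repeatability, the figure of merit in this comparison, can be sketched generically: project the reference keypoints through the known image transformation and count how many are re-detected within a pixel tolerance. The detector, points and tolerance below are placeholders, not the paper's actual setup.

```python
def repeatability(keypoints, transform, detect_transformed, tol=2.0):
    """Fraction of keypoints re-detected (within tol pixels) after a transform."""
    projected = [transform(p) for p in keypoints]
    matched = sum(1 for p in projected
                  if any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= tol * tol
                         for q in detect_transformed))
    return matched / len(keypoints)

# toy check: under a pure translation an ideal detector re-finds every point
pts = [(10.0, 20.0), (30.0, 5.0), (7.0, 7.0)]
shift = lambda p: (p[0] + 3.0, p[1] - 2.0)
detected_after = [shift(p) for p in pts]          # stand-in detector output
print(repeatability(pts, shift, detected_after))  # → 1.0
```

Real evaluations use homographies between frames rather than a translation, but the counting logic is the same.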

  17. Evaluation of animal models of neurobehavioral disorders

    Directory of Open Access Journals (Sweden)

    Nordquist Rebecca E

    2009-02-01

    Full Text Available Abstract Animal models play a central role in all areas of biomedical research. The process of animal model building, development and evaluation has rarely been addressed systematically, despite the long history of using animal models in the investigation of neuropsychiatric disorders and behavioral dysfunctions. An iterative, multi-stage trajectory for developing animal models and assessing their quality is proposed. The process starts with defining the purpose(s) of the model, preferentially based on hypotheses about brain-behavior relationships. Then, the model is developed and tested. The evaluation of the model takes scientific and ethical criteria into consideration. Model development requires a multidisciplinary approach. Preclinical and clinical experts should establish a set of scientific criteria, which a model must meet. The scientific evaluation consists of assessing the replicability/reliability, predictive, construct and external validity/generalizability, and relevance of the model. We emphasize the role of (systematic and extended) replications in the course of the validation process. One may apply a multiple-tiered 'replication battery' to estimate the reliability/replicability, validity, and generalizability of results. Compromised welfare is inherent in many deficiency models in animals. Unfortunately, 'animal welfare' is a vaguely defined concept, making it difficult to establish exact evaluation criteria. Weighing the animal's welfare and considering whether action is indicated to reduce its discomfort must accompany the scientific evaluation at every stage of the model building and evaluation process. Animal model building should be discontinued if the model does not meet the preset scientific criteria, or when animal welfare is severely compromised. The application of the evaluation procedure is exemplified using the rat with neonatal hippocampal lesion as a proposed model of schizophrenia. In a manner congruent to

  18. Application feasibility study of evaluation technology for long-term rock behavior. 2. Parameter setting of variable compliance type model and application feasibility study for rock behavior evaluation

    International Nuclear Information System (INIS)

    Sato, Shin; Noda, Masaru; Niunoya, Sumio; Hata, Koji; Matsui, Hiroya; Mikake, Shinichiro

    2012-01-01

    Creep is one of the long-term rock behaviors. In many rock-creep studies, models and parameters have been verified in 2D analyses using model parameters acquired from uniaxial compression tests and similar tests, taking rock type into account. In this study, therefore, the model parameters were set by uniaxial compression tests on classified rock samples taken from the pilot boring performed when the main shaft was constructed. The measured values were then compared with a 3D excavation analysis using the identified parameters. By and large, the study confirmed the validity of the parameter identification methodology and of the analysis method in reproducing the measured values. (author)

  19. Development of a Watershed-Scale Long-Term Hydrologic Impact Assessment Model with the Asymptotic Curve Number Regression Equation

    Directory of Open Access Journals (Sweden)

    Jichul Ryu

    2016-04-01

    Full Text Available In this study, 52 asymptotic Curve Number (CN) regression equations were developed for combinations of representative land covers and hydrologic soil groups. In addition, to overcome the limitations of the original Long-term Hydrologic Impact Assessment (L-THIA) model when it is applied to larger watersheds, a watershed-scale L-THIA Asymptotic CN (ACN) regression equation model (watershed-scale L-THIA ACN model) was developed by integrating the asymptotic CN regressions and various modules for direct runoff/baseflow/channel routing. The watershed-scale L-THIA ACN model was applied to four watersheds in South Korea to evaluate the accuracy of its streamflow prediction. The coefficient of determination (R2) and Nash–Sutcliffe Efficiency (NSE) values for observed versus simulated streamflows over intervals of eight days were greater than 0.6 for all four of the watersheds. The watershed-scale L-THIA ACN model, including the asymptotic CN regression equation method, can simulate long-term streamflow sufficiently well with the ten parameters that have been added for the characterization of streamflow.
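The asymptotic CN idea can be sketched as follows: a Hawkins-type regression lets the curve number decay from an initial CN0 toward a stable CN_inf as rainfall depth grows, and the fitted CN then feeds the standard SCS-CN runoff equation. The parameter values below are invented placeholders for one land-cover/soil-group combination, not any of the paper's 52 fitted equations.

```python
import math

def asymptotic_cn(p_mm, cn_inf, cn0, k):
    """Hawkins-style asymptotic fit: CN declines toward cn_inf as rainfall grows."""
    return cn_inf + (cn0 - cn_inf) * math.exp(-k * p_mm)

def scs_runoff(p_mm, cn):
    """Classic SCS-CN direct runoff (mm), with initial abstraction Ia = 0.2*S."""
    s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# hypothetical parameters: CN drops from 95 toward 70 with increasing storm depth
for p in (10.0, 50.0, 100.0):
    cn = asymptotic_cn(p, cn_inf=70.0, cn0=95.0, k=0.05)
    print(p, round(cn, 1), round(scs_runoff(p, cn), 1))
```

Using a rainfall-dependent CN like this avoids the systematic over-prediction of runoff for large storms that a single tabulated CN can produce.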

  20. EVALUATION OF LATE ADVERSE EVENTS IN LONG-TERM WILMS' TUMOR SURVIVORS

    NARCIS (Netherlands)

    van Dijk, Irma W. E. M.; Oldenburger, Foppe; Cardous-Ubbink, Mathilde C.; Geenen, Maud M.; Heinen, Richard C.; de Kraker, Jan; van Leeuwen, Flora E.; van der Pal, Helena J. H.; Caron, Huib N.; Koning, Caro C. E.; Kremer, Leontien C. M.

    2010-01-01

    Purpose: To evaluate the prevalence and severity of adverse events (AEs) and treatment-related risk factors in long-term Wilms' tumor (WT) survivors, with special attention to radiotherapy. Methods and Materials: The single-center study cohort consisted of 185 WT survivors treated between 1966 and

  1. Benchmarking the New RESRAD-OFFSITE Source Term Model with DUST-MS and GoldSim - 13377

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, J.J.; Kamboj, S.; Gnanapragasam, E.; Yu, C. [Argonne National Laboratory, Argonne, IL 60439 (United States)

    2013-07-01

    RESRAD-OFFSITE is a computer code developed by Argonne National Laboratory under the sponsorship of U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC). It is designed on the basis of RESRAD (onsite) code, a computer code designated by DOE and NRC for evaluating soil-contaminated sites for compliance with human health protection requirements pertaining to license termination or environmental remediation. RESRAD-OFFSITE has enhanced capabilities of modeling radionuclide transport to offsite locations and calculating potential radiation exposure to offsite receptors. Recently, a new source term model was incorporated into RESRAD-OFFSITE to enhance its capability further. This new source term model allows simulation of radionuclide releases from different waste forms, in addition to the soil sources originally considered in RESRAD (onsite) and RESRAD-OFFSITE codes. With this new source term model, a variety of applications can be achieved by using RESRAD-OFFSITE, including but not limited to, assessing the performance of radioactive waste disposal facilities. This paper presents the comparison of radionuclide release rates calculated by the new source term model of RESRAD-OFFSITE versus those calculated by DUST-MS and GoldSim, respectively. The focus of comparison is on the release rates of radionuclides from the bottom of the contaminated zone that was assumed to contain radioactive source materials buried in soil. The transport of released contaminants outside of the primary contaminated zone is beyond the scope of this paper. Overall, the agreement between the RESRAD-OFFSITE results and the DUST-MS and GoldSim results is fairly good, with all three codes predicting identical or similar radionuclide release profiles over time. Numerical dispersion in the DUST-MS and GoldSim results was identified as potentially contributing to the disagreement in the release rates. In general, greater discrepancy in the release rates was found for short

  2. Benchmarking the New RESRAD-OFFSITE Source Term Model with DUST-MS and GoldSim - 13377

    International Nuclear Information System (INIS)

    Cheng, J.J.; Kamboj, S.; Gnanapragasam, E.; Yu, C.

    2013-01-01

    RESRAD-OFFSITE is a computer code developed by Argonne National Laboratory under the sponsorship of U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC). It is designed on the basis of RESRAD (onsite) code, a computer code designated by DOE and NRC for evaluating soil-contaminated sites for compliance with human health protection requirements pertaining to license termination or environmental remediation. RESRAD-OFFSITE has enhanced capabilities of modeling radionuclide transport to offsite locations and calculating potential radiation exposure to offsite receptors. Recently, a new source term model was incorporated into RESRAD-OFFSITE to enhance its capability further. This new source term model allows simulation of radionuclide releases from different waste forms, in addition to the soil sources originally considered in RESRAD (onsite) and RESRAD-OFFSITE codes. With this new source term model, a variety of applications can be achieved by using RESRAD-OFFSITE, including but not limited to, assessing the performance of radioactive waste disposal facilities. This paper presents the comparison of radionuclide release rates calculated by the new source term model of RESRAD-OFFSITE versus those calculated by DUST-MS and GoldSim, respectively. The focus of comparison is on the release rates of radionuclides from the bottom of the contaminated zone that was assumed to contain radioactive source materials buried in soil. The transport of released contaminants outside of the primary contaminated zone is beyond the scope of this paper. Overall, the agreement between the RESRAD-OFFSITE results and the DUST-MS and GoldSim results is fairly good, with all three codes predicting identical or similar radionuclide release profiles over time. Numerical dispersion in the DUST-MS and GoldSim results was identified as potentially contributing to the disagreement in the release rates. In general, greater discrepancy in the release rates was found for short

  3. Model Evaluation Report for Corrective Action Unit 98: Frenchman Flat, Nevada National Security Site, Nye County, Nevada, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Ruskauff, Greg; Marutzky, Sam

    2014-09-01

    Model evaluation focused solely on the PIN STRIPE and MILK SHAKE underground nuclear tests’ contaminant boundaries (CBs) because they had the largest extent, uncertainty, and potential consequences. The CAMBRIC radionuclide migration experiment also had a relatively large CB, but because it was constrained by transport data (notably Well UE-5n), there was little uncertainty, and radioactive decay reduced concentrations before much migration could occur. Each evaluation target and the associated data-collection activity were assessed in turn to determine whether the new data support, or demonstrate conservatism of, the CB forecasts. The modeling team—in this case, the same team that developed the Frenchman Flat geologic, source term, and groundwater flow and transport models—analyzed the new data and presented the results to a PER committee. Existing site understanding and its representation in numerical groundwater flow and transport models were evaluated in light of the new data and the ability to proceed to the CR stage of long-term monitoring and institutional control.

  4. Mathematical modeling and evaluation of radionuclide transport parameters from the ANL Laboratory Analog Program

    International Nuclear Information System (INIS)

    Chen, B.C.J.; Hull, J.R.; Seitz, M.G.; Sha, W.T.; Shah, V.L.; Soo, S.L.

    1984-07-01

    Computer model simulation is required to evaluate the performance of proposed or future high-level radioactive waste geological repositories. However, the accuracy of a model in predicting the real situation depends on how well the values of the transport properties are prescribed as input parameters. Knowledge of transport parameters is therefore essential. We have modeled ANL's Experiment Analog Program which was designed to simulate long-term radwaste migration process by groundwater flowing through a high-level radioactive waste repository. Using this model and experimental measurements, we have evaluated neptunium (actinide) deposition velocity and analyzed the complex phenomena of simultaneous deposition, erosion, and reentrainment of bentonite when groundwater is flowing through a narrow crack in a basalt rock. The present modeling demonstrates that we can obtain the values of transport parameters, as added information without any additional cost, from the available measurements of laboratory analog experiments. 8 figures, 3 tables

  5. A new global and comprehensive model for ICU ventilator performances evaluation.

    Science.gov (United States)

    Marjanovic, Nicolas S; De Simone, Agathe; Jegou, Guillaume; L'Her, Erwan

    2017-12-01

    This study aimed to provide a new global and comprehensive evaluation of recent ICU ventilators, taking into account both technical performance and ergonomics. Six recent ICU ventilators were evaluated. Technical performance was assessed under two FIO2 levels (100%, 50%), three respiratory mechanics combinations (normal: compliance [C] = 70 mL/cmH2O, resistance [R] = 5 cmH2O/L/s; restrictive: C = 30, R = 10; obstructive: C = 120, R = 20), four exponential levels of leaks (from 0 to 12.5 L/min) and three levels of inspiratory effort (P0.1 = 2, 4 and 8 cmH2O), using an automated test lung. Ergonomics was evaluated by 20 ICU physicians using a global and comprehensive model involving physiological responses to stress (heart rate, respiratory rate, tidal volume variability and eye tracking), psycho-cognitive scales (SUS and NASA-TLX) and objective task completion. Few differences in technical performance were observed between devices. Non-invasive ventilation modes had a strong influence on the occurrence of asynchrony. Using our global model, objective task completion, the psycho-cognitive scales and/or the physiological measurements were each able to reveal significant differences in device usability. The level of failure observed with some devices reflects a lack of adaptation of device development to end users' requests. Despite similar technical performance, some ICU ventilators exhibit poor ergonomics and a high risk of misuse.

  6. Ukraine National Energy Current State and Modelling its Long-Term Development

    International Nuclear Information System (INIS)

    Shulzhenko, S.

    2016-01-01

    This paper describes the structure of the Ukrainian energy sector, its current challenges, the drivers of its development, possible long-term pathways, and methodological approaches and methods for the mathematical modelling of long-term national energy development. (author)

  7. A model for evaluating the environmental benefits of elementary school facilities.

    Science.gov (United States)

    Ji, Changyoon; Hong, Taehoon; Jeong, Kwangbok; Leigh, Seung-Bok

    2014-01-01

    In this study, a model that is capable of evaluating the environmental benefits of a new elementary school facility was developed. The model is composed of three steps: (i) retrieval of elementary school facilities having similar characteristics as the new elementary school facility using case-based reasoning; (ii) creation of energy consumption and material data for the benchmark elementary school facility using the retrieved similar elementary school facilities; and (iii) evaluation of the environmental benefits of the new elementary school facility by assessing and comparing the environmental impact of the new and created benchmark elementary school facility using life cycle assessment. The developed model can present the environmental benefits of a new elementary school facility in terms of monetary values using Environmental Priority Strategy 2000, a damage-oriented life cycle impact assessment method. The developed model can be used for the following: (i) as criteria for a green-building rating system; (ii) as criteria for setting the support plan and size, such as the government's incentives for promoting green-building projects; and (iii) as criteria for determining the feasibility of green building projects in key business sectors. Copyright © 2013 Elsevier Ltd. All rights reserved.
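Step (i) of the model above, case-based retrieval of similar facilities, can be sketched with a weighted attribute-similarity score. The attributes, weights and the three cases below are hypothetical, since the record does not list the retrieval features actually used.

```python
def similarity(case, query, weights):
    """Weighted attribute similarity in [0, 1]: 1 minus normalized distance."""
    score = sum(w * (1 - abs(case[k] - query[k]) / max(case[k], query[k]))
                for k, w in weights.items())
    return score / sum(weights.values())

# hypothetical school attributes: floor area (m^2) and number of students,
# with recorded annual energy use as the value we want to benchmark
cases = [
    {"area": 5200.0, "students": 620, "energy_mwh": 410.0},
    {"area": 7600.0, "students": 900, "energy_mwh": 575.0},
    {"area": 4900.0, "students": 540, "energy_mwh": 380.0},
]
query = {"area": 5000.0, "students": 600}    # the new elementary school
weights = {"area": 0.6, "students": 0.4}     # illustrative attribute weights
best = max(cases, key=lambda c: similarity(c, query, weights))
print(best["energy_mwh"])  # → 410.0
```

In the full model, the retrieved cases' energy and material data would define the benchmark facility whose life-cycle impact is compared with the new design's.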

  8. The Analytical Repository Source-Term (AREST) model: Description and documentation

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.

    1987-10-01

    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs

  9. A new long-term care facilities model in nova scotia, Canada: protocol for a mixed methods study of care by design.

    Science.gov (United States)

    Marshall, Emily Gard; Boudreau, Michelle Anne; Jensen, Jan L; Edgecombe, Nancy; Clarke, Barry; Burge, Frederick; Archibald, Greg; Taylor, Anthony; Andrew, Melissa K

    2013-11-29

    Prior to the implementation of a new model of care in long-term care facilities in the Capital District Health Authority, Halifax, Nova Scotia, residents entering long-term care were responsible for finding their own family physician. As a result, care was provided by many family physicians responsible for a few residents leading to care coordination and continuity challenges. In 2009, Capital District Health Authority (CDHA) implemented a new model of long-term care called "Care by Design" which includes: a dedicated family physician per floor, 24/7 on-call physician coverage, implementation of a standardized geriatric assessment tool, and an interdisciplinary team approach to care. In addition, a new Emergency Health Services program was implemented shortly after, in which specially trained paramedics dedicated to long-term care responses are able to address urgent care needs. These changes were implemented to improve primary and emergency care for vulnerable residents. Here we describe a comprehensive mixed methods research study designed to assess the impact of these programs on care delivery and resident outcomes. The results of this research will be important to guide primary care policy for long-term care. We aim to evaluate the impact of introducing a new model of a dedicated primary care physician and team approach to long-term care facilities in the CDHA using a mixed methods approach. As a mixed methods study, the quantitative and qualitative data findings will inform each other. Quantitatively we will measure a number of indicators of care in CDHA long-term care facilities pre and post-implementation of the new model. In the qualitative phase of the study we will explore the experience under the new model from the perspectives of stakeholders including family doctors, nurses, administration and staff as well as residents and family members. 
The proposed mixed method study seeks to evaluate and make policy recommendations related to primary care in long-term

  10. Data modeling and evaluation

    International Nuclear Information System (INIS)

    Bauge, E.; Hilaire, S.

    2006-01-01

    This lecture is devoted to the nuclear data evaluation process, during which the current knowledge (experimental or theoretical) of nuclear reactions is condensed and synthesised into a computer file (the evaluated data file) that application codes can process and use for simulation calculations. After an overview of the content of evaluated nuclear data files, we describe the different methods used for evaluating nuclear data. We specifically focus on the model-based approach, which we use to evaluate data in the continuum region. A few examples, drawn from the day-to-day practice of data evaluation, will illustrate this lecture. Finally, we will discuss the most likely perspectives for improvement of the evaluation process in the next decade. (author)

  11. Evaluation of HVS models in the application of medical image quality assessment

    Science.gov (United States)

    Zhang, L.; Cavaro-Menard, C.; Le Callet, P.

    2012-03-01

    In this study, four of the most widely used Human Visual System (HVS) models are applied to Magnetic Resonance (MR) images for a signal detection task. Their performances are evaluated against a gold standard derived from radiologists' majority decision. Task-based image quality assessment requires taking into account the specificities of human perception, for which various HVS models have been proposed. However, to our knowledge, no work has been conducted to evaluate and compare the suitability of these models for the assessment of medical image quality. This pioneering study investigates the performances of different HVS models on medical images in terms of approximation to radiologist performance. We propose to score the performance of each HVS model using the AUC (Area Under the receiver operating characteristic Curve) and its variance estimate as the figure of merit. The radiologists' majority decision is used as the gold standard so that the estimated AUC measures the distance between the HVS model and radiologist perception. To calculate the variance estimate of the AUC, we adopted the one-shot method, which is independent of the HVS model's output range. The results of this study will help to justify the use of a particular HVS model in our future medical image quality assessment metric.
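The AUC figure of merit can be computed directly as a Mann-Whitney statistic. The record's "one-shot" variance method is not specified in detail, so the sketch below substitutes the classical Hanley-McNeil variance estimate purely for illustration, with made-up detection scores:

```python
def auc_mann_whitney(pos, neg):
    """AUC as the probability a positive score exceeds a negative one (ties = 1/2)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def auc_variance_hanley_mcneil(a, n_pos, n_neg):
    """Hanley-McNeil (1982) variance estimate for an observed AUC value a."""
    q1 = a / (2.0 - a)
    q2 = 2.0 * a * a / (1.0 + a)
    return (a * (1 - a) + (n_pos - 1) * (q1 - a * a)
            + (n_neg - 1) * (q2 - a * a)) / (n_pos * n_neg)

# hypothetical HVS-model scores for signal-present (pos) and signal-absent (neg) trials
pos = [0.9, 0.8, 0.7, 0.6, 0.55]
neg = [0.5, 0.4, 0.65, 0.3, 0.2]
a = auc_mann_whitney(pos, neg)
print(a, auc_variance_hanley_mcneil(a, len(pos), len(neg)))
```

With the radiologists' majority decision as labels, an HVS model whose AUC is closer to the radiologists' own AUC would be preferred.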

  12. Long-term earthquake forecasts based on the epidemic-type aftershock sequence (ETAS) model for short-term clustering

    Directory of Open Access Journals (Sweden)

    Jiancang Zhuang

    2012-07-01

    Full Text Available Based on the ETAS (epidemic-type aftershock sequence) model, which is used for describing the features of short-term clustering of earthquake occurrence, this paper presents some theories and techniques related to evaluating the probability distribution of the maximum magnitude in a given space-time window, where the Gutenberg-Richter law for earthquake magnitude distribution cannot be directly applied. It is seen that the distribution of the maximum magnitude in a given space-time volume is determined in the long term by the background seismicity rate and the magnitude distribution of the largest events in each earthquake cluster. The techniques introduced were applied to the seismicity in the Japan region in the period from 1926 to 2009. It was found that the regions most likely to have big earthquakes are along the Tohoku (northeastern Japan) Arc and the Kuril Arc, both with much higher probabilities than the offshore Nankai and Tokai regions.
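As a baseline for the quantity evaluated above: if magnitudes followed a Gutenberg-Richter law with b-value b and cutoff m0, and events in the space-time window were independent with expected count lam, the maximum-magnitude distribution would have the closed form P(Mmax ≤ m) = exp(-lam·10^(-b(m-m0))). The sketch shows only that independent-event baseline; the ETAS-based results in the paper correct it for short-term clustering. All numbers are illustrative.

```python
import math

def p_max_leq(m, lam, b=1.0, m0=4.0):
    """P(no event above magnitude m) for an expected count lam of
    independent events with Gutenberg-Richter magnitudes above m0."""
    return math.exp(-lam * 10.0 ** (-b * (m - m0)))

# Chance of at least one M >= 7 event given 500 expected M >= 4 events:
print(1.0 - p_max_leq(7.0, lam=500.0))  # ≈ 0.39 (= 1 - exp(-0.5))
```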

  13. A simple rainfall-runoff model for the single and long term hydrological performance of green roofs

    DEFF Research Database (Denmark)

    Locatelli, Luca; Mark, Ole; Mikkelsen, Peter Steen

    Green roofs are being widely implemented for storm water control and runoff reduction. There is a need for incorporating green roofs into urban drainage models in order to evaluate their impact. These models must have low computational costs and fine time resolution. This paper aims to develop...... a model of green roof hydrological performance. A simple conceptual model for the long-term and single-event hydrological performance of green roofs is shown to be capable of reproducing observed runoff measurements. The model has surface and subsurface storage components representing the overall retention...... capacity of the green roof. The runoff from the system is described by the non-linear reservoir method and the storage capacity of the green roof is continuously re-established by evapotranspiration. Runoff data from a green roof in Denmark are collected and used for parameter calibration....
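A minimal sketch of the conceptual structure described above: one lumped storage filled by rainfall, depleted by evapotranspiration, with outflow given by a non-linear reservoir (Q = k·S^n) and overflow once the retention capacity is exceeded. All parameter values and units are invented for illustration; they are not the calibrated values from the Danish roof data.

```python
def simulate(rain, et=0.1, cap=20.0, k=0.05, n=1.5, dt=1.0):
    """Runoff series (mm per step) for a rainfall series (mm per step)."""
    storage, runoff = 0.0, []
    for p in rain:
        storage = max(0.0, storage + (p - et) * dt)  # ET re-establishes retention capacity
        spill = max(0.0, storage - cap)              # overflow once storage is full
        storage -= spill
        q = k * storage ** n * dt                    # non-linear reservoir outflow
        storage -= q
        runoff.append(q + spill)
    return runoff

# Ten wet steps followed by ten dry steps:
event = simulate([5.0] * 10 + [0.0] * 10)
print(round(sum(event), 2))  # total runoff stays below the 50 mm of rain (retention + ET)
```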

  14. A retrospective evaluation of term infants treated with surfactant therapy

    Directory of Open Access Journals (Sweden)

    Özge Sürmeli-Onay

    2015-04-01

    Full Text Available Aim: To investigate the clinical and therapeutic characteristics and outcomes of term infants who received surfactant therapy (ST) for severe respiratory failure in our neonatal intensive care unit (NICU). Methods: The medical records of term infants (gestational age ≥ 37 0/7 weeks) who received ST between 2003 and 2012 in the NICU of Hacettepe University Ihsan Dogramaci Children’s Hospital were evaluated retrospectively. Results: During the ten-year period, 32 term infants received ST; the mean gestational age was 38.1 ± 0.88 wk and the mean birth weight was 2,936 ± 665 g. The underlying lung diseases were severe congenital pneumonia (CP) in 13 (40.6%), acute respiratory distress syndrome (ARDS) in 5 (15.6%), meconium aspiration syndrome (MAS) in 5 (15.6%), congenital diaphragmatic hernia (CDH) in 4 (12.5%), respiratory distress syndrome in 3 (9.4%) and pulmonary hemorrhage in 2 (6.3%) infants. The median time of the first dose of ST was 7.75 (0.5-216) hours. Pulmonary hypertension accompanied the primary lung disease in 9 (28.1%) infants. Mortality rate was 25%. Conclusion: In term infants, CP, ARDS and MAS were the main causes of respiratory failure requiring ST. However, further prospective studies are needed for defining optimal strategies of ST in term infants with respiratory failure.

  15. A review of typhoid fever transmission dynamic models and economic evaluations of vaccination.

    Science.gov (United States)

    Watson, Conall H; Edmunds, W John

    2015-06-19

    Despite a recommendation by the World Health Organization (WHO) that typhoid vaccines be considered for the control of endemic disease and outbreaks, programmatic use remains limited. Transmission models and economic evaluation may be informative in decision making about vaccine programme introductions and their role alongside other control measures. A literature search found few typhoid transmission models or economic evaluations relative to analyses of other infectious diseases of similar or lower health burden. Modelling suggests vaccines alone are unlikely to eliminate endemic disease in the short to medium term without measures to reduce transmission from asymptomatic carriage. The single identified data-fitted transmission model of typhoid vaccination suggests vaccines can reduce disease burden substantially when introduced programmatically but that indirect protection depends on the relative contribution of carriage to transmission in a given setting. This is an important source of epidemiological uncertainty, alongside the extent and nature of natural immunity. Economic evaluations suggest that typhoid vaccination can be cost-saving to health services if incidence is extremely high and cost-effective in other high-incidence situations, when compared to WHO norms. Targeting vaccination to the highest incidence age-groups is likely to improve cost-effectiveness substantially. Economic perspective and vaccine costs substantially affect estimates, with disease incidence, case-fatality rates, and vaccine efficacy over time also important determinants of cost-effectiveness and sources of uncertainty. Static economic models may under-estimate benefits of typhoid vaccination by omitting indirect protection. Typhoid fever transmission models currently require per-setting epidemiological parameterisation to inform their use in economic evaluation, which may limit their generalisability. 
We found no economic evaluation based on transmission dynamic modelling, and no

  16. Evaluation of a multiple linear regression model and SARIMA model in forecasting heat demand for district heating system

    International Nuclear Information System (INIS)

    Fang, Tingting; Lahdelma, Risto

    2016-01-01

    Highlights: • Social factors are considered for the linear regression models besides weather variables. • All the coefficients of the linear regression models are optimized simultaneously. • SARIMA combined with linear regression is used to forecast the heat demand. • The accuracy of both linear regression and time series models is evaluated. - Abstract: Forecasting heat demand is necessary for production and operation planning of district heating (DH) systems. In this study we first propose a simple regression model where the hourly outdoor temperature and wind speed forecast the heat demand. The weekly rhythm of heat consumption is added to the model as a social component to significantly improve the accuracy. The other type of model is the seasonal autoregressive integrated moving average (SARIMA) model with exogenous variables, which combines weather factors as exogenous inputs with the historical heat consumption data as the dependent variable. One outstanding advantage of this model is that it pursues high accuracy for both long-term and short-term forecasts by considering both exogenous factors and the time series. The forecasting performance of both the linear regression models and the time series model is evaluated on real-life heat demand data for the city of Espoo in Finland by out-of-sample tests for the last 20 full weeks of the year. The results indicate that the proposed linear regression model (T168h), using a 168-h demand pattern with midweek holidays classified as Saturdays or Sundays, gives the highest accuracy and strong robustness among all the tested models for the tested forecasting horizon and corresponding data. Considering the parsimony of the input, the ease of use and the high accuracy, the proposed T168h model is the best in practice. The heat demand forecasting model can also be developed for individual buildings if automated meter reading customer measurements are available. This would allow forecasting the heat demand based on more accurate heat consumption
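The weekly-rhythm idea behind the T168h model can be sketched as a temperature regression plus a 168-value hour-of-week pattern. This simplified version fits the two parts sequentially (slope first, then per-hour means of the residuals) rather than optimizing all coefficients simultaneously as the paper does; the data are synthetic, and wind speed is omitted.

```python
import math

def fit_t168h(temps, demand):
    """Fit demand ≈ beta*temp + pattern[hour % 168] in two steps."""
    n = len(demand)
    tbar, dbar = sum(temps) / n, sum(demand) / n
    beta = (sum((t - tbar) * (d - dbar) for t, d in zip(temps, demand))
            / sum((t - tbar) ** 2 for t in temps))
    resid = [d - beta * t for t, d in zip(temps, demand)]
    pattern, counts = [0.0] * 168, [0] * 168
    for h, r in enumerate(resid):          # average residual per hour of week
        pattern[h % 168] += r
        counts[h % 168] += 1
    return beta, [p / c for p, c in zip(pattern, counts)]

def predict(hour, temp, beta, pattern):
    return beta * temp + pattern[hour % 168]

# Synthetic two-week hourly series obeying demand = 100 - 2 * temperature:
temps = [10.0 + 5.0 * math.sin(i) for i in range(336)]
demand = [100.0 - 2.0 * t for t in temps]
beta, pattern = fit_t168h(temps, demand)
print(round(beta, 6), round(predict(5, 12.0, beta, pattern), 6))  # → -2.0 76.0
```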

  17. Educational Evaluators--A Model for Task Oriented Position Development

    Science.gov (United States)

    Rice, David; and others

    1970-01-01

    An outline of 44 evaluator tasks is discussed in terms of its usefulness in defining, evaluating, and improving the position of "educational evaluator"; in adapting the position to the needs of particular institutions; and in designing appropriate evaluator training programs. (JES)

  18. Oligomeric models for estimation of polydimethylsiloxane-water partition ratios with COSMO-RS theory: impact of the combinatorial term on absolute error.

    Science.gov (United States)

    Parnis, J Mark; Mackay, Donald

    2017-03-22

    A series of 12 oligomeric models for polydimethylsiloxane (PDMS) were evaluated for their effectiveness in estimating the PDMS-water partition ratio, K_PDMS-w. Models ranging in size and complexity from the -Si(CH3)2-O- model previously published by Goss in 2011 to octadeca-methyloctasiloxane (CH3-(Si(CH3)2-O-)8CH3) were assessed based on their RMS error with 253 experimental measurements of log K_PDMS-w from six published works. The lowest RMS error for log K_PDMS-w (0.40 in log K) was obtained with the cyclic oligomer, decamethyl-cyclo-penta-siloxane (D5), (-Si(CH3)2-O-)5, with the mixing-entropy associated combinatorial term included in the chemical potential calculation. The presence or absence of terminal methyl groups on linear oligomer models is shown to have significant impact only for oligomers containing 1 or 2 -Si(CH3)2-O- units. Removal of the combinatorial term resulted in a significant increase in the RMS error for most models, with the smallest increase associated with the largest oligomer studied. The importance of inclusion of the combinatorial term in the chemical potential for liquid oligomer models is discussed.
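The figure of merit used above is the RMS error between model-estimated and experimental log K_PDMS-w values; a minimal sketch, with invented value pairs (not the study's 253 measurements):

```python
import math

def rms_error(pred, obs):
    """Root-mean-square error between predicted and observed values."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(pred))

pred = [2.1, 3.4, 1.0, 4.2]  # hypothetical model estimates of log K_PDMS-w
obs  = [2.0, 3.0, 1.3, 4.4]  # hypothetical experimental values
print(round(rms_error(pred, obs), 3))  # → 0.274
```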

  19. Preliminary fiscal evaluation of Alberta oil sands terms

    International Nuclear Information System (INIS)

    Van Meurs, P.

    2007-01-01

    The cost of oil sands projects varies significantly. While costs have escalated considerably over the past few years, oil prices have gone significantly higher. This report provided an economic evaluation of the current fiscal terms applicable to Alberta oil sands. The analysis was done to evaluate the profitability of oil sands projects to investors under current conditions, using the generic royalty regime based on bitumen values. The objective of the royalty review was to determine whether Albertans received a fair share from their oil and gas resources. It discussed the wide variety of oil sands projects in Alberta using five case studies as examples. Cases involving steam assisted gravity drainage (SAGD) operations were assessed for both the Athabasca Mine and Cold Lake. The report provided a discussion of the economic assumptions, including economic cases as well as production, cost and price data. It then provided the preliminary results of the economic-fiscal evaluation from the investor perspective, including profitability indicators; international comparisons; internal rate of return; and net present value. The government perspective was also discussed with reference to attractiveness indicators; royalties as a percentage of bitumen values; and non-discounted and discounted government take. A royalty and tax feature analysis was also provided. Several issues for possible further review were also presented. tabs

  20. GATEWAY Report Brief: SSL Demonstration: Long-Term Evaluation of Indoor Field Performance

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2017-02-28

    Report brief summarizing a GATEWAY program evaluation of the long-term performance characteristics (chromaticity change, maintained illuminance, and operations and maintenance) of LED lighting systems in four field installations previously documented in separate DOE GATEWAY reports.

  1. Evaluation of AirGIS: a GIS-based air pollution and human exposure modelling system

    DEFF Research Database (Denmark)

    Ketzel, Matthias; Berkowicz, Ruwim; Hvidberg, Martin

    2011-01-01

    This study describes in brief the latest extensions of the Danish Geographic Information System (GIS)-based air pollution and human exposure modelling system (AirGIS), which has been developed in Denmark since 2001 and gives results of an evaluation with measured air pollution data. The system...... shows, in general, a good performance for both long-term averages (annual and monthly averages), short-term averages (hourly and daily) as well as when reproducing spatial variation in air pollution concentrations. Some shortcomings and future perspectives of the system are discussed too....

  2. A Long-Term Mathematical Model for Mining Industries

    OpenAIRE

    Achdou , Yves; Giraud , Pierre-Noel; Lasry , Jean-Michel; Lions , Pierre-Louis

    2016-01-01

    A parsimonious long-term model is proposed for a mining industry. Knowing the dynamics of the global reserve, the strategy of each production unit is the solution of an optimal control problem with two controls: first, the flux invested into prospection and the building of new extraction facilities; second, the production rate. In turn, the dynamics of the global reserve depends on the individual strategies of the producers, so the model leads to an equilibrium, which is descr...

  3. Modelled long term trends of surface ozone over South Africa

    CSIR Research Space (South Africa)

    Naidoo, M

    2011-10-01

    Full Text Available timescale seeks to provide a spatially comprehensive view of trends while also creating a baseline for comparisons with future projections of air quality through the forcing of air quality models with modelled predicted long term meteorology. Previous...

  4. Evaluation of Models of the Reading Process.

    Science.gov (United States)

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  5. Long-term evaluation of treatment of chronic, therapeutically refractory tinnitus by neurostimulation

    NARCIS (Netherlands)

    Staal, M. J.; Holm, A. F.; Mooij, J. J. A.; Albers, F. W. J.; Bartels, H.

    2007-01-01

    Objective: Long-term evaluation of treatment of chronic, therapeutically refractory tinnitus by means of chronic electrical stimulation of the vestibulocochlear nerve. Patients: Inclusion criteria were severe, chronic, therapeutically refractory, unilateral tinnitus and severe hearing loss at the

  6. Source term identification in atmospheric modelling via sparse optimization

    Science.gov (United States)

    Adam, Lukas; Branda, Martin; Hamburger, Thomas

    2015-04-01

    Inverse modelling plays an important role in identifying the amount of harmful substances released into the atmosphere during major incidents such as power plant accidents or volcano eruptions. Another possible application of inverse modelling lies in monitoring CO2 emission limits, where only observations at certain places are available and the task is to estimate the total releases at given locations. This gives rise to the problem of minimizing the discrepancy between the observations and the model predictions. There are two standard ways of solving such problems. In the first one, the discrepancy is regularized by adding additional terms. Such terms may include Tikhonov regularization, distance from a priori information or a smoothing term. The resulting, usually quadratic, problem is then solved via standard optimization solvers. The second approach assumes that the error term has a (normal) distribution and makes use of Bayesian modelling to identify the source term. Instead of following the above-mentioned approaches, we utilize techniques from the field of compressive sensing. Such techniques look for the sparsest solution (the solution with the smallest number of nonzeros) of a linear system, where a maximal allowed error term may be added to this system. Even though this is a well-developed field with many possible solution techniques, most of them do not consider even the simplest constraints which are naturally present in atmospheric modelling. One such example is the nonnegativity of release amounts. We believe that the concept of a sparse solution is natural in both the problem of identifying the source location and that of identifying the time process of the source release. In the first case, it is usually assumed that there are only a few release points and the task is to find them. In the second case, the time window is usually much longer than the duration of the actual release. In both cases, the optimal solution should contain a large number of zeros, giving rise to the
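The sparse, nonnegative recovery idea can be sketched with a projected proximal-gradient iteration (nonnegative ISTA) on a toy source-receptor system: minimize 0.5·||Ax - y||² + lam·sum(x) subject to x ≥ 0. The sensitivity matrix and observations below are hypothetical, not the output of any dispersion model, and the solver is a sketch rather than the authors' method.

```python
def sparse_nonneg(A, y, lam=0.01, step=0.01, iters=5000):
    """Nonnegative ISTA for min 0.5*||Ax - y||^2 + lam*sum(x), x >= 0."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step on the smooth part, L1 penalty, projection onto x >= 0
        x = [max(0.0, x[j] - step * (g[j] + lam)) for j in range(n)]
    return x

# Hypothetical sensitivity of 2 samplers to 3 candidate release segments:
A = [[1.0, 0.0, 0.5],
     [0.0, 1.0, 0.5]]
y = [1.0, 0.0]  # observations: only the first segment actually released
x = sparse_nonneg(A, y)
print([round(v, 3) for v in x])  # → [0.99, 0.0, 0.0] (slight L1 shrinkage of the true 1.0)
```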

  7. The Performance of Structure-Controller Coupled Systems Analysis Using Probabilistic Evaluation and Identification Model Approach

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    Full Text Available This study evaluates the performance of a passively controlled steel frame building under dynamic loads using time series analysis. A novel application is utilized for the time and frequency domain evaluation to analyze the behavior of the controlling systems. In addition, autoregressive moving average (ARMA) neural networks are employed to identify the performance of the controller system. Three passive vibration control devices are utilized in this study, namely, the tuned mass damper (TMD), tuned liquid damper (TLD), and tuned liquid column damper (TLCD). The results show that the TMD control system is a more reliable controller than the TLD and TLCD systems in terms of vibration mitigation. The probabilistic evaluation and identification model showed that the probability analysis and the ARMA neural network model are suitable for evaluating and predicting the response of coupled building-controller systems.
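The identification step can be illustrated in miniature with a purely autoregressive model: fit x[t] = a1·x[t-1] + a2·x[t-2] to a response signal by least squares, then use it to predict the response. This AR(2) sketch stands in for the paper's ARMA neural network models; the signal is synthetic, not measured building data.

```python
def fit_ar2(x):
    """Least-squares fit of x[t] = a1*x[t-1] + a2*x[t-2] via 2x2 normal equations."""
    y2, y1, y0 = x[2:], x[1:-1], x[:-2]          # target and the two lagged regressors
    s11 = sum(v * v for v in y1)
    s22 = sum(v * v for v in y0)
    s12 = sum(a * b for a, b in zip(y1, y0))
    b1 = sum(a * b for a, b in zip(y2, y1))
    b2 = sum(a * b for a, b in zip(y2, y0))
    det = s11 * s22 - s12 * s12
    return (b1 * s22 - b2 * s12) / det, (b2 * s11 - b1 * s12) / det

# Synthetic damped structural response obeying x[t] = 1.5*x[t-1] - 0.9*x[t-2]:
x = [1.0, 0.5]
for _ in range(200):
    x.append(1.5 * x[-1] - 0.9 * x[-2])
a1, a2 = fit_ar2(x)
print(round(a1, 3), round(a2, 3))  # → 1.5 -0.9
```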

  8. A model for Long-term Industrial Energy Forecasting (LIEF)

    Energy Technology Data Exchange (ETDEWEB)

    Ross, M. [Lawrence Berkeley Lab., CA (United States)]|[Michigan Univ., Ann Arbor, MI (United States). Dept. of Physics]|[Argonne National Lab., IL (United States). Environmental Assessment and Information Sciences Div.; Hwang, R. [Lawrence Berkeley Lab., CA (United States)

    1992-02-01

    The purpose of this report is to establish the content and structural validity of the Long-term Industrial Energy Forecasting (LIEF) model, and to provide estimates for the model's parameters. The model is intended to provide decision makers with a relatively simple, yet credible tool to forecast the impacts of policies which affect long-term energy demand in the manufacturing sector. Particular strengths of this model are its relative simplicity, which facilitates both ease of use and understanding of results, and the inclusion of relevant causal relationships which provide useful policy handles. The modeling approach of LIEF is intermediate between top-down econometric modeling and bottom-up technology models. It relies on the simple concept that trends in aggregate energy demand depend upon three factors: (1) trends in total production; (2) sectoral or structural shift, that is, changes in the mix of industrial output from energy-intensive to energy non-intensive sectors; and (3) changes in real energy intensity due to technical change and energy-price effects, as measured by the amount of energy used per unit of manufacturing output (kBtu per constant $ of output). The manufacturing sector is first disaggregated into subsectors according to historic output growth rates, energy intensities and recycling opportunities. Exogenous, macroeconomic forecasts of individual subsector growth rates and energy prices can then be combined with endogenous forecasts of real energy intensity trends to yield forecasts of overall energy demand. 75 refs.
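The three-factor decomposition described above reduces to simple arithmetic: aggregate demand is total production times the output-share-weighted sum of subsector energy intensities, so a structural shift changes demand even with every intensity held fixed. The subsector figures below are invented for illustration, not LIEF parameter estimates.

```python
# Output share and real energy intensity (kBtu per constant $) per subsector:
subsectors = {
    "steel":     (0.10, 20.0),  # energy-intensive
    "chemicals": (0.15, 12.0),
    "assembly":  (0.75, 2.0),   # energy non-intensive
}
total_output = 1000.0  # constant $

demand = total_output * sum(s * i for s, i in subsectors.values())
print(demand)  # ≈ 5300 kBtu

# Structural shift: 5 points of output move from steel to assembly,
# lowering aggregate demand with no change in any intensity:
subsectors["steel"] = (0.05, 20.0)
subsectors["assembly"] = (0.80, 2.0)
shifted = total_output * sum(s * i for s, i in subsectors.values())
print(shifted)  # ≈ 4400 kBtu
```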

  9. Mobility Models for Systems Evaluation

    Science.gov (United States)

    Musolesi, Mirco; Mascolo, Cecilia

    Mobility models are used to simulate and evaluate the performance of mobile wireless systems and of the algorithms and protocols underlying them. The definition of realistic mobility models is one of the most critical and, at the same time, difficult aspects of the simulation of applications and systems designed for mobile environments. There are essentially two types of mobility patterns that can be used to evaluate mobile network protocols and algorithms by means of simulations: traces and synthetic models [130]. Traces are obtained by means of measurements of deployed systems and usually consist of logs of connectivity or location information, whereas synthetic models are mathematical models, such as sets of equations, which try to capture the movement of the devices.
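One widely used synthetic model (not named in the excerpt above) is random waypoint: each node repeatedly picks a uniformly random destination in the simulation area and moves toward it at a random speed. A minimal single-node sketch with generic, assumed parameters:

```python
import random

def random_waypoint(steps, area=(1000.0, 1000.0), vmin=1.0, vmax=5.0, seed=0):
    """Positions of one node moving between uniformly random waypoints."""
    rng = random.Random(seed)
    x, y = rng.uniform(0, area[0]), rng.uniform(0, area[1])
    trace = [(x, y)]
    while len(trace) < steps:
        dest = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
        speed = rng.uniform(vmin, vmax)
        dx, dy = dest[0] - x, dest[1] - y
        n = max(1, int(((dx * dx + dy * dy) ** 0.5) / speed))  # time steps to reach dest
        for i in range(1, n + 1):
            trace.append((x + dx * i / n, y + dy * i / n))
            if len(trace) == steps:
                return trace
        x, y = dest
    return trace

trace = random_waypoint(500)
print(len(trace))  # → 500
```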

  10. Swarm Intelligence-Based Hybrid Models for Short-Term Power Load Prediction

    Directory of Open Access Journals (Sweden)

    Jianzhou Wang

    2014-01-01

    Full Text Available Swarm intelligence (SI) is widely and successfully applied in the engineering field to solve practical optimization problems, and various hybrid models based on SI algorithms and statistical models have been developed to further improve predictive abilities. In this paper, hybrid intelligent forecasting models based on cuckoo search (CS) as well as singular spectrum analysis (SSA), time series, and machine learning methods are proposed to conduct short-term power load prediction. The forecasting performance of the proposed models is augmented by a rolling multistep strategy over the prediction horizon. The test results demonstrate the effectiveness of the SSA and CS in tuning the seasonal autoregressive integrated moving average (SARIMA) and support vector regression (SVR) models to improve load forecasting, which indicates that both the SSA-based data denoising and the SI-based intelligent optimization strategy can effectively improve the model's predictive performance. Additionally, the proposed CS-SSA-SARIMA and CS-SSA-SVR models provide very impressive forecasting results, demonstrating their strong robustness and universal forecasting capacities in terms of short-term power load prediction 24 hours in advance.

  11. Evaluation of the LMFBR cover gas source term and synthesis of the associated R and D

    International Nuclear Information System (INIS)

    Balard, F.; Carluec, B.

    1996-01-01

    At the end of the seventies and the beginning of the eighties, there appeared a pressing need for experimental results to assess the LMFBR's safety level. Because of the urgency, analytical studies were not systematically undertaken, and maximum credible cover gas instantaneous source terms (the fraction of the core radionuclide inventory released) were obtained directly from crude interpretations of out-of-pile experiments. Two types of studies and mock-ups were undertaken depending on the timescale of the phenomena: instantaneous source terms (corresponding to an unlikely energetic core disruptive accident, CDA), and delayed ones (tens of minutes to some hours). The experiments performed in this frame are reviewed in this presentation: 1) instantaneous source term: - FAUST experiments: I, Cs, UO2 source terms (FzK, Germany), - FAST experiments: pool depth influence on non-volatile source term (USA), - CARAVELLE experiments: non-volatile source term in SPX1 geometry (CEA, France); 2) delayed source term: - NALA experiments: I, Cs, Sr, UO2 source term (FzK, Germany), - PAVE experiments: I source term (CEA, France), - NACOWA experiments: cover gas aerosol enrichment in I and Cs (FzK, Germany), - other French experiments in the COPACABANA and GULLIVER facilities. The release of volatile fission products is tightly bound to sodium evaporation, and a large part of the fission products is dissolved in the liquid sodium aerosols present in the cover gas. Thus knowledge of the amount of aerosol released to the cover gas is important for the evaluation of the source term. The maximum credible cover gas instantaneous source terms deduced from the experiments have led to conservative source terms being taken into account in safety analysis. Nevertheless, modelling attempts of the observed (in-pile or out-of-pile) physico-chemical phenomena have been undertaken for extrapolation to the reactor case. The main topics of this theoretical research are as follows: fission products evaporation in the cover gas (Fz

  12. Modeling flood events for long-term stability

    International Nuclear Information System (INIS)

    Schruben, T.; Portillo, R.

    1985-01-01

    The primary objective for the disposal of uranium mill tailings in the Uranium Mill Tailings Remedial Action (UMTRA) Project is isolation and stabilization to prevent their misuse by man and dispersal by natural forces such as wind, rain, and flood waters (40 CFR 192). Stabilization of sites that are located in or near flood plains presents unique problems in designing for long-term performance. This paper discusses the selection and hydrologic modeling of the design flood event, and the hydraulic modeling of that event with geomorphic considerations. The Gunnison, Colorado, and Riverton, Wyoming, sites will be used as examples in describing the process

  13. Bioadhesive agents in addition to oral contrast media - evaluation in an animal model

    International Nuclear Information System (INIS)

    Conrad, R.; Schneider, G.; Textor, J.; Schild, H.H.; Fimmers, R.

    1998-01-01

    Purpose: To evaluate the additional effect of bioadhesives in combination with iotrolan and barium as oral contrast media in an animal model. Method: The bioadhesives Noveon, CMC, Tylose and Carbopol 934 were added to iotrolan and barium. The solutions were administered to rabbits by a feeding tube. The animals were investigated by computed tomography (CT) and radiography after 0.5, 4, 12, 24 and, in part, 48 hours. Mucosal coating and contrast filling of the bowel were evaluated. Results: Addition of bioadhesives to oral contrast media produced long-term contrast in the small intestine and colon, but no improvement in continuous filling and coating of the gastrointestinal tract was detected. Mucosal coating was seen only in short regions of the caecum and small intestine. In CT the best results for coating were observed with tylose and CMC, in radiography additionally with carbopol and noveon. All contrast medium solutions were well tolerated. Conclusion: The evaluated contrast medium solutions with bioadhesives showed long-term contrast but no improvement in coating in comparison to conventional oral contrast media. (orig.) [de]

  14. Modelling the long-term deployment of electricity storage in the global energy system

    International Nuclear Information System (INIS)

    Despres, Jacques

    2015-01-01

    The current development of wind and solar power sources calls for an improvement of long-term energy models. Indeed, high shares of variable wind and solar production have short- and long-term impacts on the power system, requiring the development of flexibility options: fast-reacting power plants, demand response, grid enhancement or electricity storage. Our first main contribution is the modelling of electricity storage and grid expansion in the POLES model (Prospective Outlook on Long-term Energy Systems). We set up new investment mechanisms, where storage development is based on several combined economic values. After categorising the long-term energy models and the power sector modelling tools in a common typology, we showed the need for a better integration of both approaches. Therefore, the second major contribution of our work is the yearly coupling of POLES to a short-term optimization of the power sector operation, with the European Unit Commitment and Dispatch model (EUCAD). The two-way data exchange allows the long-term coherent scenarios of POLES to be directly backed by the short-term technical detail of EUCAD. Our results forecast a strong and rather quick development of the cheapest flexibility options: grid interconnections, pumped hydro storage and demand response programs, including electric vehicle charging optimisation and vehicle-to-grid storage. The more expensive battery storage presumably finds enough system value in the second half of the century. A sensitivity analysis shows that lowering the fixed costs of batteries affects investment more than improving their efficiency does. We also show the explicit dependency between storage and variable renewable energy sources. (author) [fr]

  15. Evaluating topic models with stability

    CSIR Research Space (South Africa)

    De Waal, A

    2008-11-01

    Full Text Available Topic models are unsupervised techniques that extract likely topics from text corpora, by creating probabilistic word-topic and topic-document associations. Evaluation of topic models is a challenge because (a) topic models are often employed...

  16. Long-Term Prediction of Emergency Department Revenue and Visitor Volume Using Autoregressive Integrated Moving Average Model

    Directory of Open Access Journals (Sweden)

    Chieh-Fan Chen

    2011-01-01

    Full Text Available This study analyzed meteorological, clinical and economic factors in terms of their effects on monthly ED revenue and visitor volume. Monthly data from January 1, 2005 to September 30, 2009 were analyzed. Spearman correlation and cross-correlation analyses were performed to identify the correlations between each independent variable, ED revenue, and visitor volume. An autoregressive integrated moving average (ARIMA) model was used to quantify the relationship between each independent variable, ED revenue, and visitor volume. The accuracies were evaluated by comparing model forecasts to actual values with mean absolute percentage error. Sensitivity of prediction errors to model training time was also evaluated. The ARIMA models indicated that mean maximum temperature, relative humidity, rainfall, non-trauma, and trauma visits may correlate positively with ED revenue, but mean minimum temperature may correlate negatively with ED revenue. Moreover, mean minimum temperature and stock market index fluctuation may correlate positively with trauma visitor volume. Mean maximum temperature, relative humidity and stock market index fluctuation may correlate positively with non-trauma visitor volume. Mean maximum temperature and relative humidity may correlate positively with pediatric visitor volume, but mean minimum temperature may correlate negatively with pediatric visitor volume. The model also performed well in forecasting revenue and visitor volume.
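The accuracy measure used above, mean absolute percentage error (MAPE), compares model forecasts to actual values; a minimal sketch with invented monthly figures, not data from the study:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)

actual   = [120.0, 150.0, 90.0, 200.0]  # e.g. observed monthly ED revenue (arbitrary units)
forecast = [110.0, 160.0, 95.0, 190.0]  # model forecasts for the same months
print(round(mape(actual, forecast), 2))  # → 6.39
```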

  17. Development of a dispatch model of the European power system for coupling with a long-term foresight energy model

    International Nuclear Information System (INIS)

    Despres, Jacques

    2015-12-01

    Renewable sources of electricity production are strongly increasing in many parts of the world. The production costs are going down quickly, thus accelerating the deployment of new solar and wind electricity generation. In the long-term, these variable sources of electricity could represent a high share of the power system. However, long-term foresight energy models have difficulties describing precisely the integration challenges of Variable Renewable Energy Sources (VRES) such as wind or solar. They just do not represent the short-term technical constraints of the power sector. The objective of this paper is to show a new approach of the representation of the challenges of variability in the long-term foresight energy model POLES (Prospective Outlook on Long-term Energy Systems). We develop a short-term optimization model for the power sector operation, EUCAD (European Unit Commitment and Dispatch) and we couple it to POLES year after year. The direct coupling, with bi-directional exchanges of information, brings technical precision to the long-term coherence of energy scenarios. (author)

  18. Solutions of several coupled discrete models in terms of Lamé ...

    Indian Academy of Sciences (India)

    The models discussed are: the coupled Salerno model, the coupled Ablowitz–Ladik model, the coupled φ4 model and the coupled φ6 model. In all these cases we show that the coefficients of the Lamé polynomials are such that the Lamé polynomials can be re-expressed in terms of Chebyshev polynomials of the relevant Jacobi elliptic ...

  19. The use of nonlinear regression analysis for integrating pollutant concentration measurements with atmospheric dispersion modeling for source term estimation

    International Nuclear Information System (INIS)

    Edwards, L.L.; Freis, R.P.; Peters, L.G.; Gudiksen, P.H.; Pitovranov, S.E.

    1993-01-01

    The accuracy associated with assessing the environmental consequences of an accidental release of radioactivity is highly dependent on knowledge of the source term characteristics, which are generally poorly known. The development of an automated numerical technique that integrates radiological measurements with atmospheric dispersion modeling for more accurate source term estimation is reported. Often, this process of parameter estimation is performed by an emergency response assessor, who takes an intelligent first guess at the model parameters and then, comparing the model results with whatever measurements are available, makes an intuitive, informed next guess of the model parameters. This process may be repeated any number of times until the assessor feels that the model results are reasonable in terms of the measured observations. A new approach, based on a nonlinear least-squares regression scheme coupled with the existing Atmospheric Release Advisory Capability three-dimensional atmospheric dispersion models, is to supplement the assessor's intuition with automated mathematical methods that do not significantly increase the response time of the existing predictive models. The viability of the approach is evaluated by estimation of the known SF6 tracer release rates associated with the Mesoscale Atmospheric Transport Studies tracer experiments conducted at the Savannah River Laboratory during 1983. These 19 experiments resulted in 14 successful, separate tracer releases with sampling of the tracer plumes along the cross-plume arc situated ∼30 km from the release site.
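The core idea of fitting a release rate to sampler measurements can be sketched without the full 3-D dispersion model. For a passive tracer under fixed meteorology the modelled concentration is linear in the release rate Q, so the least-squares estimate has a closed form; the regression in the paper is nonlinear only because other source parameters (location, timing) enter nonlinearly. The sampler responses and the release rate below are invented, not values from the SF6 experiments.

```python
# Hedged sketch: least-squares estimate of a scalar release rate Q from
# sampler measurements, assuming the dispersion-model response is linear
# in Q. For a unit release the model predicts concentration m_i at
# sampler i; measurements are c_i. Minimising sum_i (c_i - Q*m_i)^2
# gives the closed form below. All values are invented for illustration.

def estimate_release_rate(measured, unit_response):
    """Least-squares Q_hat = sum(c_i * m_i) / sum(m_i^2)."""
    num = sum(c * m for c, m in zip(measured, unit_response))
    den = sum(m * m for m in unit_response)
    return num / den

unit_response = [0.8, 1.5, 2.0, 1.2]            # model concentration per unit Q
true_q = 3.0                                    # hypothetical true release rate
measured = [true_q * m for m in unit_response]  # noise-free for the demo
print(round(estimate_release_rate(measured, unit_response), 6))  # ≈ 3.0
```

With noisy measurements the same formula returns the best-fit Q in the least-squares sense; iterating such fits over the nonlinear parameters is the essence of the automated scheme described above.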

  20. Model-Based Approach to the Evaluation of Task Complexity in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ham, Dong Han

    2007-02-01

    This study developed a model-based method for evaluating task complexity and examined ways of evaluating the complexity of tasks designed for abnormal situations and daily task situations in NPPs. The main results of this study can be summarised as follows. First, this study developed a conceptual framework for studying complexity factors and a model of complexity factors that classifies complexity factors according to the types of knowledge that human operators use. Second, this study developed a more practical model of task complexity factors and identified twenty-one complexity factors based on the model. The model emphasizes that a task is a system to be designed and that its complexity has several dimensions. Third, we developed a method of identifying task complexity factors and evaluating task complexity qualitatively based on the developed model of task complexity factors. This method can be widely used in various task situations. Fourth, this study examined the applicability of TACOM to abnormal situations and daily task situations, such as maintenance, and confirmed that it can reasonably be used in those situations. Fifth, we developed application examples to demonstrate the use of the theoretical results of this study. Lastly, this study reinterpreted well-known principles for designing information displays in NPPs in terms of task complexity and suggested an analytical way of evaluating the conceptual design of displays by using the concept of task complexity. All of the results of this study will be used as a basis when evaluating the complexity of tasks designed on procedures or information displays and when designing ways of improving human performance in NPPs.

  1. Re-evaluating neonatal-age models for ungulates: does model choice affect survival estimates?

    Directory of Open Access Journals (Sweden)

    Troy W Grovenburg

    New-hoof growth is regarded as the most reliable metric for predicting age of newborn ungulates, but variation in estimated age among the hoof-growth equations that have been developed may affect estimates of survival in staggered-entry models. We used known-age newborns to evaluate variation in age estimates among existing hoof-growth equations and to determine the consequences of that variation on survival estimates. During 2001-2009, we captured and radiocollared 174 newborn (≤24-hrs old) ungulates: 76 white-tailed deer (Odocoileus virginianus) in Minnesota and South Dakota, 61 mule deer (O. hemionus) in California, and 37 pronghorn (Antilocapra americana) in South Dakota. Estimated age of known-age newborns differed among hoof-growth models and varied by >15 days for white-tailed deer, >20 days for mule deer, and >10 days for pronghorn. Accuracy (i.e., the proportion of neonates assigned to the correct age) in aging newborns using published equations ranged from 0.0% to 39.4% in white-tailed deer, 0.0% to 3.3% in mule deer, and was 0.0% for pronghorns. Results of survival modeling indicated that variability in estimates of age-at-capture affected short-term estimates of survival (i.e., 30 days) for white-tailed deer and mule deer, and survival estimates over a longer time frame (i.e., 120 days) for mule deer. Conversely, survival estimates for pronghorn were not affected by estimates of age. Our analyses indicate that modeling survival in daily intervals is too fine a temporal scale when age-at-capture is unknown, given the potential inaccuracies among equations used to estimate age of neonates. Instead, weekly survival intervals are more appropriate because most models accurately predicted ages within 1 week of the known age. Variation among results of neonatal-age models on short- and long-term estimates of survival for known-age young emphasizes the importance of selecting an appropriate hoof-growth equation and appropriately defining intervals (i

  2. Evaluating the effect of neighbourhood weight matrices on smoothing properties of Conditional Autoregressive (CAR) models

    Directory of Open Access Journals (Sweden)

    Ryan Louise

    2007-11-01

    Abstract Background The Conditional Autoregressive (CAR) model is widely used in many small-area ecological studies to analyse outcomes measured at an areal level. There has been little evaluation of the influence of different neighbourhood weight matrix structures on the amount of smoothing performed by the CAR model. We examined this issue in detail. Methods We created several neighbourhood weight matrices and applied them to a large dataset of births and birth defects in New South Wales (NSW), Australia, within 198 Statistical Local Areas. Between the years 1995–2003, there were 17,595 geocoded birth defects and 770,638 geocoded birth records with available data. Spatio-temporal models were developed with data from 1995–2000 and their fit evaluated within the following time period: 2001–2003. Results We were able to create four adjacency-based weight matrices, seven distance-based weight matrices and one matrix based on similarity in terms of a key covariate (i.e. maternal age). In terms of agreement between observed and predicted relative risks, categorised in epidemiologically relevant groups, the distance-based matrices generally performed better than the adjacency-based neighbourhoods. In terms of recovering the underlying risk structure, the weight-7 model (smoothing by the maternal-age 'Covariate' model) was able to correctly classify 35/47 high-risk areas (sensitivity 74%) with a specificity of 47%, and the 'Gravity' model had sensitivity and specificity values of 74% and 39% respectively. Conclusion We found considerable differences in the smoothing properties of the CAR model, depending on the type of neighbours specified. This in turn had an effect on the models' ability to recover the observed risk in an area. Prior to risk mapping or ecological modelling, an exploratory analysis of the neighbourhood weight matrix to guide the choice of a suitable weight matrix is recommended. Alternatively, the weight matrix can be chosen a priori.
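The two main families of weight matrices the study compares can be sketched concretely: a binary first-order adjacency matrix and an inverse-distance matrix, both row-standardised as is conventional for CAR models. The four areas, neighbour lists and centroid coordinates below are invented for illustration.

```python
# Sketch of two neighbourhood weight matrices of the kind compared in
# the study: binary first-order adjacency and inverse-distance, both
# row-standardised (each row sums to 1). Areas, adjacency and
# coordinates are invented for illustration.

def row_standardise(w):
    out = []
    for row in w:
        s = sum(row)
        out.append([x / s if s else 0.0 for x in row])
    return out

# 4 hypothetical areas; first-order adjacency given as a neighbour list
neighbours = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
n = len(neighbours)
adj = [[1.0 if j in neighbours[i] else 0.0 for j in range(n)] for i in range(n)]

# centroids for the distance-based alternative
xy = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (2.0, 1.0)]

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

inv_dist = [[0.0 if i == j else 1.0 / dist(xy[i], xy[j]) for j in range(n)]
            for i in range(n)]

W_adj = row_standardise(adj)
W_dst = row_standardise(inv_dist)
print(W_adj[1])  # → [0.5, 0.0, 0.5, 0.0]
```

The covariate-similarity matrix in the study follows the same pattern with weights derived from closeness in maternal age rather than space.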

  3. Orthotopic model of canine osteosarcoma in athymic rats for evaluation of stereotactic radiotherapy.

    Science.gov (United States)

    Schwartz, Anthony L; Custis, James T; Harmon, Joseph F; Powers, Barbara E; Chubb, Laura S; LaRue, Susan M; Ehrhart, Nicole P; Ryan, Stewart D

    2013-03-01

    To develop an orthotopic model of canine osteosarcoma in athymic rats as a model for evaluating the effects of stereotactic radiotherapy (SRT) on osteosarcoma cells. 26 athymic nude rats. 3 experiments were performed. In the first 2 experiments, rats were injected with 1 × 10^6 Abrams canine osteosarcoma cells into the proximal aspect of the tibia (n = 12) or distal aspect of the femur (6). Tumor engraftment and progression were monitored weekly via radiography, luciferase imaging, and measurement of urine pyridinoline concentration for 5 weeks and histologic evaluation after euthanasia. In the third experiment, 8 rats underwent canine osteosarcoma cell injection into the distal aspect of the femur and SRT was administered to the affected area in three 12-Gy fractions delivered on consecutive days (total radiation dose, 36 Gy). Percentage tumor necrosis and urinary pyridinoline concentrations were used to assess local tumor control. The short-term effect of SRT on skin was also evaluated. Tumors developed in 10 of 12 tibial sites and all 14 femoral sites. Administration of SRT to rats with femoral osteosarcoma was feasible and successful. Mean tumor necrosis of 95% was achieved histologically, and minimal adverse skin effects were observed. The orthotopic model of canine osteosarcoma in rats developed in this study was suitable for evaluating the effects of local tumor control and can be used in future studies to evaluate optimization of SRT duration, dose, and fractionation schemes. The model could also allow evaluation of other treatments in combination with SRT, such as chemotherapy or bisphosphonate, radioprotectant, or parathyroid hormone treatment.

  4. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  5. Using long-term ARM observations to evaluate Arctic mixed-phased cloud representation in the GISS ModelE GCM

    Science.gov (United States)

    Lamer, K.; Fridlind, A. M.; Luke, E. P.; Tselioudis, G.; Ackerman, A. S.; Kollias, P.; Clothiaux, E. E.

    2016-12-01

    The presence of supercooled liquid in clouds affects surface radiative and hydrological budgets, especially at high latitudes. Capturing these effects is crucial to properly quantifying climate sensitivity. Currently, a number of GCMs disagree on the distribution of cloud phase. Adding to the challenge is a general lack of observations on the continuum of clouds, from high to low-level and from warm to cold. In the current study, continuous observations from 2011 to 2014 are used to evaluate all clouds produced by the GISS ModelE GCM over the ARM North Slope of Alaska site. The International Satellite Cloud Climatology Project (ISCCP) Global Weather State (GWS) approach reveals that fair-weather (GWS 7, 32% occurrence rate), as well as mid-level storm-related (GWS 5, 28%) and polar (GWS 4, 14%) clouds, dominate the large-scale cloud patterns at this high-latitude site. At higher spatial and temporal resolutions, ground-based cloud radar observations reveal a majority of single-layer cloud vertical structures (CVS). While clear sky and low-level clouds dominate (each with 30% occurrence rate), a fair amount of shallow (∼10%) to deep (∼5%) convection is observed. Cloud radar Doppler spectra are used along with depolarization lidar observations in a neural network approach to detect the presence, layering and inhomogeneity of supercooled liquid layers. Preliminary analyses indicate that most of the low-level clouds sampled contain one or more supercooled liquid layers. Furthermore, the relationship between CVS and the presence of supercooled liquid is established, as is the relationship between the presence of supercooled liquid and precipitation susceptibility. Two approaches are explored to bridge the gap between large-footprint GCM simulations and high-resolution ground-based observations. The first approach consists of comparing model output and ground-based observations that exhibit the same column CVS type (i.e. same cloud depth, height and layering

  6. A model for Long-term Industrial Energy Forecasting (LIEF)

    Energy Technology Data Exchange (ETDEWEB)

    Ross, M. (Lawrence Berkeley Lab., CA (United States) Michigan Univ., Ann Arbor, MI (United States). Dept. of Physics Argonne National Lab., IL (United States). Environmental Assessment and Information Sciences Div.); Hwang, R. (Lawrence Berkeley Lab., CA (United States))

    1992-02-01

    The purpose of this report is to establish the content and structural validity of the Long-term Industrial Energy Forecasting (LIEF) model, and to provide estimates for the model's parameters. The model is intended to provide decision makers with a relatively simple, yet credible tool to forecast the impacts of policies which affect long-term energy demand in the manufacturing sector. Particular strengths of this model are its relative simplicity, which facilitates both ease of use and understanding of results, and its inclusion of relevant causal relationships which provide useful policy handles. The modeling approach of LIEF is intermediate between top-down econometric modeling and bottom-up technology models. It relies on the simple concept that trends in aggregate energy demand depend upon three factors: (1) trends in total production; (2) sectoral or structural shift, that is, changes in the mix of industrial output from energy-intensive to energy non-intensive sectors; and (3) changes in real energy intensity due to technical change and energy-price effects, as measured by the amount of energy used per unit of manufacturing output (kBtu per constant $ of output). The manufacturing sector is first disaggregated into subsectors according to their historic output growth rates, energy intensities and recycling opportunities. Exogenous, macroeconomic forecasts of individual subsector growth rates and energy prices can then be combined with endogenous forecasts of real energy intensity trends to yield forecasts of overall energy demand. 75 refs.
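The accounting identity behind the three factors above can be sketched directly: aggregate demand is the sum over subsectors of output times energy intensity, with output growth and intensity decline applied per subsector. The subsector values, growth rates and intensity trends below are invented and are not LIEF parameter estimates.

```python
# Minimal sketch of the LIEF-style accounting identity described above:
# aggregate energy demand = sum over subsectors of (output * energy
# intensity), with exogenous output growth and an autonomous intensity
# trend per subsector. All numbers are invented for illustration.

def forecast_energy(subsectors, years):
    """subsectors: list of (output, intensity, output_growth, intensity_trend)."""
    total = 0.0
    for output, intensity, growth, trend in subsectors:
        future_output = output * (1 + growth) ** years
        future_intensity = intensity * (1 + trend) ** years
        total += future_output * future_intensity
    return total

subsectors = [
    # output ($bn), intensity (kBtu/$), output growth/yr, intensity trend/yr
    (100.0, 8.0, 0.02, -0.010),   # hypothetical energy-intensive subsector
    (200.0, 2.0, 0.03, -0.005),   # hypothetical non-intensive subsector
]
print(forecast_energy(subsectors, 0))           # base-year demand
print(round(forecast_energy(subsectors, 10), 1))  # 10-year forecast
```

Structural shift shows up as the two subsectors growing at different rates, so the energy-intensive share of output, and with it aggregate intensity, changes over the forecast horizon.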

  7. Modelling long-term oil price and extraction with a Hubbert approach: The LOPEX model

    International Nuclear Information System (INIS)

    Rehrl, Tobias; Friedrich, Rainer

    2006-01-01

    The LOPEX (Long-term Oil Price and EXtraction) model generates long-term scenarios about future world oil supply and corresponding price paths up to the year 2100. In order to determine oil production in non-OPEC countries, the model uses Hubbert curves. Hubbert curves reflect the logistic nature of the discovery process and the associated constraint on the temporal availability of oil. Extraction paths and the world oil price path are both derived endogenously from OPEC's intertemporally optimal cartel behaviour. Thereby OPEC is faced with both the price-dependent production of the non-OPEC competitive fringe and price-dependent world oil demand. World oil demand is modelled with a constant-price-elasticity function and refers to a scenario from ACROPOLIS-POLES. LOPEX results indicate a significantly higher oil price from around 2020 onwards compared to the reference scenario, and a stagnating market share of at most 50% to be optimal for OPEC.
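A Hubbert curve of the kind used for non-OPEC supply is the time derivative of a logistic cumulative-extraction curve. The sketch below shows the standard form; the resource size, steepness and peak year are invented, not LOPEX values.

```python
# Hedged sketch of a Hubbert (logistic) production curve: annual
# production is the derivative of a logistic cumulative-extraction
# curve with ultimately recoverable resources `urr`, steepness `k`
# and peak year `t_peak`. Parameter values are invented.

import math

def hubbert_production(t, urr, k, t_peak):
    """Annual production at time t; peaks at urr*k/4 when t == t_peak."""
    e = math.exp(-k * (t - t_peak))
    return urr * k * e / (1.0 + e) ** 2

urr, k, t_peak = 2000.0, 0.05, 2010  # e.g. Gbbl, 1/yr, year (hypothetical)
peak = hubbert_production(t_peak, urr, k, t_peak)
print(round(peak, 2))  # → 25.0, i.e. urr*k/4
```

Production rises and falls symmetrically around the peak year, which encodes the "temporal availability" constraint mentioned in the abstract.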

  8. Viscous cosmological models with a variable cosmological term ...

    African Journals Online (AJOL)

    Einstein's field equations for a Friedmann-Lemaître-Robertson-Walker universe filled with a dissipative fluid with a variable cosmological term Λ described by full Israel-Stewart theory are considered. General solutions to the field equations for the flat case have been obtained. The solution corresponds to the dust free model ...

  9. Phenomenological study on crystalline rock for evaluating of long-term behavior (Contract research)

    International Nuclear Information System (INIS)

    Okubo, Seisuke; Fukui, Katsunori; Hashiba, Kimihiro; Hikima, Ryoichi; Tanno, Takeo; Sanada, Hiroyuki; Matsui, Hiroya; Sato, Toshinori

    2012-02-01

    Rock, under in situ conditions, shows time-dependent behavior such as creep/relaxation. With respect to high-level radioactive waste disposal, knowledge of the long-term mechanical stability of shafts and galleries excavated in rock is required, not only during construction and operation but also over a period of thousands of years after closure. Therefore, it is very important to understand the time-dependent behavior of rock for evaluating long-term mechanical stability. The purpose of this study is to determine the mechanisms of time-dependent behavior of rock by precise testing, observation and measurement in order to develop methods for evaluating the long-term mechanical stability of a rock mass. In previous work, testing techniques were established and basic evaluation methods were developed. Recently, some parameters needed for simulation of time-dependent behavior were determined at the Mizunami underground research facilities. However, sufficient data to check the reliability of the evaluation method for these parameters were not available. This report describes the results of the activities in fiscal year 2010. In Chapter 1, we provide an overview and the background to this study. In Chapter 2, the results of a long-term creep test on Tage tuff, started in fiscal year 1997, are described. In Chapter 3, the relation between loading-rate dependency of strength and stress dependency of creep life, and the relations among time dependency, probability distribution and size effects, are discussed to clarify the meaning of the value of 'n', which expresses the degree of time dependency of the rock. Furthermore, past studies concerning the value of 'n' are reviewed, and the tests that could be carried out in future studies of the mechanical properties and time dependency of Toki granite are considered in this chapter. In Chapter 4, failure criteria of a rock mass considering time dependency are discussed.
In Chapter 5, the FEM analysis implemented with a generalized

  10. Simple model for crop photosynthesis in terms of weather variables ...

    African Journals Online (AJOL)

    A theoretical mathematical model for describing crop photosynthetic rate in terms of the weather variables and crop characteristics is proposed. The model utilizes a series of efficiency parameters, each of which reflect the fraction of possible photosynthetic rate permitted by the different weather elements or crop architecture.

  11. Model for expressing leaf photosynthesis in terms of weather variables

    African Journals Online (AJOL)

    A theoretical mathematical model for describing photosynthesis in individual leaves in terms of weather variables is proposed. The model utilizes a series of efficiency parameters, each of which reflect the fraction of potential photosynthetic rate permitted by the different environmental elements. These parameters are useful ...
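The two records above describe the same multiplicative structure: an actual photosynthetic rate equal to a potential rate times a product of dimensionless efficiency parameters, one per weather element. A minimal sketch, with invented functional values and units:

```python
# Sketch of the multiplicative efficiency-parameter model described in
# the two records above: rate = potential rate * product of dimensionless
# efficiencies in [0, 1], one per weather element or crop/canopy factor.
# The parameter values are invented for illustration.

def photosynthesis_rate(p_max, efficiencies):
    """Actual rate given potential rate p_max and efficiency factors."""
    rate = p_max
    for f in efficiencies:
        rate *= min(max(f, 0.0), 1.0)  # clamp each factor to [0, 1]
    return rate

p_max = 30.0                          # potential rate, e.g. umol CO2 m-2 s-1
f_light, f_temp, f_water = 0.8, 0.9, 0.5   # hypothetical efficiencies
print(round(photosynthesis_rate(p_max, [f_light, f_temp, f_water]), 2))  # → 10.8
```

Because the factors multiply, the most limiting element dominates the result, which is the practical appeal of this formulation for interpreting weather effects.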

  12. MARKET EVALUATION MODEL: TOOL FOR BUSINESS DECISIONS

    OpenAIRE

    Porlles Loarte, José; Yenque Dedios, Julio; Lavado Soto, Aurelio

    2014-01-01

    In the present work, the concepts of potential market and global market are analyzed as the basis for strategic market decisions with a long-term perspective, when the establishment of a business in a certain geographic area is evaluated. On this conceptual frame, a methodological tool is proposed for evaluating a commercial decision, taking as reference the case of the brewing industry in Peru, considering that this industry faces in the region entrepreneurial reorderings withi...

  13. Evaluation of Long Term Behaviour of Polymers for Offshore Oil and Gas Applications

    Directory of Open Access Journals (Sweden)

    Le Gac P.-Y.

    2015-02-01

    Polymers and composites are very attractive for underwater applications, but it is essential to evaluate their long term behaviour in sea water if structural integrity of offshore structures is to be guaranteed. Accelerated test procedures are frequently required, and this paper will present three examples showing how the durability of polymers, in the form of fibres, matrix resins in fibre reinforced composites for structural elements, and thermal insulation coatings of flow-lines, have been evaluated for offshore use. The influence of the ageing medium, temperature, and hydrostatic pressure will be discussed first, then an example of the application of ageing test results to predict long term behavior of the thermal insulation coating of a flowline will be presented.

  14. Evaluating model structure adequacy: The case of the Maggia Valley groundwater system, southern Switzerland

    Science.gov (United States)

    Hill, Mary C.; L. Foglia,; S. W. Mehl,; P. Burlando,

    2013-01-01

    Model adequacy is evaluated with alternative models rated using model selection criteria (AICc, BIC, and KIC) and three other statistics. The model selection criteria are tested with cross-validation experiments, and insights for using alternative models to evaluate model structural adequacy are provided. The study is conducted using the computer codes UCODE_2005 and MMA (MultiModel Analysis). One recharge alternative is simulated using the TOPKAPI hydrological model. The predictions evaluated include eight heads and three flows located where ecological consequences and model precision are of concern. Cross-validation is used to obtain measures of prediction accuracy. Sixty-four models were designed deterministically and differ in their representation of river, recharge, bedrock topography, and hydraulic conductivity. Results include: (1) What may seem like inconsequential choices in model construction may be important to predictions. Analysis of predictions from alternative models is advised. (2) None of the model selection criteria consistently identified the models with more accurate predictions. This is a disturbing result that suggests reconsidering the utility of model selection criteria, and/or of the cross-validation measures used in this work to measure model accuracy. (3) KIC displayed poor performance for the present regression problems; theoretical considerations suggest that the difficulties are associated with wide variations in the sensitivity term of KIC, resulting from the models being nonlinear and the problems being ill-posed due to parameter correlations and insensitivity. The other criteria performed somewhat better, and similarly to each other. (4) Quantities with high leverage are more difficult to predict. The results are expected to be generally applicable to models of environmental systems.
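For least-squares regression, two of the criteria compared in the study have simple closed forms in terms of the residual sum of squares, the number of observations and the number of estimated parameters; KIC additionally requires the Fisher-information (sensitivity) term and is omitted from this sketch. The sample sizes and residuals below are invented.

```python
# Hedged sketch of model-selection criteria in their common
# least-squares forms (KIC needs the Fisher-information term and is
# omitted). n = observations, k = estimated parameters, rss = residual
# sum of squares. All numbers are invented for illustration.

import math

def aic(n, k, rss):
    return n * math.log(rss / n) + 2 * k

def aicc(n, k, rss):
    """AIC with the small-sample correction."""
    return aic(n, k, rss) + 2 * k * (k + 1) / (n - k - 1)

def bic(n, k, rss):
    return n * math.log(rss / n) + k * math.log(n)

n = 50
simple = aicc(n, 3, 12.0)    # 3-parameter model, slightly worse fit
complex_ = aicc(n, 8, 10.0)  # 8-parameter model, slightly better fit
print(simple < complex_)  # → True: extra parameters not worth the fit gain here
```

Lower values are preferred; the study's point (2) is that such rankings did not reliably track out-of-sample prediction accuracy for its 64 groundwater models.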

  15. Evaluation of modelling body burden of Cs-137

    International Nuclear Information System (INIS)

    Bergstroem, U.; Nordlinder, S.

    1996-05-01

    Within the IAEA/CEC VAMP-program, one working group studied the precision of dose assessment models when calculating the body burden of 137Cs as a result of exposure through multiple exposure pathways. One scenario used data from southern Finland regarding contamination of various media due to the fallout from the Chernobyl accident. In this study, a time-dependent multiple exposure pathway model was constructed based on compartment theory. Uncertainties in model responses due to uncertainties in input parameter values were studied. The initial predictions for body burden were good, within a factor of 2 of the observed, while the time dynamics of levels in milk and meat did not agree satisfactorily. Some results, nevertheless, showed good agreement with observations due to compensatory effects. After disclosure of additional observational data, the major reasons for mispredictions were identified as lack of consideration of the time dependence of fixation of 137Cs in soils, and the selection of parameter values. When this was corrected, close agreement between predictions and observations was obtained. This study shows that the dose contribution due to 137Cs in food products from the seminatural environment is important for long-term exposure to man. The evaluation provided a basis for improvements of crucial parts of the model. 14 refs, 18 figs, 8 tabs
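The compartment-theory backbone of such a model reduces, for a single body compartment, to constant intake with first-order elimination: dQ/dt = I − λQ. The sketch below uses this textbook form; the intake rate and biological half-life are illustrative stand-ins, not the VAMP scenario values.

```python
# Sketch of the single-compartment form underlying multi-pathway
# body-burden models: constant intake I (Bq/day) and first-order
# elimination with biological half-life T_b give dQ/dt = I - lam*Q,
# solved analytically below. Parameter values are illustrative only.

import math

def body_burden(t_days, intake, half_life_days, q0=0.0):
    """Body burden Q(t) for constant intake and first-order elimination."""
    lam = math.log(2) / half_life_days
    return intake / lam * (1 - math.exp(-lam * t_days)) + q0 * math.exp(-lam * t_days)

intake = 20.0       # hypothetical Bq/day of Cs-137 summed over pathways
half_life = 110.0   # illustrative adult biological half-life, days
equilibrium = intake * half_life / math.log(2)
ratio = body_burden(5 * half_life, intake, half_life) / equilibrium
print(round(ratio, 3))  # → 0.969, i.e. near equilibrium after ~5 half-lives
```

A full multi-pathway model chains such compartments (diet, soil fixation, milk, meat) with transfer coefficients, which is where the parameter uncertainties studied above enter.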

  16. Building long-term and high spatio-temporal resolution precipitation and air temperature reanalyses by mixing local observations and global atmospheric reanalyses: the ANATEM model

    Directory of Open Access Journals (Sweden)

    A. Kuentz

    2015-06-01

    The ANATEM model has also been evaluated at the regional scale against independent long-term time series and was able to capture regional low-frequency variability over more than a century (1883–2010).

  17. Chaos in long-term behavior of some Bianchi-type VIII models

    Energy Technology Data Exchange (ETDEWEB)

    Halpern, P

    1987-01-01

    The long-term behavior of Bianchi-type VIII models with three different types of stress-energy tensors are examined and compared. The vacuum model, a matter-filled model, and a model with an electromagnetic field are considered. In each case the existence of chaotic behavior and transitions to chaotic behavior are discussed.

  18. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with the predictions of such models must be established. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes.
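The multiplicative-chain result mentioned above rests on a standard property: a product of independent lognormal factors is itself lognormal, with the log-space means adding and the log-space variances adding, so the output uncertainty is available analytically. The factor parameters below are invented for illustration.

```python
# Sketch of the multiplicative-chain analysis described above: if a
# model output is a product of independent lognormal factors X_i, the
# output is lognormal with mu = sum(mu_i) and sigma^2 = sum(sigma_i^2),
# where (mu_i, sigma_i) parameterise ln(X_i). Numbers are invented.

import math

def product_lognormal_params(factors):
    """factors: list of (mu, sigma) of ln(X_i) for independent X_i."""
    mu = sum(m for m, s in factors)
    var = sum(s * s for m, s in factors)
    return mu, math.sqrt(var)

# e.g. a dose chain: release * dispersion factor * uptake factor
factors = [(0.0, 0.5), (-1.0, 0.8), (2.0, 0.3)]
mu, sigma = product_lognormal_params(factors)
print(mu, round(sigma, 4))       # log-space parameters of the product
print(round(math.exp(mu), 4))    # median of the product is exp(mu)
```

This is why lognormal parameter distributions are so convenient for chain-type assessment models: no sampling is needed to propagate the uncertainty, although Latin hypercube sampling (as recommended above) handles the general, non-multiplicative case.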

  19. A novel TRNSYS type for short-term borehole heat exchanger simulation: B2G model

    International Nuclear Information System (INIS)

    De Rosa, Mattia; Ruiz-Calvo, Félix; Corberán, José M.; Montagud, Carla; Tagliafico, Luca A.

    2015-01-01

    Highlights: • A novel dynamic borehole heat exchanger model is presented. • Theoretical approach for model parameters calculation is described. • The short-term model is validated against experimental data of a real GSHP. • Strong dynamic conditions due to the ON–OFF regulation are investigated. - Abstract: Models of ground source heat pump (GSHP) systems are used as an aid for the correct design and optimization of the system. For this purpose, it is necessary to develop models which correctly reproduce the dynamic thermal behavior of each component in a short-term basis. Since the borehole heat exchanger (BHE) is one of the main components, special attention should be paid to ensuring a good accuracy on the prediction of the short-term response of the boreholes. The BHE models found in literature which are suitable for short-term simulations usually present high computational costs. In this work, a novel TRNSYS type implementing a borehole-to-ground (B2G) model, developed for modeling the short-term dynamic performance of a BHE with low computational cost, is presented. The model has been validated against experimental data from a GSHP system located at Universitat Politècnica de València, Spain. Validation results show the ability of the model to reproduce the short-term behavior of the borehole, both for a step-test and under normal operating conditions

  20. A predictive framework for evaluating models of semantic organization in free recall

    Science.gov (United States)

    Morton, Neal W; Polyn, Sean M.

    2016-01-01

    Research in free recall has demonstrated that semantic associations reliably influence the organization of search through episodic memory. However, the specific structure of these associations and the mechanisms by which they influence memory search remain unclear. We introduce a likelihood-based model-comparison technique, which embeds a model of semantic structure within the context maintenance and retrieval (CMR) model of human memory search. Within this framework, model variants are evaluated in terms of their ability to predict the specific sequence in which items are recalled. We compare three models of semantic structure, latent semantic analysis (LSA), global vectors (GloVe), and word association spaces (WAS), and find that models using WAS have the greatest predictive power. Furthermore, we find evidence that semantic and temporal organization is driven by distinct item and context cues, rather than a single context cue. This finding provides an important constraint for theories of memory search. PMID:28331243
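The likelihood-based comparison described above can be sketched at its simplest: each model variant assigns a probability to every successive recall in a sequence, and variants are scored by the log-likelihood of the full sequence. The per-step probabilities below are invented stand-ins for CMR model output, not values from the paper.

```python
# Hedged sketch of likelihood-based model comparison for recall
# sequences: each variant assigns a probability to the item actually
# recalled at each output position, and the variant with the higher
# summed log-probability predicts the data better. The probabilities
# are invented stand-ins for CMR output.

import math

def sequence_log_likelihood(step_probabilities):
    """Log-likelihood of one recall sequence given per-step probabilities."""
    return sum(math.log(p) for p in step_probabilities)

# probabilities two hypothetical variants assigned to the observed recalls
model_was = [0.30, 0.25, 0.40, 0.20]  # e.g. WAS-based semantic structure
model_lsa = [0.20, 0.25, 0.30, 0.15]  # e.g. LSA-based semantic structure
ll_was = sequence_log_likelihood(model_was)
ll_lsa = sequence_log_likelihood(model_lsa)
print(ll_was > ll_lsa)  # → True: the WAS-like variant fits this sequence better
```

In practice the log-likelihoods are summed over all trials and subjects, and penalised comparison (e.g. for differing parameter counts) is applied before declaring one semantic representation superior.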

  1. Evaluation of selected near-term energy-conservation options for the Midwest

    Energy Technology Data Exchange (ETDEWEB)

    Evans, A.R.; Colsher, C.S.; Hamilton, R.W.; Buehring, W.A.

    1978-11-01

    This report evaluates the potential for implementation of near-term energy-conservation practices for the residential, commercial, agricultural, industrial, transportation, and utility sectors of the economy in twelve states: Illinois, Indiana, Iowa, Kansas, Michigan, Minnesota, Missouri, Nebraska, North Dakota, Ohio, South Dakota, and Wisconsin. The information used to evaluate the magnitude of achievable energy savings includes regional energy use, the regulatory/legislative climate relating to energy conservation, technical characteristics of the measures, and their feasibility of implementation. This work is intended to provide baseline information for an ongoing regional assessment of energy and environmental impacts in the Midwest. 80 references.

  2. Tritium: a model for low level long-term ionizing radiation exposure

    International Nuclear Information System (INIS)

    Carsten, A.L.

    1984-01-01

    The somatic, cytogenetic and genetic effects of single and chronic tritiated water (HTO) ingestion in mice were investigated. This study serves not only as an evaluation of tritium toxicity (TRITOX) but, owing to its design involving long-term low-concentration ingestion of HTO, may serve as a model for low-level long-term ionizing radiation exposure in general. Long-term studies involved animals maintained on HTO at concentrations of 0.3 μCi/ml, 1.0 μCi/ml, 3.0 μCi/ml or depth-dose-equivalent chronic external exposures to 137Cs gamma rays. Maintenance on 3.0 μCi/ml resulted in no effect on growth, life-time shortening or bone marrow cellularity, but did result in a reduction of bone marrow stem cells, an increase in DLMs in second-generation animals maintained on this regimen, and cytogenetic effects as indicated by increased sister chromatid exchanges (SCEs) in bone marrow cells, increased chromosome aberrations in the regenerating liver and an increase in micronuclei in red blood cells. Biochemical and microdosimetry studies showed that animals placed on the HTO regimen reached tritium equilibrium in the body water in approximately 17 to 21 days, with a more gradual increase in bound tritium. When animals maintained for 180 days on 3.0 μCi/ml HTO were placed on a tap water regimen, the tritium level in tissue dropped from the equilibrium value of 2.02 μCi/ml before withdrawal to 0.001 μCi/ml at 28 days. 18 references

  3. GATEWAY Demonstrations: Long-Term Evaluation of SSL Field Performance in Select Interior Projects

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Tess E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Davis, Robert G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wilkerson, Andrea M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-02-28

    The GATEWAY program evaluated the long-term performance characteristics (chromaticity change, maintained illuminance, and operations and maintenance) of LED lighting systems in four field installations previously documented in separate DOE GATEWAY reports.

  4. QUALITY OF AN ACADEMIC STUDY PROGRAMME - EVALUATION MODEL

    Directory of Open Access Journals (Sweden)

    Mirna Macur

    2016-01-01

    Quality of an academic study programme is evaluated by many: by employees (internal evaluation) and by external evaluators: experts, agencies and organisations. Internal and external evaluations of an academic programme follow a written structure that resembles one of the quality models. We believe the quality models (mostly derived from the EFQM excellence model) do not fit non-profit activities, policies and programmes very well, because these are much more complex than the environments from which the quality models derive (for example, an assembly line). Quality of an academic study programme is very complex and is understood differently by various stakeholders, so we present dimensional evaluation in this article. Dimensional evaluation, as opposed to component and holistic evaluation, is a form of analytical evaluation in which the quality or value of the evaluand is determined by looking at its performance on multiple dimensions of merit or evaluation criteria. First, the stakeholders of a study programme and their views, expectations and interests are presented, followed by the evaluation criteria. Both are then joined into an evaluation model revealing which evaluation criteria can and should be evaluated by which stakeholder. The main research questions are posed and a research method for each dimension is listed.

  5. Periglacial processes incorporated into a long-term landscape evolution model

    DEFF Research Database (Denmark)

    Andersen, Jane Lund; Egholm, D.L.; Knudsen, Mads Faurschou

    Little is known about the long-term influence of periglacial processes on landscape evolution in cold areas, even though the efficiency of frost cracking in the breakdown of rocks has been documented by observations and experiments. Cold-room laboratory experiments show that a continuous water supply and sustained sub-zero temperatures are essential to develop fractures in porous rocks (e.g. Murton, 2006), but the cracking efficiency for harder rock types under natural conditions is less clear. However, based on experimental results for porous rocks, Hales and Roering (2007) proposed a model ... by their model and the elevation of scree deposits in the Southern Alps, New Zealand. This result suggests a link between frost-cracking efficiency and long-term landscape evolution and thus merits further investigation. Anderson et al. (2012) expanded this early model by including the effects of latent heat...

  6. Quantification of source-term profiles from near-field geochemical models

    International Nuclear Information System (INIS)

    McKinley, I.G.

    1985-01-01

    A geochemical model of the near-field is described which quantitatively treats the processes of engineered barrier degradation, buffering of aqueous chemistry by solid phases, nuclide solubilization and transport through the near-field and release to the far-field. The radionuclide source-terms derived from this model are compared with those from a simpler model used for repository safety analysis. 10 refs., 2 figs., 2 tabs

  7. Using satellite imagery for qualitative evaluation of plume transport in modeling the effects of the Kuwait oil fire smoke plumes

    International Nuclear Information System (INIS)

    Bass, A.; Janota, P.

    1992-01-01

    To forecast the behavior of the Kuwait oil fire smoke plumes and their possible acute or chronic health effects over the Arabian Gulf region, TASC created a comprehensive health and environmental impacts modeling system. A specially-adapted Lagrangian puff transport model was used to create (a) short-term (multiday) forecasts of plume transport and ground-level concentrations of soot and SO₂; and (b) long-term (seasonal and longer) estimates of average surface concentrations and depositions. EPA-approved algorithms were used to transform exposures to SO₂ and soot (as PAH/BaP) into morbidity, mortality and crop damage risks. Absent any ground truth, satellite imagery from the NOAA Polar Orbiter and the ESA Geostationary Meteosat offered the only opportunity for timely qualitative evaluation of the long-range plume transport and diffusion predictions. This paper shows the use of actual satellite images (including animated loops of hourly Meteosat images) to evaluate plume forecasts in near-real-time, and to sanity-check the meso- and long-range plume transport projections for the long-term estimates. Example modeled concentrations, depositions and health effects are shown.

  8. Investigation of transformational and transactional leadership styles of school principals, and evaluation of them in terms of educational administration

    OpenAIRE

    Avcı, Ahmet

    2015-01-01

    The aim of this study is to investigate the transformational and transactional leadership styles of school principals, and to evaluate them in terms of educational administration. A descriptive survey model was used in the research. The data were obtained from a total of 1,117 teachers working in public and private schools affiliated with the Ministry of National Education in the Avcılar district of Istanbul province in 2014. In this study, data were obtained from the "personal informat...

  9. Developing a Teacher Evaluation Model: The Impact of Teachers’ Attitude toward the Performance Evaluation System (PES) on Job Satisfaction and Organizational Commitment with the Mediating Role of Teachers’ Sense of Efficacy

    Directory of Open Access Journals (Sweden)

    Behrooz Saljooghi

    2016-05-01

    The objective of this paper was to design, develop and evaluate a causal model of teachers’ attitude toward the performance evaluation system (PES), with the mediating role of teachers’ sense of efficacy, on job satisfaction and organizational commitment. The study population included all teachers of male-only high schools in Tehran; 117 teachers were selected as the sample using availability sampling. The present study is applied research in terms of its objective and descriptive research in terms of its data collection method. Furthermore, the study uses a correlational research design through structural equation modeling. In order to measure the study variables, the following questionnaires were used: Teachers’ Attitude toward Performance Evaluation, Teachers’ Sense of Efficacy, Job Satisfaction and Organizational Commitment. The results showed that teachers’ attitude toward the performance evaluation system had a significant positive effect on job satisfaction, organizational commitment and self-efficacy. Also, teachers’ sense of efficacy had a significant positive effect on job satisfaction. Moreover, the results showed that teachers’ attitude toward the performance evaluation system had a positive and significant effect on organizational commitment with the mediating role of self-efficacy. Thus, the present study verified the causal model of teachers’ attitude toward the performance evaluation system with the mediating role of teachers’ sense of efficacy. Finally, the structural equation modeling reflects the positive impact of teachers’ attitude toward Iran’s Ministry of Education’s employee performance evaluation system on job satisfaction, sense of efficacy and organizational commitment.

  10. Site descriptive modelling - strategy for integrated evaluation

    International Nuclear Information System (INIS)

    Andersson, Johan

    2003-02-01

    The current document establishes the strategy to be used for achieving sufficient integration between disciplines in producing Site Descriptive Models during the Site Investigation stage. The Site Descriptive Model should be a multidisciplinary interpretation of geology, rock mechanics, thermal properties, hydrogeology, hydrogeochemistry, transport properties and ecosystems, using site investigation data from deep boreholes and from the surface as input. The modelling comprises the following iterative steps: evaluation of primary data, descriptive and quantitative modelling (in 3D), and overall confidence evaluation. Data are first evaluated within each discipline and then the evaluations are checked between the disciplines. Three-dimensional modelling (i.e. estimating the distribution of parameter values in space and its uncertainty) is made in a sequence, where the geometrical framework is taken from the geological model and in turn used by the rock mechanics, thermal and hydrogeological modelling, etc. The three-dimensional description should present the parameters with their spatial variability over a relevant and specified scale, with the uncertainty included in this description. Different alternative descriptions may be required. After the individual discipline modelling and uncertainty assessment, a phase of overall confidence evaluation follows. Members of the different modelling teams assess the suggested uncertainties and evaluate the feedback. These discussions should assess overall confidence by checking that all relevant data are used, checking that information in past model versions is considered, checking that the different kinds of uncertainty are addressed, checking whether suggested alternatives make sense and whether there is potential for additional alternatives, and by discussing, if appropriate, how additional measurements (i.e. more data) would affect confidence. The findings as well as the modelling results are to be documented in a Site Description.

  11. Pragmatic geometric model evaluation

    Science.gov (United States)

    Pamer, Robert

    2015-04-01

    Quantification of subsurface model reliability is mathematically and technically demanding, as there are many different sources of uncertainty and some of the factors can be assessed only subjectively. For many practical applications in industry or risk assessment (e.g. geothermal drilling), a quantitative estimation of possible geometric variations in depth units is preferred over relative numbers because of cost calculations for different scenarios. The talk gives an overview of several factors that affect the geometry of structural subsurface models that are based upon typical geological survey organization (GSO) data such as geological maps, borehole data and conceptually driven construction of subsurface elements (e.g. fault networks). Within the context of the trans-European project "GeoMol", uncertainty analysis has to be very pragmatic, partly because of differences in data rights, data policies and modelling software between the project partners. In a case study, a two-step evaluation methodology for geometric subsurface model uncertainty is being developed. In a first step, several models of the same volume of interest have been calculated by omitting successively more input data types (seismic constraints, fault network, outcrop data). The positions of the various horizon surfaces are then compared. The procedure is equivalent to comparing data of various levels of detail and therefore structural complexity. This gives a measure of the structural significance of each data set in space, and as a consequence areas of geometric complexity are identified. These areas are usually very data-sensitive, hence geometric variability between individual data points in these areas is higher than in areas of low structural complexity. 
Instead of calculating a multitude of different models by varying some input data or parameters, as is done in Monte Carlo simulations, the aim of the second step of the evaluation procedure (which is part of the ongoing work) is to

  12. Analytical Solution for the Anisotropic Rabi Model: Effects of Counter-Rotating Terms

    Science.gov (United States)

    Zhang, Guofeng; Zhu, Hanjie

    2015-03-01

    The anisotropic Rabi model, which was proposed recently, differs from the original Rabi model in that the rotating and counter-rotating terms are governed by two different coupling constants. This feature allows us to vary the counter-rotating interaction independently and explore its effects on some quantum properties. In this paper, we eliminate the counter-rotating terms approximately and obtain analytical energy spectra and wavefunctions. These analytical results agree well with numerical calculations over a wide range of the parameters, including the ultrastrong-coupling regime. In the weak counter-rotating-coupling limit we find that the counter-rotating terms can be considered as shifts to the parameters of the Jaynes-Cummings model. This modification shows the validity of the rotating-wave approximation under the assumption of near resonance and relatively weak coupling. Moreover, the analytical expressions of several physical quantities are also derived, and the results show the breakdown of the U(1) symmetry and the deviation from the Jaynes-Cummings model.
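For reference, the Hamiltonian usually meant by "anisotropic Rabi model" can be written as follows (with ħ = 1; the notation is an assumption on my part, since the abstract does not fix symbols):

```latex
H \;=\; \omega\, a^{\dagger}a \;+\; \frac{\Delta}{2}\,\sigma_z
\;+\; g_1\!\left(a^{\dagger}\sigma_- + a\,\sigma_+\right)
\;+\; g_2\!\left(a^{\dagger}\sigma_+ + a\,\sigma_-\right)
```

Here g₁ multiplies the rotating (Jaynes-Cummings) terms and g₂ the counter-rotating ones: g₂ = 0 recovers the Jaynes-Cummings model, while g₂ = g₁ recovers the original Rabi model.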

  13. Risk management under a two-factor model of the term structure of interest rates

    OpenAIRE

    Manuel Moreno

    1997-01-01

    This paper presents several applications to interest-rate risk management based on a two-factor continuous-time model of the term structure of interest rates previously presented in Moreno (1996). This model assumes that default-free discount bond prices are determined by the time to maturity and two factors, the long-term interest rate and the spread (the difference between the long-term rate and the short-term (instantaneous) riskless rate). Several new measures of "generalized duration" are p...

  14. A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands

    Science.gov (United States)

    Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.

    2009-12-01

    Ecosystem processes are influenced by climatic trends at multiple temporal scales, including diel patterns and other mid-term climatic modes such as interannual and seasonal variability. Because interactions between the biophysical components of ecosystem processes are complex, it is important to test how models perform in the frequency domain (e.g. hours, days, weeks, months, years) and the time domain (i.e. day of the year) in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation, using wavelet time-series analysis, of model performance in seven Mediterranean oak woodlands encompassing three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) against measured gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g. weeks to months), likely because these models do not represent the water-pulse dynamics that influence GPP and ET in these Mediterranean systems. To improve the performance of a model it is critical to first identify where and when it fails. Only by identifying where a model fails can we improve its performance, use it as a prognostic tool, and generate further hypotheses that can be tested by new experiments and measurements.
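As a much-simplified stand-in for the wavelet analysis, one can compare residual spectral energy across frequency bands with an FFT; in this invented example the model error concentrates at intermediate periods, mirroring the finding above:

```python
import numpy as np

def band_error(obs, sim, dt=1.0,
               bands=((0.0, 1/30), (1/30, 1/7), (1/7, np.inf))):
    """Residual spectral energy per frequency band: a crude FFT stand-in for
    the study's wavelet decomposition. Default bands split periods into
    >30 days, 7-30 days, and <7 days for daily data (dt in days)."""
    resid = np.asarray(sim, float) - np.asarray(obs, float)
    freqs = np.fft.rfftfreq(resid.size, d=dt)
    power = np.abs(np.fft.rfft(resid)) ** 2
    return [float(power[(freqs >= lo) & (freqs < hi)].sum()) for lo, hi in bands]

# Toy daily series: the "model" captures the annual cycle but misses a ~21-day
# pulse present in the "observations" (all numbers invented for illustration).
t = np.arange(365)
obs = 5 + np.sin(2 * np.pi * t / 365) + 0.8 * np.sin(2 * np.pi * t / 21)
sim = 5 + np.sin(2 * np.pi * t / 365)
low, mid, high = band_error(obs, sim)
print(mid > low and mid > high)  # error concentrates at intermediate periods
```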

  15. A wind turbine evaluation model under a multi-criteria decision making environment

    International Nuclear Information System (INIS)

    Lee, Amy H.I.; Hung, Meng-Chan; Kang, He-Yau; Pearn, W.L.

    2012-01-01

    Highlights: ► This paper proposes an evaluation model to select suitable turbines in a wind farm. ► Interpretive structural modeling is used to identify the relationships among factors. ► Fuzzy analytic network process is used to calculate the priorities of turbines. ► The results can serve as references for selecting the most appropriate wind turbines. - Abstract: Due to the impacts of fossil and nuclear energy on security, economics, and the environment, the demand for alternative energy resources has been expanding rapidly in recent years. Wind energy production, with its safety and environmental advantages, has become the fastest-growing renewable energy source in the world. The construction of new wind farms and the installation of new wind turbines are important processes for providing long-term energy production. In this research, a comprehensive evaluation model incorporating interpretive structural modeling (ISM) and the fuzzy analytic network process (FANP) is constructed to select suitable turbines when developing a wind farm. A case study is carried out in Taiwan to evaluate the expected performance of several potential types of wind turbines; experts at a wind farm are invited to contribute their expertise in determining the importance of the factors in the wind turbine evaluation and in rating the performance of the turbines with respect to each factor. The most suitable turbines for installation can finally be identified after the calculations. The results can serve as references for decision makers in selecting the most appropriate wind turbines.
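The full ISM/FANP procedure is beyond a short example, but the flavor of fuzzy multi-criteria aggregation can be sketched with triangular fuzzy numbers (criteria, weights and ratings below are invented, not the paper's):

```python
def fuzzy_score(weights, ratings):
    """Aggregate triangular-fuzzy weights and ratings into a crisp score.

    A minimal fuzzy weighted sum, not the paper's full ISM+FANP procedure:
    each (l, m, u) triple is a triangular fuzzy number; products and sums use
    component-wise TFN arithmetic and the centroid (l+m+u)/3 defuzzifies.
    """
    l = sum(wl * rl for (wl, _, _), (rl, _, _) in zip(weights, ratings))
    m = sum(wm * rm for (_, wm, _), (_, rm, _) in zip(weights, ratings))
    u = sum(wu * ru for (_, _, wu), (_, _, ru) in zip(weights, ratings))
    return (l + m + u) / 3.0

# Two hypothetical turbines rated on three criteria (cost, reliability, output)
weights = [(0.2, 0.3, 0.4), (0.3, 0.4, 0.5), (0.2, 0.3, 0.4)]
turbine_a = [(5, 7, 9), (7, 9, 10), (5, 7, 9)]
turbine_b = [(3, 5, 7), (5, 7, 9), (7, 9, 10)]
scores = {"A": fuzzy_score(weights, turbine_a), "B": fuzzy_score(weights, turbine_b)}
print(max(scores, key=scores.get))
```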

  16. Evaluating the quality of scenarios of short-term wind power generation

    DEFF Research Database (Denmark)

    Pinson, Pierre; Girard, R.

    2012-01-01

    Scenarios of short-term wind power generation are becoming increasingly popular as input to multi-stage decision-making problems, e.g. multivariate stochastic optimization and stochastic programming. The quality of these scenarios is intuitively expected to substantially impact the benefits from their use in decision-making. So far, however, their verification is almost always focused on their marginal distributions for each individual lead time only, thus overlooking their temporal interdependence structure. The shortcomings of such an approach are discussed. Multivariate verification tools, as well as diagnostic approaches based on event-based verification, are then presented. Their application to the evaluation of various sets of scenarios of short-term wind power generation demonstrates them as valuable discrimination tools.
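One multivariate verification tool of the kind discussed here is the energy score, which evaluates a set of scenario trajectories against the realized trajectory jointly across lead times (the metric choice and toy data are my own illustration, not necessarily the paper's):

```python
import numpy as np

def energy_score(scenarios, observed):
    """Energy score of m scenario trajectories (m x T) vs the realized one (T,).

    ES = mean_i ||x_i - y||  -  0.5 * mean_{i,j} ||x_i - x_j||  (lower is
    better); unlike per-lead-time scores it reacts to the joint, multivariate
    structure of the scenarios rather than the marginals alone.
    """
    X = np.asarray(scenarios, float)
    y = np.asarray(observed, float)
    term1 = np.linalg.norm(X - y, axis=1).mean()
    diffs = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    return term1 - 0.5 * diffs.mean()

rng = np.random.default_rng(0)
T, m = 24, 50
y = np.sin(np.linspace(0, np.pi, T))          # realized wind power (toy units)
good = y + 0.05 * rng.standard_normal((m, T)) # sharp scenario set
bad = y + 0.5 * rng.standard_normal((m, T))   # diffuse scenario set
print(energy_score(good, y) < energy_score(bad, y))
```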

  17. Training effectiveness evaluation model

    International Nuclear Information System (INIS)

    Penrose, J.B.

    1993-01-01

    NAESCO's Training Effectiveness Evaluation Model (TEEM) integrates existing evaluation procedures with new procedures designed to measure training impact on organizational productivity. TEEM seeks to enhance organizational productivity through proactive training focused on operational results; these results can be identified and measured by establishing and tracking performance indicators. Relating training to organizational productivity is not easy. TEEM is a team process; it offers strategies to assess organizational costs and benefits of training more effectively. TEEM is one organization's attempt to refine, manage and extend its training evaluation program.

  18. Credit Risk Evaluation Using a C-Variable Least Squares Support Vector Classification Model

    Science.gov (United States)

    Yu, Lean; Wang, Shouyang; Lai, K. K.

    Credit risk evaluation is one of the most important issues in financial risk management. In this paper, a C-variable least squares support vector classification (C-VLSSVC) model is proposed for credit risk analysis. The main idea of this model is based on the prior knowledge that different classes may have different importance for modeling and that more weight should be given to the classes with more importance. The C-VLSSVC model can be constructed by a simple modification of the regularization parameter in LSSVC, whereby larger weights are given to the least-squares classification errors of important classes than to those of unimportant classes, while keeping the regularized terms in their original form. For illustration purposes, a real-world credit dataset is used to test the effectiveness of the C-VLSSVC model.
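A sketch of the idea under stated assumptions (linear kernel, synthetic data, and my own choice of class weights; the paper's exact formulation may differ):

```python
import numpy as np

def fit_wlssvc(X, y, C=1.0, class_weight=None):
    """Least-squares SVM classifier with class-dependent regularization.

    Solves the usual LSSVC KKT linear system, except that the effective
    regularization for sample i is C * w(class_i): errors on heavily weighted
    classes are penalized more, which is the spirit of the C-variable idea.
    """
    n = len(y)
    w = np.array([(class_weight or {}).get(c, 1.0) for c in y])
    K = X @ X.T                                   # linear kernel matrix
    Omega = (y[:, None] * y[None, :]) * K
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.diag(1.0 / (C * w))    # per-sample regularization
    sol = np.linalg.solve(A, np.concatenate(([0.0], np.ones(n))))
    b, alpha = sol[0], sol[1:]
    return lambda Xq: np.sign(Xq @ X.T @ (alpha * y) + b)

# Toy "credit" data: +1 = good loans, -1 = bad loans; bad loans weighted more
# heavily so misclassifying them costs more (all data synthetic).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(1, 1, (40, 2)), rng.normal(-1, 1, (40, 2))])
y = np.array([1.0] * 40 + [-1.0] * 40)
clf = fit_wlssvc(X, y, C=1.0, class_weight={-1.0: 5.0, 1.0: 1.0})
print((clf(X) == y).mean())  # training accuracy on the toy data
```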

  19. Modeling of correlated data with informative cluster sizes: An evaluation of joint modeling and within-cluster resampling approaches.

    Science.gov (United States)

    Zhang, Bo; Liu, Wei; Zhang, Zhiwei; Qu, Yanping; Chen, Zhen; Albert, Paul S

    2017-08-01

    Joint modeling and within-cluster resampling are two approaches used for analyzing correlated data with informative cluster sizes. Motivated by a developmental toxicity study, we examined the performance and validity of these two approaches in testing covariate effects in generalized linear mixed-effects models. We show that the joint modeling approach is robust to the misspecification of cluster size models in terms of Type I and Type II errors when the corresponding covariates are not included in the random effects structure; otherwise, statistical tests may be affected. We also evaluate the performance of the within-cluster resampling procedure and thoroughly investigate its validity in modeling correlated data with informative cluster sizes. We show that within-cluster resampling is a valid alternative to joint modeling for cluster-specific covariates, but it is invalid for time-dependent covariates. The two methods are applied to a developmental toxicity study that investigated the effect of exposure to diethylene glycol dimethyl ether.
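The within-cluster resampling idea can be sketched as follows (synthetic data; one observation is drawn per cluster so that informative cluster sizes do not bias the estimate):

```python
import numpy as np

def wcr_mean(clusters, n_resamples=2000, seed=0):
    """Within-cluster resampling (WCR) estimate of a marginal mean.

    Each resample draws one observation per cluster, yielding an independent
    dataset whose sample mean weights every cluster equally; averaging over
    resamples gives an estimate unaffected by informative cluster size.
    """
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(n_resamples):
        draw = [c[rng.integers(len(c))] for c in clusters]
        estimates.append(np.mean(draw))
    return float(np.mean(estimates))

# Toy data with informative cluster sizes: the within-cluster mean equals the
# cluster size, so larger clusters have larger outcomes (values are invented).
rng = np.random.default_rng(42)
clusters = [rng.normal(loc=size, scale=0.1, size=size)
            for size in rng.integers(1, 11, size=200)]
naive = float(np.mean(np.concatenate(clusters)))  # over-weights big clusters (~7.0)
wcr = wcr_mean(clusters)                          # targets the cluster-level mean (~5.5)
print(naive > wcr)
```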

  20. Evaluation of Current Computer Models Applied in the DOE Complex for SAR Analysis of Radiological Dispersion & Consequences

    Energy Technology Data Exchange (ETDEWEB)

    O'Kula, K. R. [Savannah River Site (SRS), Aiken, SC (United States); East, J. M. [Savannah River Site (SRS), Aiken, SC (United States); Weber, A. H. [Savannah River Site (SRS), Aiken, SC (United States); Savino, A. V. [Savannah River Site (SRS), Aiken, SC (United States); Mazzola, C. A. [Savannah River Site (SRS), Aiken, SC (United States)

    2003-01-01

    The evaluation of atmospheric dispersion/radiological dose analysis codes included fifteen models identified in authorization-basis safety analyses at DOE facilities, or from regulatory and research agencies where past or current work warranted inclusion of a computer model. All computer codes examined were reviewed using general and specific evaluation criteria developed by the Working Group. The criteria were based on DOE Orders and other regulatory standards and guidance for performing bounding and conservative dose calculations. Three categories of criteria were included: (1) Software Quality/User Interface; (2) Technical Model Adequacy; and (3) Application/Source Term Environment. A consensus-based, limited quantitative ranking process was used to establish an order of model preference, both as an overall conclusion and under specific conditions.

  1. Enhanced stability of car-following model upon incorporation of short-term driving memory

    Science.gov (United States)

    Liu, Da-Wei; Shi, Zhong-Ke; Ai, Wen-Huan

    2017-06-01

    Based on the full velocity difference model, a new car-following model is developed in this paper to investigate the effect of short-term driving memory on traffic flow. Short-term driving memory is introduced as an influence factor in the driver's anticipation behavior. The stability condition of the newly developed model is derived, and the modified Korteweg-de Vries (mKdV) equation is constructed to describe traffic behavior near the critical point. The evolution of a small perturbation is first investigated numerically. The results show that the new car-following model improves on previous ones in that it enhances traffic stability. Starting and braking processes of vehicles at a signalized intersection are also investigated. The numerical simulations illustrate that the new model successfully describes the driver's anticipation behavior, and that the efficiency and safety of vehicles passing through the signalized intersection are improved by considering short-term driving memory.
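A minimal numerical sketch of such a model, assuming a tanh optimal-velocity function and a simple memory-blended headway (the specific memory formulation and parameter values are illustrative, not the paper's calibrated model):

```python
import numpy as np

def simulate_platoon(n_cars=10, t_end=120.0, dt=0.1, kappa=0.85, lam=0.4, p=0.3):
    """Full velocity difference (FVD) car-following with a memory term.

    dv_i/dt = kappa*[V(h_eff) - v_i] + lam*(v_{i-1} - v_i), where the
    effective headway h_eff = (1-p)*h(t) + p*h(t - 1s) blends the current
    headway with a remembered one (p and the 1 s lag are my own choices).
    """
    def V(h):                                    # tanh optimal-velocity function
        return np.tanh(h - 2.0) + np.tanh(2.0)

    tau_steps = int(round(1.0 / dt))             # 1 s memory lag in steps
    x = np.arange(n_cars)[::-1] * 3.0            # car 0 leads; 3.0 spacing
    v = np.full(n_cars, V(3.0))                  # start at equilibrium speed
    h_hist = [x[:-1] - x[1:]] * (tau_steps + 1)  # headway memory buffer
    for k in range(int(round(t_end / dt))):
        h_now = x[:-1] - x[1:]
        h_eff = (1 - p) * h_now + p * h_hist.pop(0)
        h_hist.append(h_now)
        dv = kappa * (V(h_eff) - v[1:]) + lam * (v[:-1] - v[1:])
        # leader brakes for 1 s (steps 50-59), then recovers (steps 60-69)
        lead_acc = -1.0 if 50 <= k < 60 else (1.0 if 60 <= k < 70 else 0.0)
        v[0] += lead_acc * dt
        v[1:] += dv * dt
        x += v * dt
    return x, v

x, v = simulate_platoon()
print(np.all(x[:-1] > x[1:]))  # ordering preserved: no collisions in the platoon
```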

  2. The evaluation of the National Long Term Care Demonstration. 1. An overview of the channeling demonstration and its evaluation.

    Science.gov (United States)

    Carcagno, G J; Kemper, P

    1988-04-01

    The channeling demonstration sought to substitute community care for nursing home care to reduce long-term care costs and improve the quality of life of elderly clients and the family members and friends who care for them. Two interventions were tested, each in five sites; both had comprehensive case management at their core. One model added a small amount of additional funding for direct community services to fill the gaps in the existing system; the other substantially expanded coverage of community services regardless of categorical eligibility under existing programs. The demonstration was evaluated using a randomized experimental design to test the effects of channeling on use of community care, nursing homes, hospitals, and informal caregiving, and on measures of the quality of life of clients and their informal caregivers. Data were obtained from interviews with clients and informal caregivers; service use and cost records came from Medicare, Medicaid, channeling, and providers; and death records for an 18-month follow-up period were examined.

  3. Evaluating psychodiagnostic decisions.

    Science.gov (United States)

    Witteman, Cilia L M; Harries, Clare; Bekker, Hilary L; Van Aarle, Edward J M

    2007-02-01

    Several frameworks can be used to evaluate decision making. These may relate to different aspects of the decision-making process, or concern the decision outcome. Evaluations of psychodiagnostic decisions have shown diagnosticians to be poor decision makers. In this essay we argue that this finding results from the evaluation of only one part of the diagnostic process. We put forward that evaluations are typically carried out by comparing clinicians' behaviour to one of several normative models, for example hypothetico-deductive reasoning. These models make strong assumptions about human reasoning capabilities, which make it almost impossible for people to measure up to them. The subsequent two parts of the psychodiagnostic process (causal explanation and treatment decisions), are typically not included in these evaluation studies. Treatment decisions are evaluated in effectiveness studies; that is, they are evaluated in terms of their outcomes, not in terms of the diagnosticians' decision processes. Psychodiagnosticians' causal explanation has hardly ever been the subject of evaluation. We argue that in order to achieve clinical excellence, this part of the psychodiagnostic process should also be well understood. In this essay we first describe evaluation of psychodiagnostic decision making. We then propose a framework to describe causal explanation, that is, a situation assessment in terms of a causal schema or a story or script. We identify and discuss the tools available for evaluating this part of the psychodiagnostic process.

  4. Evaluation of local stress and local hydrogen concentration at grain boundary using three-dimensional polycrystalline model

    International Nuclear Information System (INIS)

    Ebihara, Ken-ichi; Itakura, Mitsuhiro; Yamaguchi, Masatake; Kaburaki, Hideo; Suzudo, Tomoaki

    2010-01-01

    The decohesion model, in which hydrogen segregating at grain boundaries reduces cohesive energy, has been considered to explain hydrogen embrittlement. Although there is some experimental and theoretical support for this model, the overall process is still unclear. In order to understand hydrogen embrittlement in terms of the decohesion model, therefore, it is necessary to evaluate stress and hydrogen concentration at grain boundaries under experimental conditions and to verify the grain-boundary decohesion process. With this in mind, we evaluated the stress and the hydrogen concentration at grain boundaries in a three-dimensional polycrystalline model generated by random Voronoi tessellation. Crystallographic anisotropy was assigned to each grain. As boundary conditions for the calculations, data extracted from results calculated for a notched round-bar specimen model, under the tensile-test conditions in which fracture of the steel specimen is observed, were applied to the polycrystalline model. As a result, it was found that the evaluated stress does not reach the fracture stress estimated by first-principles calculations for the evaluated hydrogen concentration. It is therefore considered that the initiation of grain-boundary fracture requires factors other than the stress concentration due to crystallographic anisotropy. (author)

  5. Evaluation of weather-based rice yield models in India

    Science.gov (United States)

    Sudharsan, D.; Adinarayana, J.; Reddy, D. Raji; Sreenivas, G.; Ninomiya, S.; Hirafuji, M.; Kiura, T.; Tanaka, K.; Desai, U. B.; Merchant, S. N.

    2013-01-01

    The objective of this study was to compare two different rice simulation models, one standalone (Decision Support System for Agrotechnology Transfer [DSSAT]) and one web based (SImulation Model for RIce-Weather relations [SIMRIW]), using agrometeorological data and agronomic parameters for estimation of rice crop production in the southern semi-arid tropics of India. Studies were carried out on the BPT5204 rice variety to evaluate the two crop simulation models. Long-term experiments were conducted on a research farm of Acharya N G Ranga Agricultural University (ANGRAU), Hyderabad, India. Initially, results were obtained using 4 years (1994-1997) of data, with weather parameters from a local weather station, to evaluate DSSAT simulated results against observed values. Linear regression models used for the purpose showed a close relationship between DSSAT and observed yield. Subsequently, yield comparisons were also carried out between SIMRIW and DSSAT and validated with actual observed values. Since the correlation coefficients of the SIMRIW simulations were within acceptable limits, further rice experiments in the monsoon (Kharif) and post-monsoon (Rabi) agricultural seasons (2009, 2010 and 2011) were carried out with a location-specific distributed sensor network system. These proximal systems help to simulate dry weight, leaf area index and potential yield with the Java-based SIMRIW on a daily/weekly/monthly/seasonal basis. These dynamic parameters are useful to the farming community for decision making in a ubiquitous manner. However, SIMRIW requires fine-tuning for better results and decision making.
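A minimal sketch of the simulated-vs-observed comparison used in such evaluations (correlation, RMSE and bias of simulated against observed yields; the numbers are invented, not the study's data):

```python
import numpy as np

def evaluate_sim(observed, simulated):
    """Agreement statistics of simulated vs observed yields."""
    obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
    r = np.corrcoef(obs, sim)[0, 1]            # linear correlation
    rmse = np.sqrt(np.mean((sim - obs) ** 2))  # root-mean-square error
    bias = np.mean(sim - obs)                  # mean over/under-prediction
    return r, rmse, bias

# Hypothetical yields (t/ha) for four seasons, purely illustrative
obs = [5.2, 4.8, 6.1, 5.5]
sim = [5.0, 5.1, 5.9, 5.6]
r, rmse, bias = evaluate_sim(obs, sim)
print(f"r={r:.2f}, RMSE={rmse:.2f} t/ha, bias={bias:+.2f} t/ha")
```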

  6. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    Science.gov (United States)

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree, and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Since the estimated probability of some intermediate events may have large uncertainty, this uncertainty can be characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in casualty risk mitigation.
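The frequency-times-consequence structure can be sketched with a tiny event-tree aggregation (all probabilities and casualty counts below are invented, not the Michigan data; the paper's model additionally treats branch probabilities as random variables):

```python
def casualty_risks(crash_freq, scenarios):
    """Individual risk and an F-N style societal-risk curve from event-tree leaves.

    `scenarios` is a list of (path_probability, casualties) pairs, where each
    path probability is the product of branch probabilities of the intermediate
    events (age, crash unit, vehicle type, ...).
    """
    individual = crash_freq * sum(p * n for p, n in scenarios)
    levels = sorted({n for _, n in scenarios})
    # F-N curve: frequency of crashes causing N or more casualties
    fn_curve = [(n, crash_freq * sum(p for p, m in scenarios if m >= n))
                for n in levels]
    return individual, fn_curve

scenarios = [(0.70, 0), (0.25, 1), (0.04, 2), (0.01, 5)]  # hypothetical leaves
ind, fn = casualty_risks(crash_freq=12.0, scenarios=scenarios)  # 12 crashes/yr
print(round(ind, 2))  # expected casualties per year
```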

  7. Generalized economic model for evaluating disposal costs at a low-level waste disposal facility

    International Nuclear Information System (INIS)

    Baird, R.D.; Rogers, V.C.

    1985-01-01

    An economic model is developed that can be used to evaluate cash flows associated with the development, operations, closure, and long-term maintenance of a proposed Low-Level Radioactive Waste disposal facility, and to determine the unit disposal charges and unit surcharges that might result. The model includes the effects of nominal interest rate (rate of return on investment, or cost of capital), inflation rate, waste volume growth rate, site capacity, duration of the various phases of the facility history, and the cash flows associated with each phase. The model uses standard discounted cash flow techniques on an after-tax basis to determine the unit disposal charge necessary to cover all costs and expenses and to generate an adequate rate of return on investment. It separately considers cash flows associated with post-operational activities to determine the required unit surcharge. The model is applied to three reference facilities to determine their respective unit disposal charges and unit surcharges under various parameter values. The sensitivity of the model results for the unit disposal charge is evaluated.
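
In its simplest form, the discounted cash flow logic above reduces to finding the charge whose discounted revenues equal discounted costs. A minimal sketch with invented figures (the paper's model also handles taxes, inflation, and post-operational surcharges, omitted here):

```python
# Discounted-cash-flow sketch: find the disposal charge per unit volume such
# that the present value of revenues covers the present value of costs.
# All figures are illustrative assumptions.
rate = 0.10                      # nominal discount rate (assumed)
years = range(1, 21)             # 20-year operating life (assumed)
volume0, growth = 5000.0, 0.03   # initial annual volume and growth rate
annual_cost = 2.0e6              # operating cost per year (assumed)

pv_volume = sum(volume0 * (1 + growth) ** t / (1 + rate) ** t for t in years)
pv_cost = sum(annual_cost / (1 + rate) ** t for t in years)

unit_charge = pv_cost / pv_volume  # charge that makes the NPV exactly zero
print(f"break-even unit disposal charge: {unit_charge:.2f}")
```

A required rate of return above the cost of capital would enter through a higher discount rate on revenues, raising the break-even charge.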

  8. Distributed Evaluation of Local Sensitivity Analysis (DELSA), with application to hydrologic models

    Science.gov (United States)

    Rakovec, O.; Hill, M. C.; Clark, M. P.; Weerts, A. H.; Teuling, A. J.; Uijlenhoet, R.

    2014-01-01

    This paper presents a hybrid local-global sensitivity analysis method termed the Distributed Evaluation of Local Sensitivity Analysis (DELSA), which is used here to identify important and unimportant parameters and evaluate how model parameter importance changes as parameter values change. DELSA uses derivative-based "local" methods to obtain the distribution of parameter sensitivity across the parameter space, which promotes consideration of sensitivity analysis results in the context of simulated dynamics. This work presents DELSA, discusses how it relates to existing methods, and uses two hydrologic test cases to compare its performance with the popular global, variance-based Sobol' method. The first test case is a simple nonlinear reservoir model with two parameters. The second test case involves five alternative "bucket-style" hydrologic models with up to 14 parameters applied to a medium-sized catchment (200 km2) in the Belgian Ardennes. Results show that in both examples, Sobol' and DELSA identify similar important and unimportant parameters, with DELSA enabling more detailed insight at much lower computational cost. For example, in the real-world problem the time delay in runoff is the most important parameter in all models, but DELSA shows that for about 20% of parameter sets it is not important at all and alternative mechanisms and parameters dominate. Moreover, the time delay was identified as important in regions producing poor model fits, whereas other parameters were identified as more important in regions of the parameter space producing better model fits. The ability to understand how parameter importance varies through parameter space is critical to inform decisions about, for example, additional data collection and model development. The ability to perform such analyses with modest computational requirements provides exciting opportunities to evaluate complicated models as well as many alternative models.
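
The core DELSA idea, derivative-based local sensitivity indices evaluated at many points across the parameter space, can be sketched as follows. The toy two-parameter model and the parameter variances are assumptions for illustration, not the authors' test case.

```python
import numpy as np

# DELSA-style sketch (our illustration, not the authors' code): compute
# derivative-based first-order sensitivity indices at many points sampled
# across the parameter space, then inspect their distribution.
rng = np.random.default_rng(0)

def model(k, s):
    # toy nonlinear two-parameter "reservoir" response (assumed form)
    return s * (1.0 - np.exp(-k))

def delsa_indices(k, s, h=1e-6, var_k=0.1, var_s=0.1):
    dk = (model(k + h, s) - model(k - h, s)) / (2 * h)  # central differences
    ds = (model(k, s + h) - model(k, s - h)) / (2 * h)
    contrib = np.array([dk ** 2 * var_k, ds ** 2 * var_s])
    return contrib / contrib.sum()  # first-order indices, summing to 1

samples = rng.uniform(0.1, 3.0, size=(500, 2))  # 500 parameter sets
indices = np.array([delsa_indices(k, s) for k, s in samples])
print("median sensitivity index for k:", round(float(np.median(indices[:, 0])), 3))
```

Plotting the spread of these indices over the 500 parameter sets is what reveals regions of the parameter space where a nominally important parameter stops mattering, at far lower cost than a variance-based Sobol' analysis.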

  9. Short-Term Wind Speed Forecasting for Power System Operations

    KAUST Repository

    Zhu, Xinxin; Genton, Marc G.

    2012-01-01

    This work discusses some statistical short-term wind speed forecasting models, including traditional time series approaches and more advanced space-time statistical models. It also discusses the evaluation of forecast accuracy, in particular the need for realistic loss functions.

  10. Accumulator and random-walk models of psychophysical discrimination: a counter-evaluation.

    Science.gov (United States)

    Vickers, D; Smith, P

    1985-01-01

    In a recent assessment of models of psychophysical discrimination, Heath criticises the accumulator model for its reliance on computer simulation and qualitative evidence, and contrasts it unfavourably with a modified random-walk model, which yields exact predictions, is susceptible to critical test, and is provided with simple parameter-estimation techniques. A counter-evaluation is presented, in which the approximations employed in the modified random-walk analysis are demonstrated to be seriously inaccurate, the resulting parameter estimates to be artefactually determined, and the proposed test not critical. It is pointed out that Heath's specific application of the model is not legitimate, his data treatment inappropriate, and his hypothesis concerning confidence inconsistent with experimental results. Evidence from adaptive performance changes is presented which shows that the necessary assumptions for quantitative analysis in terms of the modified random-walk model are not satisfied, and that the model can be reconciled with data at the qualitative level only by making it virtually indistinguishable from an accumulator process. A procedure for deriving exact predictions for an accumulator process is outlined.

  11. Time-series modeling: applications to long-term finfish monitoring data

    International Nuclear Information System (INIS)

    Bireley, L.E.

    1985-01-01

    The growing concern and awareness that developed during the 1970s over the effects of industry on the environment led the electric utility industry in particular to develop monitoring programs. These programs generate long-term series of data that are not very amenable to classical normal-theory statistical analysis. The monitoring data collected from three finfish programs (impingement, trawl and seine) at the Millstone Nuclear Power Station were typical of such series and thus were used to develop methodology that used the full extent of the information in the series. The basis of the methodology was classic Box-Jenkins time-series modeling; however, the models also included deterministic components that involved flow, season and time as predictor variables. Time entered the models as harmonic regression terms. Of the 32 models fitted to finfish catch data, 19 were found to account for more than 70% of the historical variation. The models were then used to forecast finfish catches a year in advance, and comparisons were made to actual data. Usually the confidence intervals associated with the forecasts encompassed most of the observed data. The technique can provide the basis for intervention analysis in future impact assessments.
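
The harmonic-regression component described above can be illustrated with synthetic monthly data; the Box-Jenkins ARMA modelling of the residuals, which the study also uses, is omitted here.

```python
import numpy as np

# Harmonic-regression sketch on synthetic monthly "catch" data: the
# deterministic seasonal component enters as sine/cosine terms, as in the
# models described above. Data and coefficients are invented.
rng = np.random.default_rng(1)
t = np.arange(120)                                   # 10 years, monthly
catch = 50 + 20 * np.sin(2 * np.pi * t / 12.0) + rng.normal(0, 3, t.size)

X = np.column_stack([np.ones(t.size),
                     np.sin(2 * np.pi * t / 12.0),
                     np.cos(2 * np.pi * t / 12.0)])  # intercept + 1 harmonic
beta, *_ = np.linalg.lstsq(X, catch, rcond=None)
fitted = X @ beta
r2 = 1 - np.sum((catch - fitted) ** 2) / np.sum((catch - catch.mean()) ** 2)
print(f"historical variation explained: {r2:.2f}")
```

In the study's setting, flow and season would enter as further columns of the design matrix, and the residuals would then be fitted with an ARMA model.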

  12. Multivariate Term Structure Models with Level and Heteroskedasticity Effects

    DEFF Research Database (Denmark)

    Christiansen, Charlotte

    2005-01-01

    The paper introduces and estimates a multivariate level-GARCH model for the long rate and the term-structure spread, where the conditional volatility is proportional to the γth power of the variable itself (level effects) and the conditional covariance matrix evolves according to a multivariate GARCH process. … GARCH effects are more important than level effects. The results are robust to the maturity of the interest rates.

  13. Do climate model predictions agree with long-term precipitation trends in the arid southwestern United States?

    Science.gov (United States)

    Elias, E.; Rango, A.; James, D.; Maxwell, C.; Anderson, J.; Abatzoglou, J. T.

    2016-12-01

    Researchers evaluating climate projections across southwestern North America observed a decreasing precipitation trend. Aridification was most pronounced in the cold (non-monsoonal) season, whereas downward trends in precipitation were smaller in the warm (monsoonal) season. In this region, based upon a multimodel mean of 20 Coupled Model Intercomparison Project 5 models using a business-as-usual (Representative Concentration Pathway 8.5) trajectory, midcentury precipitation is projected to increase slightly during the monsoonal time period (July-September; 6%) and decrease slightly during the remainder of the year (October-June; -4%). We use observed long-term (1915-2015) monthly precipitation records from 16 weather stations to investigate how well measured trends corroborate climate model predictions during the monsoonal and non-monsoonal timeframe. Running trend analysis using the Mann-Kendall test for 15 to 101 year moving windows reveals that half the stations showed significant (p≤0.1), albeit small, increasing trends based on the longest term record. Trends based on shorter-term records reveal a period of significant precipitation decline at all stations representing the 1950s drought. Trends from 1930 to 2015 reveal significant annual, monsoonal and non-monsoonal increases in precipitation (Fig 1). The 1960 to 2015 time window shows no significant precipitation trends. The more recent time window (1980 to 2015) shows a slight, but not significant, increase in monsoonal precipitation and a larger, significant decline in non-monsoonal precipitation. GCM precipitation projections are consistent with more recent trends for the region. Running trends from the most recent time window (mid-1990s to 2015) at all stations show increasing monsoonal precipitation and decreasing Oct-Jun precipitation, with significant trends at 6 of 16 stations. Running trend analysis revealed that the long-term trends were not persistent throughout the series length, but depended
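
A minimal running-trend analysis in the spirit of the study can be sketched with a bare-bones Mann-Kendall statistic (ignoring tie corrections) applied over moving windows of synthetic precipitation data.

```python
import numpy as np

# Running-trend sketch: a bare-bones Mann-Kendall test (no tie correction)
# applied over 30-year moving windows of a synthetic 101-year record.
def mann_kendall_z(x):
    x = np.asarray(x, dtype=float)
    n = x.size
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / np.sqrt(var_s)
    if s < 0:
        return (s + 1) / np.sqrt(var_s)
    return 0.0

rng = np.random.default_rng(2)
precip = 300 + 0.8 * np.arange(101) + rng.normal(0, 20, 101)  # rising trend

z = [mann_kendall_z(precip[i:i + 30]) for i in range(101 - 30 + 1)]
print("windows with a significant increase (z > 1.645):",
      sum(zi > 1.645 for zi in z))
```

Exactly as the study observes, windows covering a drought-like dip would flip the sign of z even when the full record trends upward, which is why trend significance depends on the window chosen.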

  14. Developing a conceptual model for selecting and evaluating online markets

    Directory of Open Access Journals (Sweden)

    Sadegh Feizollahi

    2013-04-01

    Full Text Available There is much evidence emphasizing the benefits of using new information and communication technologies in international business, and many believe that e-commerce can help satisfy customers' explicit and implicit requirements. Internet shopping is a concept developed after the introduction of electronic commerce. Information technology (IT) and its applications, specifically the internet and e-mail, promoted the development of e-commerce in terms of advertising, motivation and information. With the development of new technologies, credit and financial exchange websites were constructed to facilitate e-commerce. The proposed study sent a total of 200 questionnaires to the target group (teachers, students, professionals, and managers of commercial web sites) and collected 130 questionnaires for final evaluation. Cronbach's alpha is used to measure reliability and to evaluate the validity of the measurement instruments (questionnaires), and to assure construct validity, confirmatory factor analysis is employed. In addition, to analyze the research questions based on the path analysis method and to determine market selection models, a standard technique is implemented. In the present study, after examining different aspects of e-commerce, we provide a conceptual model for selecting and evaluating online marketing in Iran. These findings provide a consistent, targeted and holistic framework for the development of the Internet market in the country.
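
The reliability step mentioned above, Cronbach's alpha over questionnaire items, is straightforward to compute. A sketch on synthetic responses for 130 respondents, matching the sample size reported; the item count and noise level are invented.

```python
import numpy as np

# Cronbach's alpha sketch on synthetic questionnaire data: 130 respondents
# answering 6 items driven by one shared latent construct.
def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)  # shape: (respondents, items)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(size=(130, 1))                   # shared construct
responses = latent + rng.normal(0, 0.5, (130, 6))    # 6 correlated items

alpha = cronbach_alpha(responses)
print(f"Cronbach's alpha: {alpha:.2f}")
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a scale.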

  15. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Full Text Available Given the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended considering the enterprise's needs when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  16. Bardeen-anomaly and Wess-Zumino term in the supersymmetric standard model

    CERN Document Server

    Ferrara, Sergio; Porrati, Massimo; Stora, Raymond Félix

    1994-01-01

    We construct the Bardeen anomaly and its related Wess-Zumino term in the supersymmetric standard model. In particular we show that it can be written in terms of a composite linear superfield related to supersymmetrized Chern-Simons forms, in very much the same way as the Green-Schwarz term in four-dimensional string theory. Some physical applications, such as the contribution to the g-2 of gauginos when a heavy top is integrated out, are briefly discussed.

  17. Development of oil supply and demand planning model for mid- and long-term

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Hyun [Korea Energy Economics Institute, Euiwang (Korea)

    1997-10-01

    Despite the liberalization of the oil market, a systematic model is required for the reasonable supply and demand of oil, which still has an important influence on industry and the national economy. Required are a demand model deriving prospects for each sector and product, and a supply model examining the optimum rate of operation, production mix of products, stock, exports and imports, and the size of equipment investment needed to meet given demand. As the first phase of the development of a supply and demand model, existing domestic and overseas oil and energy models were reviewed and recommendations for establishing a Korean oil supply and demand model were derived in this study. Based on these, a principle for establishing a model and a rough framework were set up. In advance of mid- and long-term prospects, a short-term prospect model was established and short-term prospects for the first quarter of 1999 and for the year 1999 were presented on a trial basis. Due to the size and character of a supply model, a plan for an ideal model was first explained and then a plan for creating the model step by step was presented as a realistic scheme. (author). 16 refs., 9 figs., 19 tabs.

  18. Comparative approaches from empirical to mechanistic simulation modelling in Land Evaluation studies

    Science.gov (United States)

    Manna, P.; Basile, A.; Bonfante, A.; Terribile, F.

    2009-04-01

    Land Evaluation (LE) comprises the evaluation procedures used to assess the aptitude of land for a generic or specific use (e.g. biomass production). From local to regional and national scales, the approach to land use planning requires a deep knowledge of the processes that drive the functioning of the soil-plant-atmosphere system. According to the classical approaches, the assessment of aptitude is the result of a qualitative comparison between the land/soil physical properties and the land use requirements. These approaches are quick and inexpensive to apply; however, they are based on empirical and qualitative models with a basic knowledge structure built for a specific landscape and for the specific object of the evaluation (e.g. crop). The outcome of this situation is great difficulty in spatially extrapolating the LE results, and the rigidity of the system. Modern techniques instead rely on the application of mechanistic and quantitative simulation modelling that allows a dynamic characterisation of the interrelated physical and chemical processes taking place in the soil landscape. Moreover, the insertion of physically based rules in the LE procedure may make it easier both to extend the results spatially and to change the object (e.g. crop species, nitrate dynamics, etc.) of the evaluation. On the other side, these modern approaches require input data of high quality and quantity, which causes a significant increase in costs. In this scenario, the LE expert is asked to choose the best LE methodology considering costs, complexity of the procedure and benefits in handling a specific land evaluation. In this work we performed a forage maize land suitability study by comparing 9 different methods of increasing complexity and cost. The study area, of about 2000 ha, is located in North Italy in the Lodi plain (Po valley). The 9 methods employed ranged from standard LE approaches to

  19. Analytical Solution for the Anisotropic Rabi Model: Effects of Counter-Rotating Terms

    OpenAIRE

    Zhang, Guofeng; Zhu, Hanjie

    2015-01-01

    The anisotropic Rabi model, which was proposed recently, differs from the original Rabi model: the rotating and counter-rotating terms are governed by two different coupling constants. This feature allows us to vary the counter-rotating interaction independently and explore the effects of it on some quantum properties. In this paper, we eliminate the counter-rotating terms approximately and obtain the analytical energy spectrums and wavefunctions. These analytical results agree well with the ...

  20. The role of nuclear energy for Korean long-term energy supply strategy : application of energy demand-supply model

    International Nuclear Information System (INIS)

    Chae, Kyu Nam

    1995-02-01

    An energy demand and supply analysis is carried out to establish the future nuclear energy system of Korea under environmental restrictions and resource depletion. Based on the useful energy intensity concept, a long-term energy demand forecasting model, FIN2USE, is developed for integration with a supply model. The energy supply optimization model MESSAGE is improved to evaluate the role of the nuclear energy system in Korea's long-term energy supply strategy. Long-term demand for useful energy, used as an exogenous input to the energy supply model, is derived from the trend of useful energy intensity by sector and energy carrier. Supply-side optimization is performed for the overall energy system linked with the reactor and nuclear fuel cycle strategy. The limitation of fossil fuel resources and CO2 emission constraints are reflected as determinants of the future energy system. As a result of optimizing the energy system using linear programming with the objective of total discounted system cost, the optimal energy system is obtained with detailed results on the nuclear sector for various scenarios. It is shown that the relative importance of nuclear energy would increase, especially under a CO2 emission constraint. It is concluded that the nuclear reactor strategy and fuel cycle strategy should be incorporated into the national energy strategy and adapted according to environmental restrictions and energy demand scenarios. This modelling approach is shown to be suitable for a decision support system for nuclear energy policy.
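
The supply-side optimization described above minimizes total discounted system cost subject to demand and emission constraints. A toy two-technology version under a CO2 cap, small enough to solve in closed form; all figures are invented, not values from the thesis.

```python
# Toy version of the supply-side optimization: meet a fixed electricity
# demand from two technologies at minimum cost under a CO2 cap. With two
# variables this linear program solves in closed form.
demand = 100.0                              # TWh of demand to be met
cap = 40.0                                  # Mt CO2 allowed
cost = {"nuclear": 60.0, "coal": 40.0}      # cost per TWh unit (assumed)
co2 = {"nuclear": 0.0, "coal": 0.8}         # Mt CO2 per TWh (assumed)

# Coal is cheaper, so the optimum uses as much coal as the CO2 cap allows.
coal = min(demand, cap / co2["coal"])
nuclear = demand - coal
total_cost = coal * cost["coal"] + nuclear * cost["nuclear"]
print(f"coal {coal} TWh, nuclear {nuclear} TWh, total cost {total_cost}")
```

Tightening the cap shifts generation toward nuclear, which is the mechanism behind the thesis finding that nuclear's relative importance grows under a CO2 emission constraint; models like MESSAGE solve the same structure with many technologies and time periods.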

  1. Nonuniversal gaugino masses from nonsinglet F-terms in nonminimal unified models

    International Nuclear Information System (INIS)

    Martin, Stephen P.

    2009-01-01

    In phenomenological studies of low-energy supersymmetry, running gaugino masses are often taken to be equal near the scale of apparent gauge coupling unification. However, many known mechanisms can avoid this universality, even in models with unified gauge interactions. One example is an F-term vacuum expectation value that is a singlet under the standard model gauge group but transforms nontrivially in the symmetric product of two adjoint representations of a group that contains the standard model gauge group. Here, I compute the ratios of gaugino masses that follow from F-terms in nonsinglet representations of SO(10) and E6 and their subgroups, extending well-known results for SU(5). The SO(10) results correct some long-standing errors in the literature.

  2. Long-term behaviour of concrete in water saturated media - Experimental and modelling approach

    Energy Technology Data Exchange (ETDEWEB)

    Peycelon, H; Mazoin, C

    2004-07-01

    In the context of long-lived nuclear radioactive waste management, cement-based materials are currently used for waste encapsulation and container development. Such materials are also likely to be used for engineered barriers in deep repositories. Various cement types (CEM I, CEM V) are currently studied, mainly to evaluate the materials' long-term durability. Studies have been performed on the leaching behavior of hardened cement pastes based on these cements. The effect of temperature is taken into account. Leaching experiments at 25 deg C, 50 deg C and 85 deg C were carried out with a standard test developed at CEA. Experimental results were analyzed and calculations were made to estimate calcium fluxes and degraded thicknesses. Experimental and modelling results were compared. (authors)

  3. A Team Building Model for Software Engineering Courses Term Projects

    Science.gov (United States)

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  4. Short-Term Bus Passenger Demand Prediction Based on Time Series Model and Interactive Multiple Model Approach

    Directory of Open Access Journals (Sweden)

    Rui Xue

    2015-01-01

    Full Text Available Although bus passenger demand prediction has attracted increased attention in recent years, limited research has been conducted in the context of short-term passenger demand forecasting. This paper proposes an interactive multiple model (IMM) filter algorithm-based model to predict short-term passenger demand. After being aggregated into 15-min intervals, passenger demand data collected from a busy bus route over four months were used to generate time series. Considering that passenger demand exhibits different characteristics at different time scales, three time series were developed: weekly, daily, and 15-min. After correlation, periodicity, and stationarity analyses, time series models were constructed. In particular, the heteroscedasticity of the time series was explored to achieve better prediction performance. Finally, the IMM filter algorithm was applied to combine the individual forecasting models and dynamically predict passenger demand for the next interval. Different error indices were adopted for the analyses of the individual and hybrid models. The performance comparison indicates that the hybrid model's forecasts are superior to the individual ones in accuracy. The findings of this study are of theoretical and practical significance for bus scheduling.
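
The IMM combination step can be illustrated in stripped-down form: candidate forecasting models are weighted by probabilities updated from their recent likelihoods. A full IMM filter also mixes model states through a Markov transition matrix, which this sketch omits; all data and models are invented.

```python
import numpy as np

# Stripped-down IMM-style combination: two candidate models, probabilities
# updated from each model's Gaussian likelihood on observed demand, and a
# probability-weighted combined forecast.
truth = 100 + 10 * np.sin(np.arange(40) / 3.0)   # synthetic demand series

def model_a(t):
    return 100.0                                  # constant-mean model

def model_b(t):
    return 100 + 10 * np.sin(t / 3.0)             # seasonal model

prob = np.array([0.5, 0.5])
sigma = 3.0                                       # assumed forecast-error sd
for t, y in enumerate(truth):
    preds = np.array([model_a(t), model_b(t)])
    lik = np.exp(-0.5 * ((y - preds) / sigma) ** 2)
    prob = prob * lik
    prob = prob / prob.sum()

combined = prob @ np.array([model_a(40), model_b(40)])  # next-interval mix
print("final model probabilities:", prob.round(3))
```

The probabilities converge toward the better-fitting seasonal model, so the combined forecast tracks it; in the paper's full IMM, the transition matrix keeps the weaker models "alive" so the mix can adapt when demand characteristics change.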

  5. Long-term relationships of major macro-variables in a resource-related economic model of Australia

    International Nuclear Information System (INIS)

    Harvie, Charles; Hoa, T. van

    1993-01-01

    The paper reports the results of a simple cointegration analysis applied to bivariate causality models using data on resource output, oil prices, terms of trade, the current account and output growth, to investigate the long-term relationships among these major macroeconomic aggregates in a resource-related economic model of Australia. For the period 1960-1990, the empirical evidence indicates that these five macro-variables, as formulated in our model, are not random walks. In addition, resource production and oil prices are significantly cointegrated, and they are also significantly cointegrated with the current account, terms of trade and economic growth. These findings support the long-term adjustment foundations of our resource-related model. (author)
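
A textbook Engle-Granger-style check, regress one series on the other and test the residuals for mean reversion, illustrates the kind of bivariate cointegration analysis reported above (synthetic data; not the authors' exact procedure).

```python
import numpy as np

# Engle-Granger-style two-step sketch: regress one series on the other,
# then check whether the residuals mean-revert via a Dickey-Fuller-type
# regression. Both series share a common stochastic trend by construction.
rng = np.random.default_rng(5)
common = np.cumsum(rng.normal(size=500))       # shared stochastic trend
x = common + rng.normal(0, 0.5, 500)           # e.g. resource output
y = 2.0 * common + rng.normal(0, 0.5, 500)     # e.g. terms of trade

# Step 1: cointegrating regression y = a + b*x + u
X = np.column_stack([np.ones_like(x), x])
a, b = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - (a + b * x)

# Step 2: Dickey-Fuller-type regression du_t = rho * u_{t-1} + e_t
du, ulag = np.diff(u), u[:-1]
rho = float(ulag @ du / (ulag @ ulag))
print(f"b = {b:.2f}, rho = {rho:.2f} (strongly negative rho suggests cointegration)")
```

In practice the test statistic for rho is compared against Engle-Granger critical values rather than the usual t-distribution, since the regressor in step 1 is itself estimated.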

  6. Modelling and evaluation of surgical performance using hidden Markov models.

    Science.gov (United States)

    Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo

    2006-10-01

    Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While the introduction of virtual reality-based simulators has introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gesture. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
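
Scoring a gesture sequence against an "expert" hidden Markov model amounts to computing its log-likelihood with the forward algorithm. The two-state model, the three discretized motion symbols, and both sequences below are invented for illustration; the paper trains its expert model on real kinematic data.

```python
import numpy as np

# Scoring sketch: log-likelihood of an observation sequence under an
# "expert" HMM via the scaled forward algorithm.
def forward_loglik(obs, pi, A, B):
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    loglik = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        loglik += np.log(c)
        alpha = alpha / c
    return loglik

pi = np.array([0.8, 0.2])
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])                 # slow, smooth state switching
B = np.array([[0.7, 0.2, 0.1],            # state 0 emits mostly symbol 0
              [0.1, 0.3, 0.6]])           # state 1 emits mostly symbol 2

expert_like = [0, 0, 0, 1, 2, 2, 2]       # smooth, expert-like sequence
novice_like = [2, 0, 2, 0, 2, 0, 2]       # jerky, novice-like sequence
ll_expert = forward_loglik(expert_like, pi, A, B)
ll_novice = forward_loglik(novice_like, pi, A, B)
print(f"expert score {ll_expert:.2f}, novice score {ll_novice:.2f}")
```

The smooth sequence scores higher under the expert model, which is the basis of the metric: a trainee's kinematics are evaluated by how likely the expert model finds them.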

  7. Evaluating to Solve Educational Problems: An Alternative Model.

    Science.gov (United States)

    Friedman, Myles I.; Anderson, Lorin W.

    1979-01-01

    A 19-step general evaluation model is described through its four stages: identifying problems, prescribing program solutions, evaluating the operation of the program, and evaluating the effectiveness of the model. The role of the evaluator in decision making is also explored. (RAO)

  8. Modelling the Long-term Periglacial Imprint on Mountain Landscapes

    DEFF Research Database (Denmark)

    Andersen, Jane Lund; Egholm, David Lundbek; Knudsen, Mads Faurschou

    Studies of periglacial processes usually focus on small-scale, isolated phenomena, leaving less explored questions of how such processes shape vast areas of Earth’s surface. Here we use numerical surface process modelling to better understand how periglacial processes drive large-scale, long-term...

  9. Long-term creep modeling of wood using time temperature superposition principle

    OpenAIRE

    Gamalath, Sandhya Samarasinghe

    1991-01-01

    Long-term creep and recovery models (master curves) were developed from short-term data using the time temperature superposition principle (TTSP) for kiln-dried southern pine loaded in compression parallel-to-grain and exposed to constant environmental conditions (~70°F, ~9%EMC). Short-term accelerated creep (17 hour) and recovery (35 hour) data were collected for each specimen at a range of temperature (70°F-150°F) and constant moisture condition of 9%. The compressive stra...

  10. Evaluation of modelling body burden of Cs-137

    Energy Technology Data Exchange (ETDEWEB)

    Bergstroem, U; Nordlinder, S

    1996-05-01

    Within the IAEA/CEC VAMP-program one working group studied the precision in dose assessment models when calculating body burden of {sup 137}Cs as a result of exposure through multiple exposure pathways. One scenario used data from southern Finland regarding contamination of various media due to the fallout from the Chernobyl accident. In this study, a time dependent multiple exposure pathway model was constructed based on compartment theory. Uncertainties in model responses due to uncertainties in input parameter values were studied. The initial predictions for body burden were good, within a factor of 2 of the observed, while the time dynamics of levels in milk and meat did not agree satisfactorily. Some results, nevertheless, showed good agreement with observations due to compensatory effects. After disclosure of additional observational data, major reasons for mispredictions were identified as lack of consideration of time dependence of fixation of {sup 137}Cs in soils, and the selection of parameter values. When correction of this was made, a close agreement between predictions and observations was obtained. This study shows that the dose contribution due to {sup 137}Cs in food products from the seminatural environment is important for long-term exposure to man. The evaluation provided a basis for improvements of crucial parts in the model. 14 refs, 18 figs, 8 tabs.

  11. Capturing the sensitivity of land-use regression models to short-term mobile monitoring campaigns using air pollution micro-sensors.

    Science.gov (United States)

    Minet, L; Gehr, R; Hatzopoulou, M

    2017-11-01

    The development of reliable measures of exposure to traffic-related air pollution is crucial for the evaluation of the health effects of transportation. Land-use regression (LUR) techniques have been widely used for the development of exposure surfaces; however, these surfaces are often highly sensitive to the data collected. With the rise of inexpensive air pollution sensors paired with GPS devices, we witness the emergence of mobile data collection protocols. For the same urban area, can we achieve a 'universal' model irrespective of the number of locations and sampling visits? Can we trade the temporal representation of fixed-point sampling for the larger spatial extent afforded by mobile monitoring? This study highlights the challenges of short-term mobile sampling campaigns in terms of the resulting exposure surfaces. A mobile monitoring campaign was conducted in 2015 in Montreal; nitrogen dioxide (NO2) levels at 1395 road segments were measured under repeated visits. We developed LUR models based on sub-segments, categorized in terms of the number of visits per road segment. We observe that LUR models were highly sensitive to the number of road segments and to the number of visits per road segment. The associated exposure surfaces were also highly dissimilar. Copyright © 2017 Elsevier Ltd. All rights reserved.
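
A land-use regression in its simplest form is an ordinary least-squares fit of pollutant levels on land-use covariates; refitting on a small subsample mimics a short mobile campaign. All data and coefficients below are synthetic, not the Montreal measurements.

```python
import numpy as np

# LUR sketch: ordinary least squares of synthetic NO2 measurements on two
# land-use covariates; a 40-segment subsample mimics a short campaign.
rng = np.random.default_rng(6)
n = 300                                  # road segments
traffic = rng.uniform(0, 1, n)           # e.g. normalized traffic volume
industry = rng.uniform(0, 1, n)          # e.g. industrial land fraction
no2 = 15 + 20 * traffic + 8 * industry + rng.normal(0, 6, n)

def fit(idx):
    X = np.column_stack([np.ones(idx.size), traffic[idx], industry[idx]])
    return np.linalg.lstsq(X, no2[idx], rcond=None)[0]

full = fit(np.arange(n))                          # full campaign
small = fit(rng.choice(n, 40, replace=False))     # short campaign subset
print("full-campaign coefficients: ", full.round(1))
print("small-campaign coefficients:", small.round(1))
```

Comparing the two coefficient vectors shows the instability the study reports: with few segments or few visits per segment, the fitted surface can differ substantially from the full-campaign model.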

  12. Mathematical models of human paralyzed muscle after long-term training.

    Science.gov (United States)

    Law, L A Frey; Shields, R K

    2007-01-01

    Spinal cord injury (SCI) results in major musculoskeletal adaptations, including muscle atrophy, faster contractile properties, increased fatigability, and bone loss. The use of functional electrical stimulation (FES) provides a method to prevent paralyzed muscle adaptations in order to sustain force-generating capacity. Mathematical muscle models may be able to predict optimal activation strategies during FES; however, muscle properties further adapt with long-term training. The purpose of this study was to compare the accuracy of three muscle models, one linear and two nonlinear, for predicting paralyzed soleus muscle force after exposure to long-term FES training. Further, we contrasted the findings between the trained and untrained limbs. The three models' parameters were best fit to a single force train in the trained soleus muscle (N=4). Nine additional force trains (test trains) were predicted for each subject using the developed models. Model errors between predicted and experimental force trains were determined, including specific muscle force properties. The mean overall error was greatest for the linear model (15.8%) and least for the nonlinear Hill-Huxley-type model (7.8%). No significant error differences were observed between the trained versus untrained limbs, although model parameter values were significantly altered with training. This study confirmed that nonlinear models most accurately predict both trained and untrained paralyzed muscle force properties. Moreover, the optimized model parameter values were responsive to the relative physiological state of the paralyzed muscle (trained versus untrained). These findings are relevant for the design and control of neuro-prosthetic devices for those with SCI.

  13. Model description and evaluation of the mark-recapture survival model used to parameterize the 2012 status and threats analysis for the Florida manatee (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, Catherine A.; Kendall, William L.; Beck, Cathy A.; Kochman, Howard I.; Teague, Amy L.; Meigs-Friend, Gaia; Peñaloza, Claudia L.

    2016-11-30

    This report provides supporting details and evidence for the rationale, validity and efficacy of a new mark-recapture model, the Barker Robust Design, to estimate regional manatee survival rates used to parameterize several components of the 2012 version of the Manatee Core Biological Model (CBM) and Threats Analysis (TA).  The CBM and TA provide scientific analyses on population viability of the Florida manatee subspecies (Trichechus manatus latirostris) for U.S. Fish and Wildlife Service’s 5-year reviews of the status of the species as listed under the Endangered Species Act.  The model evaluation is presented in a standardized reporting framework, modified from the TRACE (TRAnsparent and Comprehensive model Evaluation) protocol first introduced for environmental threat analyses.  We identify this new protocol as TRACE-MANATEE SURVIVAL and this model evaluation specifically as TRACE-MANATEE SURVIVAL, Barker RD version 1. The longer-term objectives of the manatee standard reporting format are to (1) communicate to resource managers consistent evaluation information over sequential modeling efforts; (2) build understanding and expertise on the structure and function of the models; (3) document changes in model structures and applications in response to evolving management objectives, new biological and ecological knowledge, and new statistical advances; and (4) provide greater transparency for management and research review.

  14. A BHLS model based moment analysis of muon g-2, and its use for lattice QCD evaluations of a_μ^had

    Energy Technology Data Exchange (ETDEWEB)

    Benayoun, M.; DelBuono, L. [Paris VI et Paris VII Univs. (France). LPNHE; David, P. [Paris VI et Paris VII Univs. (France). LPNHE; Paris-Diderot Univ. (France). LIED; Jegerlehner, F. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2016-05-15

    We present an up-to-date analysis of muon g-2 evaluations in terms of Mellin-Barnes moments as they might be useful for lattice QCD calculations of a_μ. The moments up to 4th order are evaluated directly in terms of e+e− annihilation data and improved within the Hidden Local Symmetry (HLS) model, supplied with appropriate symmetry-breaking mechanisms. The model provides a reliable Effective Lagrangian (BHLS) estimate of the two-body channels plus the πππ channel up to 1.05 GeV, just including the φ resonance. The HLS piece accounts for 80% of the contribution to a_μ. The missing pieces are evaluated in the standard way, directly in terms of the data. We find that the moment expansion converges well with only a few moments. The two types of moments which show up in the Mellin-Barnes representation are calculated in terms of hadronic cross-section data in the timelike region and in terms of the hadronic vacuum polarization (HVP) function in the spacelike region, which is accessible to lattice QCD (LQCD). In the Euclidean region, the first type of moments are the usual Taylor coefficients of the HVP, and we show that the second type of moments may be obtained as integrals over the appropriately Taylor-truncated HVP function. Specific results for the isovector part of a_μ^had are determined by means of HLS model predictions in close relation to τ-decay spectra.

  15. A Model for Evaluating Student Clinical Psychomotor Skills.

    Science.gov (United States)

    Fiel, Nicholas J.; And Others

    1979-01-01

    A long-range plan to evaluate medical students' physical examination skills was undertaken at the Ingham Family Medical Clinic at Michigan State University. The development of the psychomotor skills evaluation model to evaluate the skill of blood pressure measurement, tests of the model's reliability, and the use of the model are described. (JMD)

  16. An exemplar-familiarity model predicts short-term and long-term probe recognition across diverse forms of memory search.

    Science.gov (United States)

    Nosofsky, Robert M; Cox, Gregory E; Cao, Rui; Shiffrin, Richard M

    2014-11-01

    Experiments were conducted to test a modern exemplar-familiarity model on its ability to account for both short-term and long-term probe recognition within the same memory-search paradigm. Also, making connections to the literature on attention and visual search, the model was used to interpret differences in probe-recognition performance across diverse conditions that manipulated relations between targets and foils across trials. Subjects saw lists of 1 to 16 items followed by a single-item recognition probe. In a varied-mapping condition, targets and foils could switch roles across trials; in a consistent-mapping condition, targets and foils never switched roles; and in an all-new condition, on each trial a completely new set of items formed the memory set. In the varied-mapping and all-new conditions, mean correct response times (RTs) and error proportions were curvilinear increasing functions of memory set size, with the RT results closely resembling ones from hybrid visual-memory search experiments reported by Wolfe (2012). In the consistent-mapping condition, new-probe RTs were invariant with set size, whereas old-probe RTs increased slightly with increasing study-test lag. With appropriate choice of psychologically interpretable free parameters, the model accounted well for the complete set of results. The work provides support for the hypothesis that a common set of processes involving exemplar-based familiarity may govern long-term and short-term probe recognition across wide varieties of memory-search conditions. PsycINFO Database Record (c) 2014 APA, all rights reserved.
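As an illustration of the summed-similarity computation underlying exemplar-familiarity accounts, the sketch below computes a GCM-style familiarity signal for a one-dimensional probe. The distance metric, the sensitivity parameter `c`, and the fixed `criterion` constant are assumptions for illustration, not the psychologically interpretable parameters fitted in this study:

```python
import numpy as np

def familiarity(probe, memory_set, c=1.0):
    """Summed exponential similarity of a probe to the stored exemplars
    (one-dimensional psychological space, for illustration only)."""
    d = np.abs(np.asarray(memory_set, dtype=float) - probe)
    return float(np.sum(np.exp(-c * d)))

def p_old(probe, memory_set, criterion=1.0, c=1.0):
    """Probability of an 'old' response: familiarity evaluated relative
    to a fixed background-noise criterion."""
    f = familiarity(probe, memory_set, c)
    return f / (f + criterion)
```

Because familiarity is summed over all stored exemplars, it grows with memory set size, which is one route by which such models produce set-size effects on recognition accuracy and latency.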

  17. Short-Term Degradation of Bi-Component Electrospun Fibers: Qualitative and Quantitative Evaluations via AFM Analysis

    Directory of Open Access Journals (Sweden)

    Marica Marrese

    2018-03-01

    Electrospun polymeric fibers are currently used as 3D models for in vitro applications in biomedical areas, i.e., tissue engineering and cell and drug delivery. The high customization of the electrospinning process offers numerous opportunities to manipulate and control surface area, fiber diameter, and fiber density in order to evaluate the response of cells under different morphological and/or biochemical stimuli. The aim of this study was to investigate, via atomic force microscopy (AFM), the chemical and morphological changes in bi-component electrospun fibers (BEFs) during the in vitro degradation process in a biological medium. BEFs were fabricated by electrospinning a mixture of a synthetic polymer, polycaprolactone (PCL), and a natural polymer, gelatin, into a binary solution. During the hydrolytic degradation of the protein, no remarkable effects were recognized in terms of fiber integrity. However, an increase in surface roughness as well as a decrease in fiber diameter as a function of the degradation conditions was detected. We suggest that morphological and chemical changes due to the local release of gelatin positively influence cell behavior in culture, in terms of cell adhesion and spreading, thus working to mimic the native microenvironment of natural tissues.

  18. SAPHIRE models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The Idaho National Engineering Laboratory (INEL) has over the past three years created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.

  19. Toyotarity. Term, model, range

    Directory of Open Access Journals (Sweden)

    Stanisław Borkowski

    2013-04-01

    The terms Toyotarity and BOST are presented in this chapter. The BOST method allows relations to be defined between material resources and human resources, and between human resources and human resources (Toyotarity). The term was invented by the Author and is legally protected. The methodology is an outcome of 12 years of work.

  20. Comfrey (Symphytum Officinale. l.) and Experimental Hepatic Carcinogenesis: A Short-term Carcinogenesis Model Study.

    Science.gov (United States)

    Gomes, Maria Fernanda Pereira Lavieri; de Oliveira Massoco, Cristina; Xavier, José Guilherme; Bonamin, Leoni Villano

    2010-06-01

    Comfrey or Symphytum officinale (L.) (Boraginaceae) is a very popular plant used for therapeutic purposes. Since the 1980s, its effects have been studied in long-term carcinogenesis studies, in which Comfrey extract is administered at high doses over several months and the neoplastic hepatic lesions are evaluated. However, the literature is very sparse on studies performed under short-term carcinogenesis protocols, such as the 'resistant hepatocyte model' (RHM). In such studies, it is possible to easily observe the phenomena related to the early phases of tumor development, since pre-neoplastic lesions (PNLs) arise within about 1-2 months of chemical induction. Herein, the effects of chronic oral treatment of rats with 10% Comfrey ethanolic extract were evaluated in a RHM. Wistar rats were sequentially treated with N-nitrosodiethylamine (ip) and 2-acetylaminofluorene (po), and submitted to hepatectomy to induce carcinogenesis promotion. Macroscopic/microscopic quantitative analysis of PNLs was performed. Non-parametric statistical tests (Mann-Whitney and χ²) were used, and the level of significance was set at P ≤ 0.05. Comfrey treatment reduced the number of pre-neoplastic macroscopic lesions up to 1 mm (P ≤ 0.05), the percentage of oval cells (P = 0.0001) and mitotic figures (P = 0.007), as well as the number of Proliferating Cell Nuclear Antigen (PCNA)-positive cells (P = 0.0001) and acidophilic pre-neoplastic nodules (P = 0.05). On the other hand, the percentage of cells presenting megalocytosis (P = 0.0001) and vacuolar degeneration (P = 0.0001) was increased. Scores of fibrosis, glycogen stores and the number of nucleolus organizing regions were not altered. The study indicated that oral treatment of rats with 10% Comfrey alcoholic extract reduced cell proliferation in this model.

  1. Performance evaluation of ionospheric time delay forecasting models using GPS observations at a low-latitude station

    Science.gov (United States)

    Sivavaraprasad, G.; Venkata Ratnam, D.

    2017-07-01

    Ionospheric delay is one of the major atmospheric effects on the performance of satellite-based radio navigation systems. It limits the accuracy and availability of Global Positioning System (GPS) measurements related to critical societal and safety applications. The temporal and spatial gradients of ionospheric total electron content (TEC) are driven by several unknown a priori geophysical conditions and solar-terrestrial phenomena. The prediction of ionospheric delay is therefore challenging, especially over the Indian subcontinent, and an appropriate short/long-term ionospheric delay forecasting model is necessary. Hence, the intent of this paper is to forecast ionospheric delays by considering day-to-day, monthly, and seasonal ionospheric TEC variations. GPS-TEC data (January 2013-December 2013) were extracted from a multi-frequency GPS receiver established at K L University, Vaddeswaram, Guntur station (geographic: 16.37°N, 80.37°E; geomagnetic: 7.44°N, 153.75°E), India. An evaluation, in terms of forecasting capabilities, of three ionospheric time delay models is presented: an Auto Regressive Moving Average (ARMA) model, an Auto Regressive Integrated Moving Average (ARIMA) model, and a Holt-Winters model. The performances of these models are evaluated through error measurement analysis during both geomagnetically quiet and disturbed days. It is found that the ARMA model effectively forecasts the ionospheric delay with an accuracy of 82-94%, which is 10% superior to the ARIMA and Holt-Winters models. Moreover, the modeled VTEC derived from the International Reference Ionosphere (IRI-2012) model and the new global TEC model, the Neustrelitz TEC Model (NTCM-GL), were compared with the forecasted VTEC values of the ARMA, ARIMA and Holt-Winters models during geomagnetically quiet days. The forecast results indicate that the ARMA model would be useful for setting up an early warning system for ionospheric disturbances in low-latitude regions.
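To illustrate the autoregressive core of such forecasts, the sketch below fits an AR(p) model to a TEC-like series by ordinary least squares and iterates it forward for multi-step prediction. This is a deliberate simplification (no moving-average term, no seasonal or geomagnetic inputs), not the ARMA/ARIMA implementation of the study:

```python
import numpy as np

def fit_ar(series, p):
    """Fit x[t] = c + a1*x[t-1] + ... + ap*x[t-p] by ordinary least squares."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    y = series[p:]
    # column k holds the lag-(k+1) values aligned with y
    lags = np.column_stack([series[p - k:n - k] for k in range(1, p + 1)])
    A = np.column_stack([np.ones(n - p), lags])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef  # [c, a1, ..., ap]

def forecast(series, coef, steps):
    """Iterate the fitted AR model forward, feeding predictions back in."""
    p = len(coef) - 1
    hist = list(series[-p:])
    preds = []
    for _ in range(steps):
        x = coef[0] + sum(coef[k] * hist[-k] for k in range(1, p + 1))
        preds.append(x)
        hist.append(x)
    return preds
```

A production forecaster would add the moving-average component and difference the series (ARIMA) for non-stationary TEC behavior; the least-squares fit above only captures the AR part.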

  2. A shell-model calculation in terms of correlated subsystems

    International Nuclear Information System (INIS)

    Boisson, J.P.; Silvestre-Brac, B.

    1979-01-01

    A method for solving the shell-model equations in terms of a basis which includes correlated subsystems is presented. It is shown that the method allows drastic truncations of the basis to be made. The corresponding calculations are easy to perform and can be carried out rapidly

  3. Mapping the course of the EU "Power Target Model"... on its own terms

    OpenAIRE

    GLACHANT, Jean-Michel

    2016-01-01

    The European Union took more than 20 years to start defining a common market design for its internal electricity market, a European Power Target Model, and a further 10 years to fully implement it. Meanwhile, the reference generation set of that model has shifted from CCGTs burning gas to RES units transforming intermittent natural resources. Could the existing EU target model continue to work well for short-term operation and long-term investment? If not, can the existing EU institution...

  4. Embedding Term Similarity and Inverse Document Frequency into a Logical Model of Information Retrieval.

    Science.gov (United States)

    Losada, David E.; Barreiro, Alvaro

    2003-01-01

    Proposes an approach to incorporate term similarity and inverse document frequency into a logical model of information retrieval. Highlights include document representation and matching; incorporating term similarity into the measure of distance; new algorithms for implementation; inverse document frequency; and logical versus classical models of…

  5. Loss of confinement of liquefied gases. Evaluation of the source term

    Energy Technology Data Exchange (ETDEWEB)

    Alix, P.; Novat, E.; Hocquet, J.; Bigot, J.P. [Ecole Nationale Superieure des Mines, Centre SPIN, 42 - Saint-Etienne (France)

    2001-07-01

    In this work, the corresponding-states law obtained from flow-rate measurements of two-phase flows performed with five different fluids (water, butane, R11, ethyl acetate, methanol) is applied. This shows that the critical mass flux (which is used as the source term in the scenario of loss of confinement of liquefied-gas reservoirs) is a 'universal' function of the reduced initial pressure P0*, which can be used for most single-constituent fluids of the process industries. Thus it is easy to make a relatively precise estimation of the critical mass flux (uncertainty < 20% for P0* < 15%) without the need for any model. It is also shown that no improvement of the models can be expected from the use of the vaporization kinetics. On the contrary, a qualitative consideration indicates that the use of the slip seems more promising. (J.S.)

  6. Actinide Source Term Program, position paper. Revision 1

    International Nuclear Information System (INIS)

    Novak, C.F.; Papenguth, H.W.; Crafts, C.C.; Dhooge, N.J.

    1994-01-01

    The Actinide Source Term represents the quantity of actinides that could be mobilized within WIPP brines and could migrate with the brines away from the disposal room vicinity. This document presents the various proposed methods for estimating this source term, with a particular focus on defining these methods and evaluating the defensibility of the models for mobile actinide concentrations. The conclusions reached in this document are: the 92 PA "expert panel" model for mobile actinide concentrations is not defensible; and, although it is extremely conservative, the "inventory limits" model is the only existing defensible model for the actinide source term. The model effort in progress, "chemical modeling of mobile actinide concentrations", supported by a laboratory effort that is also in progress, is designed to provide a reasonable description of the system, be scientifically realistic, and supplant the "inventory limits" model.

  7. Short-term droughts forecast using Markov chain model in Victoria, Australia

    Science.gov (United States)

    Rahmat, Siti Nazahiyah; Jayasuriya, Niranjali; Bhuiyan, Muhammed A.

    2017-07-01

    A comprehensive risk management strategy for dealing with drought should include both short-term and long-term planning. The objective of this paper is to present an early warning method to forecast drought using the Standardised Precipitation Index (SPI) and a non-homogeneous Markov chain model. A model such as this is useful for short-term planning. The developed method has been used to forecast droughts at a number of meteorological monitoring stations that have been regionalised into six homogeneous clusters with similar drought characteristics based on the SPI. The non-homogeneous Markov chain model was used to estimate drought probabilities and drought predictions up to 3 months ahead. The drought severity classes defined using the SPI were computed at a 12-month time scale. The drought probabilities and the predictions were computed for the six clusters that depict similar drought characteristics in Victoria, Australia. Overall, the predicted drought severity class was quite similar for all the clusters, with the non-drought class probabilities ranging from 49 to 57%. For all clusters, the near-normal class had a probability of occurrence varying from 27 to 38%. For the moderate and severe classes, the probabilities ranged from 2 to 13% and 1 to 3%, respectively. The developed model predicted drought situations 1 month ahead reasonably well. However, 2- and 3-month-ahead predictions should be used with caution until the models are developed further.
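The transition-probability machinery behind such forecasts can be sketched as follows. For simplicity this uses a homogeneous (time-invariant) transition matrix estimated from a sequence of monthly drought classes, whereas the study fits a non-homogeneous chain; the class coding (0 = non-drought, 1 = near normal, 2 = moderate, 3 = severe) is an assumption for illustration:

```python
import numpy as np

def transition_matrix(states, n_classes):
    """Estimate a first-order transition matrix from a sequence of
    drought-class labels in {0, ..., n_classes-1}."""
    counts = np.zeros((n_classes, n_classes))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1.0
    totals = counts.sum(axis=1, keepdims=True)
    totals[totals == 0.0] = 1.0  # leave rows for unseen classes as zeros
    return counts / totals

def forecast_probs(P, current_state, months_ahead):
    """Class-probability vector `months_ahead` steps from the current state."""
    v = np.zeros(P.shape[0])
    v[current_state] = 1.0
    for _ in range(months_ahead):
        v = v @ P  # one Markov step per month
    return v
```

Repeated multiplication by `P` gives the 1-, 2- and 3-month-ahead class probabilities; a non-homogeneous chain would instead use a month-specific matrix at each step.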

  8. Construction of long-term isochronous stress-strain curves by a modeling of short-term creep curves for a Grade 9Cr-1Mo steel

    International Nuclear Information System (INIS)

    Kim, Woo-Gon; Yin, Song-Nan; Koo, Gyeong-Hoi

    2009-01-01

    This study dealt with the construction of long-term isochronous stress-strain curves (ISSCs) by modeling short-term creep curves for a Grade 9Cr-1Mo steel (G91), which is a candidate material for structural applications in next-generation nuclear reactors as well as in fusion reactors. To do this, the tensile material data used in the inelastic constitutive equations were obtained by tensile tests at 550 °C. Creep curves were obtained by a series of creep tests with different stress levels of 300 MPa to 220 MPa at an identical controlled temperature of 550 °C. On the basis of these experimental data, the creep curves were characterized by Garofalo's creep model. The three parameters P1, P2 and P3 in Garofalo's model were optimized by a nonlinear least squares fitting (NLSF) analysis. The stress dependency of the three parameters was found to be linear, but the P3 parameter, representing the steady-state creep rate, exhibited two-slope behavior with different stress exponents at a transient stress of about 250 MPa. The long-term creep curves of the G91 steel were modeled by Garofalo's model with only a few short-term creep data. Using the modeled creep curves, the long-term isochronous curves up to 10^5 hours were successfully constructed. (author)
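Garofalo's creep equation has the standard form ε(t) = P1·(1 − exp(−P2·t)) + P3·t, where P1 scales the primary (transient) creep, P2 sets its rate, and P3 is the steady-state creep rate; the instantaneous loading strain is omitted here. A minimal NLSF-style fit, assuming this form and exploiting the fact that the model is linear in (P1, P3) once P2 is fixed, might look like this (the actual fitting procedure of the study may differ):

```python
import numpy as np

def fit_garofalo(t, strain, p2_grid):
    """Fit strain(t) = P1*(1 - exp(-P2*t)) + P3*t.

    For each candidate P2, (P1, P3) are solved by linear least squares;
    the grid point with the smallest residual is kept."""
    t = np.asarray(t, dtype=float)
    strain = np.asarray(strain, dtype=float)
    best = None
    for p2 in p2_grid:
        A = np.column_stack([1.0 - np.exp(-p2 * t), t])
        (p1, p3), *_ = np.linalg.lstsq(A, strain, rcond=None)
        resid = float(np.sum((A @ np.array([p1, p3]) - strain) ** 2))
        if best is None or resid < best[0]:
            best = (resid, p1, p2, p3)
    _, p1, p2, p3 = best
    return p1, p2, p3
```

With the three parameters in hand, an isochronous curve is read off by evaluating the fitted ε(t) at a fixed time for each stress level.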

  9. Geometrical aspects of operator ordering terms in gauge invariant quantum models

    International Nuclear Information System (INIS)

    Houston, P.J.

    1990-01-01

    Finite-dimensional quantum models with both boson and fermion degrees of freedom, and which have a gauge invariance, are studied here as simple versions of gauge invariant quantum field theories. The configuration space of these finite-dimensional models has the structure of a principal fibre bundle and has defined on it a metric which is invariant under the action of the bundle or gauge group. When the gauge-dependent degrees of freedom are removed, thereby defining the quantum models on the base of the principal fibre bundle, extra operator ordering terms arise. By making use of dimensional reduction methods in removing the gauge dependence, expressions are obtained here for the operator ordering terms which show clearly their dependence on the geometry of the principal fibre bundle structure. (author)

  10. EDM - A model for optimising the short-term power operation of a complex hydroelectric network

    International Nuclear Information System (INIS)

    Tremblay, M.; Guillaud, C.

    1996-01-01

    In order to optimize the short-term power operation of a complex hydroelectric network, a new model called EDM was added to PROSPER, a water management analysis system developed by SNC-Lavalin. PROSPER is now divided into three parts: an optimization model (DDDP), a simulation model (ESOLIN), and an economic dispatch model (EDM) for the short-term operation. The operation of the KSEB hydroelectric system (located in southern India) with PROSPER was described. The long-term analysis with monthly time steps is assisted by the DDDP, and the daily analysis with hourly or half-hourly time steps is performed with the EDM model. 3 figs

  11. An Overview of Short-term Statistical Forecasting Methods

    DEFF Research Database (Denmark)

    Elias, Russell J.; Montgomery, Douglas C.; Kulahci, Murat

    2006-01-01

    An overview of statistical forecasting methodology is given, focusing on techniques appropriate to short- and medium-term forecasts. Topics include basic definitions and terminology, smoothing methods, ARIMA models, regression methods, dynamic regression models, and transfer functions. Techniques for evaluating and monitoring forecast performance are also summarized.

  12. Ultra-Short-Term Wind Power Prediction Using a Hybrid Model

    Science.gov (United States)

    Mohammed, E.; Wang, S.; Yu, J.

    2017-05-01

    This paper aims to develop and apply a hybrid model of two data-analytical methods, multiple linear regression and least squares (MLR&LS), for ultra-short-term wind power prediction (WPP), taking Northeast China electricity demand as an example. The data were obtained from the historical records of wind power from an offshore region and from a wind farm of the wind power plant in the area. The WPP is achieved in two stages: first, the ratios of wind power are forecasted using the proposed hybrid method, and then these ratios are transformed to obtain the forecasted values. The hybrid model combines the persistence method, MLR and LS. The proposed method includes two prediction types, multi-point prediction and single-point prediction. The WPP is tested against different models such as the autoregressive moving average (ARMA), autoregressive integrated moving average (ARIMA) and artificial neural network (ANN) models. By comparing the results of the above models, the validity of the proposed hybrid model is confirmed in terms of error and correlation coefficient. The comparison of results confirmed that the proposed method works effectively. Additionally, forecasting errors were computed and compared to improve understanding of how to depict highly variable WPP and the correlations between actual and predicted wind power.
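The multiple-linear-regression half of such a hybrid can be sketched with ordinary least squares. The feature set used here (lagged wind-power ratios) and the way it would be combined with the persistence method are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def fit_mlr(X, y):
    """Least-squares fit of y ≈ c + X @ b (multiple linear regression)."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef  # [c, b1, ..., bk]

def predict_mlr(coef, X):
    return coef[0] + np.asarray(X) @ coef[1:]

def lagged_features(ratios, n_lags):
    """Design matrix of the previous n_lags wind-power ratios for each target."""
    r = np.asarray(ratios, dtype=float)
    X = np.column_stack([r[n_lags - k - 1:len(r) - k - 1] for k in range(n_lags)])
    y = r[n_lags:]
    return X, y
```

Predicting the ratio rather than the raw power, then rescaling, mirrors the two-stage scheme described in the abstract; multi-point prediction would simply feed predicted ratios back in as new lags.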

  13. Developing a clinical utility framework to evaluate prediction models in radiogenomics

    Science.gov (United States)

    Wu, Yirong; Liu, Jie; Munoz del Rio, Alejandro; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-03-01

    Combining imaging and genetic information to predict disease presence and behavior is being codified into an emerging discipline called "radiogenomics." Optimal evaluation methodologies for radiogenomics techniques have not been established. We aim to develop a clinical decision framework based on utility analysis to assess prediction models for breast cancer. Our data come from a retrospective case-control study, collecting Gail model risk factors, genetic variants (single nucleotide polymorphisms, SNPs), and mammographic features in the Breast Imaging Reporting and Data System (BI-RADS) lexicon. We first constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail+SNP, and (3) Gail+SNP+BI-RADS. Then, we generated ROC curves for the three models. After assigning utility values to each category of findings (true negative, false positive, false negative and true positive), we pursued optimal operating points on the ROC curves to achieve the maximum expected utility (MEU) of breast cancer diagnosis. We used McNemar's test to compare the predictive performance of the three models. We found that SNPs and BI-RADS features augmented the baseline Gail model in terms of the area under the ROC curve (AUC) and MEU. SNPs improved the sensitivity of the Gail model (0.276 vs. 0.147) and reduced its specificity (0.855 vs. 0.912). When additional mammographic features were added, sensitivity increased to 0.457 and specificity to 0.872. SNPs and mammographic features played a significant role in breast cancer risk estimation (p-value < 0.001). Our decision framework comprising utility analysis and McNemar's test provides a novel way to evaluate prediction models in the realm of radiogenomics.
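The expected-utility scan over ROC operating points can be illustrated as follows. The utility values and prevalence passed in are placeholders, and this sketch scans empirical score thresholds of a generic classifier rather than the fitted logistic regression models of the study:

```python
import numpy as np

def max_expected_utility(scores, labels, utilities, prevalence):
    """Scan empirical score thresholds and return (best_EU, best_threshold).

    `utilities` maps the outcome names TP/FP/TN/FN to utility values;
    `prevalence` is the assumed probability of disease."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    best_eu, best_thr = -np.inf, None
    for thr in np.unique(scores):
        pred = scores >= thr
        tpr = float(np.mean(pred[labels == 1]))  # sensitivity at this operating point
        fpr = float(np.mean(pred[labels == 0]))  # 1 - specificity
        eu = (prevalence * (tpr * utilities["TP"] + (1 - tpr) * utilities["FN"])
              + (1 - prevalence) * (fpr * utilities["FP"] + (1 - fpr) * utilities["TN"]))
        if eu > best_eu:
            best_eu, best_thr = eu, thr
    return best_eu, best_thr
```

Each threshold corresponds to one point on the ROC curve, so maximizing expected utility over thresholds is the same as picking the optimal operating point on the curve for the assumed utilities and prevalence.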

  14. Long-term evaluation of orbital dynamics in the Sun-planet system considering axial-tilt

    Science.gov (United States)

    Bakhtiari, Majid; Daneshjou, Kamran

    2018-05-01

    In this paper, the effect of planets' axial tilt (obliquity) on the motion of planetary orbiters in prolonged space missions is investigated in the presence of the Sun's gravity. The proposed model is based on non-simplified perturbed dynamic equations of planetary orbiter motion. From a new point of view, the dynamic equations for a disturbing body in an elliptic, inclined, three-dimensional orbit are derived. The accuracy of this non-simplified method is validated against the dual-averaged method applied to a generalized Earth-Moon system. It is shown that the short-time oscillations neglected in the dual-averaged technique can accumulate and grow into remarkable errors over a prolonged evolution. After validation, the effects of the planet's axial tilt on the eccentricity, inclination, and right ascension of the ascending node of the orbiter are investigated. Moreover, a generalized model is provided to study the effects of third-body inclination and eccentricity on orbit characteristics. It is shown that the planet's axial tilt can drive significant changes in orbital elements in long-term missions, and that short-time oscillations must be considered for accurate prolonged evaluation.

  15. Application of air pollution dispersion modeling for source-contribution assessment and model performance evaluation at integrated industrial estate-Pantnagar

    Energy Technology Data Exchange (ETDEWEB)

    Banerjee, T., E-mail: tirthankaronline@gmail.com [Department of Environmental Science, G.B. Pant University of Agriculture and Technology, Pantnagar, U.S. Nagar, Uttarakhand 263 145 (India); Barman, S.C., E-mail: scbarman@yahoo.com [Department of Environmental Monitoring, Indian Institute of Toxicology Research, Post Box No. 80, Mahatma Gandhi Marg, Lucknow-226 001, Uttar Pradesh (India); Srivastava, R.K., E-mail: rajeevsrivastava08@gmail.com [Department of Environmental Science, G.B. Pant University of Agriculture and Technology, Pantnagar, U.S. Nagar, Uttarakhand 263 145 (India)

    2011-04-15

    Source-contribution assessment of the ambient NO2 concentration was performed at Pantnagar, India through simulation of two urban mathematical dispersion models, namely the Gaussian Finite Line Source Model (GFLSM) and the Industrial Source Complex Model (ISCST-3), and model performance was evaluated. The principal approaches were the development of a comprehensive emission inventory, monitoring of traffic density and regional air quality, and finally simulation of the urban dispersion models. Initially, 18 industries were found responsible for the emission of 39.11 kg/h of NO2 through 43 elevated stacks. Further, the vehicular emission potential in terms of NO2 was computed as 7.1 kg/h. Air quality monitoring delineated an annual average NO2 concentration of 32.6 µg/m³. Finally, GFLSM and ISCST-3 were simulated in conjunction with the developed emission inventories and the existing meteorological conditions. Model simulation indicated that the contributions of NO2 from industrial and vehicular sources were in the ranges of 45-70% and 9-39%, respectively. Further, statistical analysis revealed satisfactory model performance with an aggregate accuracy of 61.9%. Research highlights: Application of dispersion modeling for source-contribution assessment of ambient NO2. Inventorization revealed emissions from industry and vehicles of 39.11 and 7.1 kg/h, respectively. GFLSM revealed that vehicular pollution contributes 9.0-38.6%. A source contribution of 45-70% was found for industrial emission through ISCST-3. The aggregate performance of both models shows good agreement, with an accuracy of 61.9%. Development of industrial and vehicular inventories in terms of ambient NO2 for model simulation at Pantnagar, India; model validation revealed a satisfactory outcome.

  16. Modeling for Green Supply Chain Evaluation

    Directory of Open Access Journals (Sweden)

    Elham Falatoonitoosi

    2013-01-01

    Green supply chain management (GSCM) has become a practical approach to improving environmental performance. Under strict regulations and stakeholder pressures, enterprises need to enhance and improve GSCM practices, which are influenced by both traditional and green factors. This study developed a causal evaluation model to guide the selection of qualified suppliers by prioritizing various criteria and mapping causal relationships to find effective criteria for improving the green supply chain. The aim of the case study was to model and examine the influential and important main GSCM practices, namely green logistics, organizational performance, green organizational activities, environmental protection, and green supplier evaluation. In the case study, the decision-making trial and evaluation laboratory (DEMATEL) technique is applied to test the developed model. The results of the case study show that only the "green supplier evaluation" and "green organizational activities" criteria of the model are in the cause group, while the other criteria are in the effect group.

  17. Evaluating Energy Efficiency Policies with Energy-Economy Models

    Energy Technology Data Exchange (ETDEWEB)

    Mundaca, Luis; Neij, Lena; Worrell, Ernst; McNeil, Michael A.

    2010-08-01

    The growing complexities of energy systems, environmental problems and technology markets are driving and testing most energy-economy models to their limits. To further advance bottom-up models from a multidisciplinary energy efficiency policy evaluation perspective, we review and critically analyse bottom-up energy-economy models and corresponding evaluation studies on energy efficiency policies to induce technological change, using the household sector as a case study. Our analysis focuses on decision frameworks for technology choice, the type of evaluation being carried out, the treatment of market and behavioural failures, the policy instruments evaluated, and the key determinants used to mimic policy instruments. Although the review confirms criticism of energy-economy models (e.g. unrealistic representation of decision-making by consumers when choosing technologies), these models nevertheless provide valuable guidance for policy evaluation related to energy efficiency. Different areas for further advancing the models remain open, particularly regarding modelling issues, techno-economic and environmental aspects, behavioural determinants, and policy considerations.

  18. Development of Short-term Molecular Thresholds to Predict Long-term Mouse Liver Tumor Outcomes: Phthalate Case Study

    Science.gov (United States)

    Short-term molecular profiles are a central component of strategies to model health effects of environmental chemicals. In this study, a 7 day mouse assay was used to evaluate transcriptomic and proliferative responses in the liver for a hepatocarcinogenic phthalate, di (2-ethylh...

  19. A hands-on approach for fitting long-term survival models under the GAMLSS framework.

    Science.gov (United States)

    de Castro, Mário; Cancho, Vicente G; Rodrigues, Josemar

    2010-02-01

    In many data sets from clinical studies there are patients insusceptible to the occurrence of the event of interest. Survival models which ignore this fact are generally inadequate. The main goal of this paper is to describe an application of the generalized additive models for location, scale, and shape (GAMLSS) framework to the fitting of long-term survival models. In this work the number of competing causes of the event of interest follows the negative binomial distribution. In this way, some well-known models found in the literature are characterized as particular cases of our proposal. The model is conveniently parameterized in terms of the cured fraction, which is then linked to covariates. We explore the use of the gamlss package in R as a powerful tool for inference in long-term survival models. The procedure is illustrated with a numerical example. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
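
One common parameterization of the negative-binomial cure-rate model gives the population survival function S_pop(t) = (1 + αθF(t))^(-1/α), whose limit as t grows is the cured fraction. The sketch below assumes that parameterization and an exponential single-cause event time; the paper's exact link functions and covariate structure are not reproduced here.

```python
import math

def pop_survival(t, theta, alpha, F):
    """Population survival when the number of competing causes is negative
    binomial with mean theta and dispersion alpha, and each cause's event
    time has CDF F (one common cure-rate parameterization)."""
    return (1.0 + alpha * theta * F(t)) ** (-1.0 / alpha)

def cure_fraction(theta, alpha):
    """Long-term survivors: the limit of pop_survival as F(t) -> 1."""
    return (1.0 + alpha * theta) ** (-1.0 / alpha)

# Illustrative single-cause exponential time-to-event with rate 0.5.
F_exp = lambda t: 1.0 - math.exp(-0.5 * t)
```

At t = 0 the whole population survives; as t grows, S_pop(t) decays to the cured fraction rather than to zero, which is what distinguishes long-term survival models from ordinary ones.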

  20. The geometric background-field method, renormalization and the Wess-Zumino term in non-linear sigma-models

    International Nuclear Information System (INIS)

    Mukhi, S.

    1986-01-01

    A simple recursive algorithm is presented which generates the reparametrization-invariant background-field expansion for non-linear sigma-models on manifolds with an arbitrary Riemannian metric. The method is also applicable to Wess-Zumino terms and to counterterms. As an example, the general-metric model is expanded to sixth order and compared with previous results. For locally symmetric spaces, we obtain a general formula for the nth-order term. The method is shown to facilitate the study of models with Wess-Zumino terms. It is demonstrated that, for chiral models, the Wess-Zumino term is unrenormalized to all orders in perturbation theory even when the model is not conformally invariant. (orig.)

  1. Short-Term Load Forecasting Model Based on Quantum Elman Neural Networks

    Directory of Open Access Journals (Sweden)

    Zhisheng Zhang

    2016-01-01

    Full Text Available A short-term load forecasting model based on quantum Elman neural networks is constructed in this paper. Quantum computation and the Elman feedback mechanism are integrated into the quantum Elman neural network. Quantum computation can effectively improve the approximation capability and information-processing ability of the network. Quantum Elman neural networks have not only feedforward connections but also feedback connections. The feedback connection between the hidden nodes and the context nodes is a state feedback within the internal system, which gives the network a specific dynamic memory capability. Phase-space reconstruction theory is the theoretical basis for constructing the forecasting model, and the training samples are formed by means of a K-nearest-neighbor approach. In example simulations, the testing results show that the model based on quantum Elman neural networks outperforms the models based on the quantum feedforward neural network, the conventional Elman neural network, and the conventional feedforward neural network, and so it can effectively improve prediction accuracy. This research lays a theoretical foundation for the practical engineering application of short-term load forecasting models based on quantum Elman neural networks.
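
The phase-space reconstruction and K-nearest-neighbor formation of training samples mentioned above can be sketched generically (a minimal time-delay embedding, not the paper's quantum network; function names are illustrative):

```python
import numpy as np

def embed(series, dim, tau):
    """Time-delay embedding: reconstruct state vectors
    [x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau}] from a scalar series."""
    n = len(series) - (dim - 1) * tau
    return np.array([series[i: i + (dim - 1) * tau + 1: tau]
                     for i in range(n)])

def knn_training_set(states, query, k):
    """Return indices of the k reconstructed states nearest to `query`,
    from which input/target training pairs can be drawn."""
    dists = np.linalg.norm(states - query, axis=1)
    return np.argsort(dists)[:k]
```

The embedded vectors serve as network inputs; selecting the K nearest neighbors of the current state yields training samples from dynamically similar regimes.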

  2. Long-term records and modelling of acidification, recovery and liming at Lake Hovvatn, Norway

    Energy Technology Data Exchange (ETDEWEB)

    Hindar, A. [Norwegian Inst. for Water Research, Grimstad (Norway); Wright, R.F. [Norwegian Inst. for Water Research, Oslo (Norway)

    2005-11-01

    Scenarios for acidification in Europe have shown that large parts of southern Norway will be negatively impacted by sulfur (S) and nitrogen (N) emissions in the future. Long-term data on acidification and recovery, as well as the effects of a liming program at Lake Store Hovvatn, were presented in this paper, along with data collected from Lake Lille Hovvatn as an unlimed reference. Water samples from the lakes were collected 5 times annually from varying depths. Total organic carbon was measured after wet chemical oxidation by infrared detection. Acidification hindcasts and forecasts for the period 1870-2050 were conducted with the dynamic model MAGIC, which simulated soil solution and surface water chemistry to predict average concentrations of the major ions. The model showed good agreement with major changes in water chemistry observed over the past 30 years, as well as simulating pH and concentrations of inorganic aluminium (Al). The data were evaluated in terms of the prospects for the re-establishment of a self-sustaining brown trout population. All liming efforts at Lake Store Hovvatn resulted in improvements in water quality. However, the stocked fish showed excellent survival and growth rates after liming but no natural recruitment, which suggested that fish eggs at shallow depths under ice cover are a sensitive biological indicator. Continuous records of pH revealed serious difficulties in maintaining adequate water quality at shallow depths in winter. While various liming techniques were discussed, it was concluded that the problem of surface water acidification in southern Norway is not solved, and a long-term strategy is called for. 45 refs., 5 tabs., 6 figs.

  3. Short- and long-term antidepressant effects of ketamine in a rat chronic unpredictable stress model.

    Science.gov (United States)

    Jiang, Yinghong; Wang, Yiqiang; Sun, Xiaoran; Lian, Bo; Sun, Hongwei; Wang, Gang; Du, Zhongde; Li, Qi; Sun, Lin

    2017-08-01

    This research aimed to evaluate the short- and long-term antidepressant effects of ketamine in rats exposed to chronic unpredictable stress (CUS). Ketamine, a noncompetitive glutamate NMDA receptor antagonist, modulates excitatory amino acid function implicated in conditions such as anxiety disorders and major depression, and plays an important role in synaptic plasticity, learning, and memory. After 42 days of the CUS protocol, male rats received either a single injection of ketamine (10 mg/kg; day 43) or 15 daily injections (days 43-75). The influence of ketamine on behavioral reactivity was assessed 24 hr (short-term) or 7 weeks (long-term) after ketamine treatment. Behavioral tests used to assess the effects of these treatments included the sucrose preference (SP), open field (OF), elevated plus maze (EPM), forced swimming (FS), and water maze (WM) tests, with the OF and EPM used to detect anxiety-like behavior. Results: Short-term ketamine administration resulted in increased body weight gain, higher sensitivity to sucrose, augmented locomotor activity in the OF, more entries into the open arms of the EPM, and increased activity in the FS test; all responses indicative of reductions in depression/despair in anxiety-eliciting situations. No significant differences in these behaviors were obtained under conditions of long-term ketamine administration (p > .05). In the analysis of the long-term effects of ketamine, the CUS + Ketamine group showed significantly increased activity compared with the CUS + Vehicle group (*p < .05). Taken together, these findings demonstrate that short-term administration of ketamine induced rapid antidepressant-like effects in adult male rats exposed to CUS conditions, effects that were not observed with the long-term treatment regime.

  4. Short term load forecasting: two stage modelling

    Directory of Open Access Journals (Sweden)

    SOARES, L. J.

    2009-06-01

    Full Text Available This paper studies the hourly electricity load demand in the area covered by a utility situated in Seattle, USA, the Puget Sound Power and Light Company, and our proposal is tested on the well-known dataset from this company. We propose a stochastic model which employs ANNs (Artificial Neural Networks) to model short-run dynamics and the dependence among adjacent hours. The proposed model treats each hour's load separately as an individual series. This approach avoids modeling the intricate intra-day pattern (load profile) displayed by the load, which varies throughout the days of the week and the seasons. The forecasting performance of the model is evaluated in the same manner as the TLSAR (Two-Level Seasonal Autoregressive) model proposed by Soares (2003), using the years 1995 and 1996 as the holdout sample. Moreover, we conclude that nonlinearity is present in some of these series. The model results are analyzed, and the experiment shows that our tool can be used to produce load forecasts for places with tropical climates.
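
The per-hour decomposition can be sketched as below, with a plain linear autoregression standing in for the paper's ANN; the function names and lag choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_hourly_models(loads, lags=7):
    """Fit one model per hour of day (loads: days x 24 load matrix).

    Each hour's load is treated as its own daily time series and
    regressed on the same hour of the previous `lags` days."""
    models = []
    for h in range(24):
        s = loads[:, h]
        # Column j holds the lag-(j+1) values aligned with targets s[lags:].
        X = np.column_stack([s[lags - j - 1: len(s) - j - 1]
                             for j in range(lags)])
        A = np.column_stack([np.ones(len(s) - lags), X])
        coef, *_ = np.linalg.lstsq(A, s[lags:], rcond=None)
        models.append(coef)
    return models

def forecast_next_day(models, recent):
    """Forecast the next day's 24 hourly loads from the last `lags` days."""
    lags = len(models[0]) - 1
    return np.array([m[0] + m[1:] @ recent[-lags:, h][::-1]
                     for h, m in enumerate(models)])
```

Fitting 24 independent models sidesteps the intra-day load profile entirely, exactly as the two-stage formulation intends.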

  5. Short- and Long-Term Feedbacks on Vegetation Water Use: Unifying Evidence from Observations and Modeling

    Science.gov (United States)

    Mackay, D. S.

    2001-05-01

    Recent efforts to measure and model the interacting influences of climate, soil, and vegetation on soil water and nutrient dynamics have identified numerous important feedbacks that produce nonlinear responses. In particular, plant physiological factors that control rates of transpiration respond to soil water deficits and vapor pressure deficits (VPD) in the short-term, and to climate, nutrient cycling and disturbance in the long-term. The starting point of this presentation is the observation that in many systems, in particular forest ecosystems, conservative water use emerges as a result of short-term closure of stomata in response to high evaporative demand, and long-term vegetative canopy development under nutrient limiting conditions. Evidence for important short-term controls is presented from sap flux measurements of stand transpiration, remote sensing, and modeling of transpiration through a combination of physically-based modeling and Monte Carlo analysis. A common result is a strong association between stomatal conductance (gs) and the negative evaporative gain (∂ gs/∂ VPD) associated with the sensitivity of stomatal closure to rates of water loss. The importance of this association from the standpoint of modeling transpiration depends on the degree of canopy-atmosphere coupling. This suggests possible simplifications to future canopy component models for use in watershed and larger-scale hydrologic models for short-term processes. However, further results are presented from theoretical modeling, which suggest that feedbacks between hydrology and vegetation in current long-term (inter-annual to century) models may be too simple, as they do not capture the spatially variable nature of slow nutrient cycling in response to soil water dynamics and site history. Memory effects in the soil nutrient pools can leave lasting effects on more rapid processes associated with soil, vegetation, atmosphere coupling.

  6. Medium- and Long-term Prediction of LOD Change with the Leap-step Autoregressive Model

    Science.gov (United States)

    Liu, Q. B.; Wang, Q. J.; Lei, M. F.

    2015-09-01

    It is known that the accuracy of medium- and long-term prediction of changes of the length of day (LOD) based on the combined least-squares and autoregressive (LS+AR) model decreases gradually. The leap-step autoregressive (LSAR) model is more accurate and stable in medium- and long-term prediction, and is therefore used to forecast the LOD changes in this work. The LOD series from EOP 08 C04, provided by the IERS (International Earth Rotation and Reference Systems Service), is then used to compare the effectiveness of the LSAR and traditional AR methods. The predicted series resulting from the two models show that the prediction accuracy of the LSAR model is better than that of the AR model in medium- and long-term prediction.
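
The leap-step idea can be sketched as follows, assuming a plain least-squares fit: instead of regressing on consecutive lags, regress on values k steps apart. This is an illustrative reading of the LSAR model, not the authors' implementation.

```python
import numpy as np

def fit_lsar(x, p, k):
    """Leap-step AR: regress x_t on x_{t-k}, x_{t-2k}, ..., x_{t-pk}.
    With k = 1 this reduces to the traditional AR(p) model."""
    start = p * k
    X = np.column_stack([x[start - j * k: len(x) - j * k]
                         for j in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, x[start:], rcond=None)
    return coef

def predict_lsar(x, coef, k):
    """One-step prediction of the next value from the tail of the series."""
    p = len(coef)
    lagged = np.array([x[len(x) - j * k] for j in range(1, p + 1)])
    return coef @ lagged
```

For a series with a strong period near k (as LOD has with its seasonal terms), the leap-step lags line up with the oscillation and the fit stays stable further into the future.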

  7. A quantum hydrodynamic model for multicomponent quantum magnetoplasma with Jeans term

    International Nuclear Information System (INIS)

    Masood, W.; Salimullah, M.; Shah, H.A.

    2008-01-01

    The effect of the Jeans term in a multicomponent self-gravitating quantum magnetoplasma is investigated employing the quantum hydrodynamic (QHD) model. The effects of the quantum Bohm potential and statistical terms, as well as the ambient magnetic field, on both dust- and ion-dynamics-driven waves are also investigated in this Letter. We state the conditions that can drive the system unstable in the presence of the Jeans term. The limiting cases are also presented. The present work may have relevance in dense astrophysical environments, where self-gravitating effects are expected to play a pivotal role.

  8. Fission barriers within the liquid drop model with the surface-curvature term

    International Nuclear Information System (INIS)

    Pomorski, K.; Dudek, J.

    2004-01-01

    The recently revised liquid drop model (PRC 67 (2003) 044316) containing the curvature term reproduces the masses of 2766 experimentally known isotopes having Z≥8 and N≥8 with an r.m.s. deviation of 0.698 MeV when the microscopic corrections of Moeller et al. are used. The influence of the congruence energy as well as the compression term on the barrier heights is discussed within this new macroscopic model. The r.m.s. deviation of the fission barrier heights of 40 isotopes with Z≥34 is only 1.73 MeV when a deformation-dependent congruence energy is included. The compression term in the liquid drop energy has a rather weak influence on the barrier heights. (author)

  9. Modeling of the sawtooth instability in tokamaks using a current viscosity term

    International Nuclear Information System (INIS)

    Ward, D.J.; Jardin, S.C.

    1988-08-01

    We propose a new method for modeling the sawtooth instability and other MHD activity in axisymmetric tokamak transport simulations. A hyper-resistivity (or current viscosity) term is included in the mean field Ohm's law to describe the effects of the three-dimensional fluctuating fields on the evolution of the inverse transform, q, characterizing the mean fields. This term has the effect of flattening the current profile, while dissipating energy and conserving helicity. A fully implicit MHD transport and 2-D toroidal equilibrium code has been developed to calculate the evolution in time of the q-profile and the current profile using this new term. The results of this code are compared to the Kadomtsev reconnection model in the circular cylindrical limit. 17 refs., 8 figs

  10. Locally Rotationally Symmetric Bianchi Type-I Model with Time Varying Λ Term

    International Nuclear Information System (INIS)

    Tiwari, R. K.; Jha, Navin Kumar

    2009-01-01

    We investigate the locally rotationally symmetric (LRS) Bianchi type-I cosmological model for stiff matter and a vacuum solution with a cosmological term proportional to R{sup -m} (R is the scale factor and m is a positive constant). The cosmological term decreases with time. We find that in both cases the present universe is accelerating, with a large fraction of the cosmological density in the form of the cosmological term.

  11. Direct numerical simulation of particle-laden turbulent channel flows with two- and four-way coupling effects: models of terms in the Reynolds stress budgets

    International Nuclear Information System (INIS)

    Dritselis, Chris D

    2017-01-01

    In the first part of this study (Dritselis 2016 Fluid Dyn. Res. 48 015507), the Reynolds stress budgets were evaluated through point-particle direct numerical simulations (pp-DNSs) for the particle-laden turbulent flow in a vertical channel with two- and four-way coupling effects. Here several turbulence models are assessed by direct comparison of the particle contribution terms to the budgets, the dissipation rate, the pressure-strain rate, and the transport rate with the model expressions using the pp-DNS data. It is found that the models of the particle sources to the equations of fluid turbulent kinetic energy and dissipation rate cannot represent correctly the physics of the complex interaction between turbulence and particles. A relatively poor performance of the pressure-strain term models is revealed in the particulate flows, while the algebraic models for the dissipation rate of the fluid turbulence kinetic energy and the transport rate terms can adequately reproduce the main trends due to the presence of particles. Further work is generally needed to improve the models in order to account properly for the momentum exchange between the two phases and the effects of particle inertia, gravity and inter-particle collisions. (paper)

  12. Direct numerical simulation of particle-laden turbulent channel flows with two- and four-way coupling effects: models of terms in the Reynolds stress budgets

    Energy Technology Data Exchange (ETDEWEB)

    Dritselis, Chris D, E-mail: dritseli@mie.uth.gr [Mechanical Engineering Department, University of Thessaly, Pedion Areos, 38334 Volos (Greece)

    2017-04-15

    In the first part of this study (Dritselis 2016 Fluid Dyn. Res. 48 015507), the Reynolds stress budgets were evaluated through point-particle direct numerical simulations (pp-DNSs) for the particle-laden turbulent flow in a vertical channel with two- and four-way coupling effects. Here several turbulence models are assessed by direct comparison of the particle contribution terms to the budgets, the dissipation rate, the pressure-strain rate, and the transport rate with the model expressions using the pp-DNS data. It is found that the models of the particle sources to the equations of fluid turbulent kinetic energy and dissipation rate cannot represent correctly the physics of the complex interaction between turbulence and particles. A relatively poor performance of the pressure-strain term models is revealed in the particulate flows, while the algebraic models for the dissipation rate of the fluid turbulence kinetic energy and the transport rate terms can adequately reproduce the main trends due to the presence of particles. Further work is generally needed to improve the models in order to account properly for the momentum exchange between the two phases and the effects of particle inertia, gravity and inter-particle collisions. (paper)

  13. [Decision modeling for economic evaluation of health technologies].

    Science.gov (United States)

    de Soárez, Patrícia Coelho; Soares, Marta Oliveira; Novaes, Hillegonda Maria Dutilh

    2014-10-01

    Most economic evaluations that inform decision-making processes for the incorporation and financing of health technologies use decision models to assess the costs and benefits of the compared strategies. Despite the large number of economic evaluations conducted in Brazil, there is a pressing need for an in-depth methodological study of the types of decision models and their applicability in our setting. The objective of this literature review is to contribute to the knowledge and use of decision models in the national context of economic evaluations of health technologies. This article presents general definitions about models and concerns with their use; it describes the main models: decision trees, Markov chains, micro-simulation, and discrete-event and dynamic simulation; it discusses the elements involved in the choice of model; and it exemplifies the models addressed in national economic evaluation studies of diagnostic, therapeutic, and preventive technologies and health programs.
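
A Markov cohort model of the kind described can be sketched minimally: a cohort moves through health states each cycle while discounted costs and QALYs accumulate. All transition probabilities, costs, and utilities below are invented for illustration.

```python
import numpy as np

# Illustrative 3-state Markov cohort model: Well, Sick, Dead.
P = np.array([[0.85, 0.10, 0.05],
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])      # annual transition matrix (rows sum to 1)
cost = np.array([500.0, 3000.0, 0.0])   # annual cost incurred in each state
utility = np.array([0.95, 0.60, 0.0])   # QALY weight of a year in each state
rate = 0.03                             # annual discount rate

state = np.array([1.0, 0.0, 0.0])       # whole cohort starts in Well
total_cost = 0.0
total_qalys = 0.0
for year in range(20):                  # 20-year time horizon
    d = 1.0 / (1.0 + rate) ** year      # discount factor for this cycle
    total_cost += d * (state @ cost)
    total_qalys += d * (state @ utility)
    state = state @ P                   # advance the cohort one cycle
```

Running the same loop for a comparator strategy and dividing the cost difference by the QALY difference yields the incremental cost-effectiveness ratio that such evaluations report.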

  14. Transcatheter arterial embolization for hepatoma; I. Short-term evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Heung Suk; Koh, Byung Hee; Cho, On Koo; Hahm, Chang Kok; Rhee, Jong Chul; Lee, Min Ho; Kee, Choon Suhk [Hanyang University College of Medicine, Seoul (Korea, Republic of)

    1985-12-15

    Anticancer effect and complications were evaluated after transcatheter arterial embolization (TAE) in 12 patients with hepatocellular carcinoma until 2 weeks and 4 weeks after TAE, respectively. The results were as follows; 1. Serum alpha-fetoprotein value decreased in 7 out of 9 patients with high values prior to TAE. 2. Loss of enhancement and better definition on enhanced computed tomography (CT) were seen in the tumors in all cases, and low-density areas in 9/10. Gas bubbles were seen in low-density areas in 4/10 and high-density areas caused by lipiodol in 6/10. 3. Post-embolization syndrome developed in most patients but improved clinically within a week after TAE. 4. On laboratory examination, impairment of liver function developed in most patients but improved within 4 weeks after TAE. 5. Complications on CT included splenic infarction and thickening of the wall of the gallbladder, which didn't require specific treatment. The authors conclude that TAE for hepatocellular carcinoma reveals an apparent anticancer effect on short-term evaluation, and the resultant complications are transient and improved by conservative treatment.

  15. Transcatheter arterial embolization for hepatoma; I. Short-term evaluation

    International Nuclear Information System (INIS)

    Seo, Heung Suk; Koh, Byung Hee; Cho, On Koo; Hahm, Chang Kok; Rhee, Jong Chul; Lee, Min Ho; Kee, Choon Suhk

    1985-01-01

    Anticancer effect and complications were evaluated after transcatheter arterial embolization (TAE) in 12 patients with hepatocellular carcinoma until 2 weeks and 4 weeks after TAE, respectively. The results were as follows; 1. Serum alpha-fetoprotein value decreased in 7 out of 9 patients with high values prior to TAE. 2. Loss of enhancement and better definition on enhanced computed tomography (CT) were seen in the tumors in all cases, and low-density areas in 9/10. Gas bubbles were seen in low-density areas in 4/10 and high-density areas caused by lipiodol in 6/10. 3. Post-embolization syndrome developed in most patients but improved clinically within a week after TAE. 4. On laboratory examination, impairment of liver function developed in most patients but improved within 4 weeks after TAE. 5. Complications on CT included splenic infarction and thickening of the wall of the gallbladder, which didn't require specific treatment. The authors conclude that TAE for hepatocellular carcinoma reveals an apparent anticancer effect on short-term evaluation, and the resultant complications are transient and improved by conservative treatment.

  16. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer is increasingly growing, the cancer registry is of great importance as the main core of cancer control programs, and many different software programs have been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential for evaluating and comparing a wide range of such software. In this study, the criteria of cancer registry software were determined by studying the relevant documents and two functional software programs in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises a tool and a method of evaluation. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen as the evaluation method of this study. The model of this study encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general criteria and specific ones, while trying to fulfill the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  17. Short-Term Memory for Serial Order: A Recurrent Neural Network Model

    Science.gov (United States)

    Botvinick, Matthew M.; Plaut, David C.

    2006-01-01

    Despite a century of research, the mechanisms underlying short-term or working memory for serial order remain uncertain. Recent theoretical models have converged on a particular account, based on transient associations between independent item and context representations. In the present article, the authors present an alternative model, according…

  18. A Long-Term Mathematical Model for Mining Industries

    Energy Technology Data Exchange (ETDEWEB)

    Achdou, Yves, E-mail: achdou@ljll.univ-paris-diderot.fr [Univ. Paris Diderot, Sorbonne Paris Cité, Laboratoire Jacques-Louis Lions, UMR 7598, UPMC, CNRS (France); Giraud, Pierre-Noel [CERNA, Mines ParisTech (France); Lasry, Jean-Michel [Univ. Paris Dauphine (France); Lions, Pierre-Louis [Collège de France (France)

    2016-12-15

    A parsimonious long-term model is proposed for a mining industry. Knowing the dynamics of the global reserve, the strategy of each production unit consists of an optimal control problem with two controls: first, the flux invested into prospection and the building of new extraction facilities; second, the production rate. In turn, the dynamics of the global reserve depends on the individual strategies of the producers, so the model leads to an equilibrium, which is described by low-dimensional systems of partial differential equations. The dimensionality depends on the number of technologies that a mining producer can choose. In some cases, the systems may be reduced to a Hamilton–Jacobi equation which is degenerate at the boundary and whose right-hand side may blow up at the boundary. A mathematical analysis is supplied. Numerical simulations for models with one or two technologies are then described. In particular, a numerical calibration of the model in order to fit the historical data is carried out.

  19. A Long-Term Mathematical Model for Mining Industries

    International Nuclear Information System (INIS)

    Achdou, Yves; Giraud, Pierre-Noel; Lasry, Jean-Michel; Lions, Pierre-Louis

    2016-01-01

    A parsimonious long-term model is proposed for a mining industry. Knowing the dynamics of the global reserve, the strategy of each production unit consists of an optimal control problem with two controls: first, the flux invested into prospection and the building of new extraction facilities; second, the production rate. In turn, the dynamics of the global reserve depends on the individual strategies of the producers, so the model leads to an equilibrium, which is described by low-dimensional systems of partial differential equations. The dimensionality depends on the number of technologies that a mining producer can choose. In some cases, the systems may be reduced to a Hamilton–Jacobi equation which is degenerate at the boundary and whose right-hand side may blow up at the boundary. A mathematical analysis is supplied. Numerical simulations for models with one or two technologies are then described. In particular, a numerical calibration of the model in order to fit the historical data is carried out.

  20. Long term evaluation of mesenchymal stem cell therapy in a feline model of chronic allergic asthma

    Science.gov (United States)

    Trzil, Julie E; Masseau, Isabelle; Webb, Tracy L; Chang, Chee-hoon; Dodam, John R; Cohn, Leah A; Liu, Hong; Quimby, Jessica M; Dow, Steven W; Reinero, Carol R

    2014-01-01

    Background Mesenchymal stem cells (MSCs) decrease airway eosinophilia, airway hyperresponsiveness (AHR), and remodeling in murine models of acutely induced asthma. We hypothesized that MSCs would diminish these hallmark features in a chronic feline asthma model. Objective To document the effects of allogeneic, adipose-derived MSCs on airway inflammation, airway hyperresponsiveness (AHR), and remodeling over time, and to investigate mechanisms by which MSCs alter local and systemic immunologic responses in chronic experimental feline allergic asthma. Methods Cats with chronic, experimentally-induced asthma received six intravenous infusions of MSCs (0.36–2.5 × 10^7 MSCs/infusion) or placebo bimonthly, beginning at the time of study enrollment. Cats were evaluated at baseline and longitudinally for one year. Outcome measures included: bronchoalveolar lavage fluid cytology to assess airway eosinophilia; pulmonary mechanics and clinical scoring to assess AHR; and thoracic computed tomographic (CT) scans to assess structural changes (airway remodeling). CT scans were evaluated using a scoring system for lung attenuation (LA) and bronchial wall thickening (BWT). To assess mechanisms of MSC action, immunologic assays including allergen-specific IgE, cellular IL-10 production, and allergen-specific lymphocyte proliferation were performed. Results There were no differences between treatment groups or over time with respect to airway eosinophilia or AHR. However, significantly lower LA and BWT scores were noted in CT images of MSC-treated animals compared to placebo-treated cats at month 8 of the study (LA p=0.0311; BWT p=0.0489). No differences were noted between groups in the immunologic assays. Conclusions and Clinical Relevance When administered after development of chronic allergic feline asthma, MSCs failed to reduce airway inflammation and AHR. However, repeated administration of MSCs at the start of study did reduce computed tomographic measures of airway remodeling by month 8, though

  1. CMAQ Model Evaluation Framework

    Science.gov (United States)

    CMAQ is tested to establish the modeling system’s credibility in predicting pollutants such as ozone and particulate matter. Evaluation of CMAQ has been designed to assess the model’s performance for specific time periods and for specific uses.

  2. Evaluation of the performance of DIAS ionospheric forecasting models

    Directory of Open Access Journals (Sweden)

    Tsagouri Ioanna

    2011-08-01

    Nowcasting and forecasting ionospheric products and services for the European region have been provided regularly since August 2006 through the European Digital upper Atmosphere Server (DIAS, http://dias.space.noa.gr). Currently, DIAS ionospheric forecasts are based on the online implementation of two models: (i) the solar wind driven autoregression model for ionospheric short-term forecast (SWIF), which combines historical and real-time ionospheric observations with solar-wind parameters obtained in real time at the L1 point from the NASA ACE spacecraft, and (ii) the geomagnetically correlated autoregression model (GCAM), which is a time series forecasting method driven by a synthetic geomagnetic index. In this paper we investigate the operational ability and the accuracy of both DIAS models, carrying out a metrics-based evaluation of their performance under all possible conditions. The analysis rests on the systematic comparison of the models' predictions with actual observations obtained over almost one solar cycle (1998–2007) at four European ionospheric locations (Athens, Chilton, Juliusruh and Rome), and on the comparison of the models' performance against two simple prediction strategies, median- and persistence-based predictions, during storm conditions. The results verify the operational validity of both models and quantify their prediction accuracy under all possible conditions, in support of operational applications as well as comparative studies assessing or expanding current ionospheric forecasting capabilities.
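
    The persistence-based comparison described above can be sketched as a simple skill score. The foF2 values and the RMSE-based skill metric below are illustrative assumptions, not DIAS data or the paper's actual metrics:

```python
import numpy as np

def persistence_forecast(obs, horizon=1):
    """Persistence baseline: the forecast for t+h is simply the value at t."""
    return obs[:-horizon]

def skill_score(pred, obs, ref):
    """RMSE-based skill vs. a reference forecast: 1 = perfect, 0 = no better than ref."""
    rmse = lambda a, b: np.sqrt(np.mean((a - b) ** 2))
    return 1.0 - rmse(pred, obs) / rmse(ref, obs)

# Hypothetical foF2 observations (MHz) and one-step model forecasts for obs[1:]
obs = np.array([6.1, 6.3, 6.0, 5.8, 6.2, 6.5, 6.4])
model = np.array([6.2, 6.1, 5.9, 6.0, 6.4, 6.5])
ref = persistence_forecast(obs)
print(round(float(skill_score(model, obs[1:], ref)), 3))
```

    A positive score means the model beats persistence; the same construction works with a running-median reference.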

  3. Notes on power of normality tests of error terms in regression models

    International Nuclear Information System (INIS)

    Střelec, Luboš

    2015-01-01

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to detect non-normality of the error terms may lead to incorrect results from the usual inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow exact inferences; normally distributed stochastic errors are necessary for inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. We introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
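
    As a concrete illustration of why residual normality matters, the sketch below computes the classical Jarque–Bera statistic for regression residuals (this is not the RT class proposed in the paper, and the data are simulated):

```python
import numpy as np

def jarque_bera(e):
    """Jarque-Bera statistic: JB = n/6 * (S**2 + (K - 3)**2 / 4),
    where S is the sample skewness and K the sample kurtosis of the residuals."""
    e = np.asarray(e, dtype=float)
    n = len(e)
    e = e - e.mean()
    s2 = np.mean(e ** 2)
    S = np.mean(e ** 3) / s2 ** 1.5
    K = np.mean(e ** 4) / s2 ** 2
    return n / 6.0 * (S ** 2 + (K - 3.0) ** 2 / 4.0)

def residuals(x, y):
    """OLS residuals from a simple linear fit."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 500)
y_norm = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, 500)   # normal errors
y_skew = 2.0 * x + 1.0 + rng.exponential(1.0, 500)   # strongly skewed errors

jb_norm = jarque_bera(residuals(x, y_norm))
jb_skew = jarque_bera(residuals(x, y_skew))
# Under H0 (normal errors), JB ~ chi-squared with 2 dof (5% cutoff ~ 5.99);
# skewed errors produce a far larger statistic.
print(round(float(jb_norm), 2), round(float(jb_skew), 2))
```
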

  4. Notes on power of normality tests of error terms in regression models

    Energy Technology Data Exchange (ETDEWEB)

    Střelec, Luboš [Department of Statistics and Operation Analysis, Faculty of Business and Economics, Mendel University in Brno, Zemědělská 1, Brno, 61300 (Czech Republic)

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to detect non-normality of the error terms may lead to incorrect results from the usual inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow exact inferences; normally distributed stochastic errors are necessary for inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. We introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.

  5. Interim report on model evaluation methodology and the evaluation of LEAP

    Energy Technology Data Exchange (ETDEWEB)

    Alsmiller, R.G. Jr.; Barish, J.; Bjornstad, D.

    1980-04-01

    This report describes progress made at ORNL toward development and demonstration of a methodology for evaluating energy-economic modeling codes and important results derived from these codes. To bolster traditional evaluation methods with more-quantitative procedures of interest to the Energy Information Administration, ORNL is applying sensitivity theory as part of a comprehensive effort to quantify the importance of various data and model parameters to the key results that are of interest. The Long-Term Energy Analysis Program (LEAP) was chosen as the initial focus for the research. LEAP is an energy-economy model which resides in the Long-Term Energy Analysis Division (LTEAD) of the Integrative Analysis Group in the Office of Applied Analysis, EIA. LTEAD developed Model 22C of LEAP for two reasons: (1) to prepare projections through the year 2020, which were needed for the 1978 EIA Annual Report to Congress and (2) to develop a base for analyses of specific options for Federal action. LEAP Model 22C and its uses are described to provide the background for this interim description of the model evaluation effort at ORNL. 19 figures, 10 tables.

  6. Development of a short-term model to predict natural gas demand, March 1989

    International Nuclear Information System (INIS)

    Lihn, M.L.

    1989-03-01

    Project management decisions for the Gas Research Institute (GRI) R and D program require an appreciation of the short-term outlook for gas consumption. This paper provides a detailed discussion of the methodology used to develop short-term models for the residential, commercial, industrial, and electric utility sectors. The relative success of the models in projecting gas demand, compared with actual gas demand, is reviewed for each major gas-consuming sector. The comparison of actual to projected gas demand has pointed out several problems with the model, and possible solutions to these problems are discussed.

  7. Density-dependent microbial turnover improves soil carbon model predictions of long-term litter manipulations

    Science.gov (United States)

    Georgiou, Katerina; Abramoff, Rose; Harte, John; Riley, William; Torn, Margaret

    2017-04-01

    Climatic, atmospheric, and land-use changes all have the potential to alter soil microbial activity via abiotic effects on soil or mediated by changes in plant inputs. Recently, many promising microbial models of soil organic carbon (SOC) decomposition have been proposed to advance understanding and prediction of climate and carbon (C) feedbacks. Most of these models, however, exhibit unrealistic oscillatory behavior and SOC insensitivity to long-term changes in C inputs. Here we diagnose the sources of instability in four models that span the range of complexity of these recent microbial models, by sequentially adding complexity to a simple model to include microbial physiology, a mineral sorption isotherm, and enzyme dynamics. We propose a formulation that introduces density-dependence of microbial turnover, which acts to limit population sizes and reduce oscillations. We compare these models to results from 24 long-term C-input field manipulations, including the Detritus Input and Removal Treatment (DIRT) experiments, to show that there are clear metrics that can be used to distinguish and validate the inherent dynamics of each model structure. We find that widely used first-order models and microbial models without density-dependence cannot readily capture the range of long-term responses observed across the DIRT experiments as a direct consequence of their model structures. The proposed formulation improves predictions of long-term C-input changes, and implies greater SOC storage associated with CO2-fertilization-driven increases in C inputs over the coming century compared to common microbial models. Finally, we discuss our findings in the context of improving microbial model behavior for inclusion in Earth System Models.
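
    The density-dependent turnover idea can be sketched with a minimal two-pool microbe–substrate model. All parameter values and pool sizes below are illustrative assumptions, not those of the four models compared in the study:

```python
def run(beta, years=600.0, dt=0.05):
    """Forward-Euler integration of a minimal microbe-substrate model:
        dC/dt = I - Vmax*B*C/(K+C) + k*B**beta
        dB/dt = eps*Vmax*B*C/(K+C) - k*B**beta
    beta = 1 gives linear microbial turnover; beta = 2 makes turnover
    density-dependent, which acts to limit the microbial population size."""
    I, Vmax, K, eps, k = 1.0, 2.0, 50.0, 0.4, 0.02   # illustrative parameters
    C, B = 60.0, 10.0                                 # substrate and biomass pools
    for _ in range(int(years / dt)):
        uptake = Vmax * B * C / (K + C)
        turnover = k * B ** beta
        C += dt * (I - uptake + turnover)
        B += dt * (eps * uptake - turnover)
    return B

b_linear, b_dd = run(1.0), run(2.0)
# Density dependence caps long-run biomass well below the linear-turnover level
print(round(b_linear, 2), round(b_dd, 2))
```

    With these parameters the analytical steady states are B* = I·eps/(k(1-eps)) for beta = 1 and B* = sqrt(I·eps/(k(1-eps))) for beta = 2, so the density-dependent run settles at a much smaller biomass.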

  8. Evaluation of helper-dependent canine adenovirus vectors in a 3D human CNS model

    Science.gov (United States)

    Simão, Daniel; Pinto, Catarina; Fernandes, Paulo; Peddie, Christopher J.; Piersanti, Stefania; Collinson, Lucy M.; Salinas, Sara; Saggio, Isabella; Schiavo, Giampietro; Kremer, Eric J.; Brito, Catarina; Alves, Paula M.

    2017-01-01

    Gene therapy is a promising approach with enormous potential for the treatment of neurodegenerative disorders. Viral vectors derived from canine adenovirus type 2 (CAV-2) present attractive features for gene delivery strategies in the human brain: they preferentially transduce neurons, are capable of efficient axonal transport to afferent brain structures, have a 30-kb cloning capacity, and show low innate and induced immunogenicity in pre-clinical tests. For clinical translation, in-depth pre-clinical evaluation of efficacy and safety in a human setting is essential. Stem cell-derived human neural cells have great potential as complementary tools, bridging the gap between animal models, which often diverge considerably from the human phenotype, and clinical trials. Herein, we explore helper-dependent CAV-2 (hd-CAV-2) efficacy and safety for gene delivery in a human stem cell-derived 3D neural in vitro model. Assessment of hd-CAV-2 vector efficacy was performed at different multiplicities of infection, by evaluating transgene expression and impact on cell viability, ultrastructural cellular organization and neuronal gene expression. Under optimized conditions, hd-CAV-2 transduction led to stable long-term transgene expression with minimal toxicity. hd-CAV-2 preferentially transduced neurons, while human adenovirus type 5 (HAdV5) showed increased tropism towards glial cells. This work demonstrates, in a physiologically relevant 3D model, that hd-CAV-2 vectors are efficient tools for gene delivery to human neurons, with stable long-term transgene expression and minimal cytotoxicity. PMID:26181626

  9. Evaluation of radiobiological effects in 3 distinct biological models

    International Nuclear Information System (INIS)

    Lemos, J.; Costa, P.; Cunha, L.; Metello, L.F.; Carvalho, A.P.; Vasconcelos, V.; Genesio, P.; Ponte, F.; Costa, P.S.; Crespo, P.

    2015-01-01

    Full text of publication follows. The present work aims to share the process of developing advanced biological models for studying radiobiological effects. Recognizing several known limitations and difficulties of current monolayer cellular models, as well as the increasing difficulty of using more advanced biological models, our group has been developing alternative models, namely three-dimensional cell cultures and a less explored animal model (the zebrafish, Danio rerio, which gives access to inter-generational data while showing great genetic homology with humans). These 3 models (monolayer cellular model, three-dimensional cell cultures and zebrafish) were externally irradiated with 100 mGy, 500 mGy or 1 Gy. The consequences of that irradiation were studied using cellular and molecular tests. Our previous experimental studies with 100 mGy external gamma irradiation of HepG2 monolayer cells showed a slight increase in the proliferation rate at 24 h, 48 h and 72 h post irradiation. These results also pointed to the presence of certain bystander effects 72 h post irradiation, constituting the starting point for the more detailed analysis carried out in this work. At this stage, we remain focused on the acute biological effects. The results obtained, namely MTT and clonogenic assays for evaluating cellular metabolic activity and proliferation in the in vitro models, as well as proteomics for the evaluation of in vivo effects, will be presented, discussed and explained. Several hypotheses will be presented and defended based on the facts previously demonstrated. This work aims to share the current state and the results already available from this medium-term project, building the proof of the added value of applying these advanced models, while demonstrating the strongest and weakest points of each of them (so allowing the comparison between them and to base the subsequent choice for research groups starting

  10. Mid- and long-term runoff predictions by an improved phase-space reconstruction model

    International Nuclear Information System (INIS)

    Hong, Mei; Wang, Dong; Wang, Yuankun; Zeng, Xiankui; Ge, Shanshan; Yan, Hengqian; Singh, Vijay P.

    2016-01-01

    In recent years, the phase-space reconstruction method has often been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs to be improved. Using the genetic algorithm to improve the phase-space reconstruction method, a new nonlinear model of monthly runoff is constructed. The new model does not rely heavily on embedding dimensions. Recognizing that the rainfall–runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, chaotic characteristics of the model are also analyzed, which shows the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that the medium- and long-term runoff is satisfactorily forecasted at the hydrological stations. Not only is the forecasting trend accurate, but the mean absolute percentage error is also no more than 15%. Moreover, the forecast results for wet years and dry years are both good, which means that the improved model can overcome the traditional "wet years and dry years predictability barrier," to some extent. The model forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides a new way of thinking about the association between monthly runoff and other hydrological factors, and also provides a new method for the prediction of monthly runoff. - Highlights: • The improved phase-space reconstruction model of monthly runoff is established. • Two variables (temperature and rainfall) are incorporated
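
    A bare-bones version of phase-space reconstruction (without the genetic-algorithm improvement or the extra variables used in the paper) can be sketched as a time-delay embedding with a nearest-neighbour predictor. The periodic "runoff" series below is synthetic:

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Takens-style delay embedding: row t is [x(t), x(t+tau), ..., x(t+(dim-1)tau)]."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

def predict_next(series, dim=3, tau=1, k=3):
    """Nearest-neighbour forecast in the reconstructed phase space:
    average the successors of the k states closest to the current state."""
    emb = delay_embed(series, dim, tau)
    current, history = emb[-1], emb[:-1]
    succ = series[(dim - 1) * tau + 1 :]          # successor of history row j
    d = np.linalg.norm(history[: len(succ)] - current, axis=1)
    nearest = np.argsort(d)[:k]
    return succ[nearest].mean()

# Synthetic monthly runoff with an annual cycle
t = np.arange(240)
runoff = 100 + 40 * np.sin(2 * np.pi * t / 12)
print(round(float(predict_next(runoff[:-1])), 1), round(float(runoff[-1]), 1))
```

    On this noiseless periodic series the embedded nearest neighbours sit at the same seasonal phase, so their successors reproduce the next value almost exactly.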

  11. Mid- and long-term runoff predictions by an improved phase-space reconstruction model

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Mei [Research Center of Ocean Environment Numerical Simulation, Institute of Meteorology and oceanography, PLA University of Science and Technology, Nanjing (China); Wang, Dong, E-mail: wangdong@nju.edu.cn [Key Laboratory of Surficial Geochemistry, Ministry of Education, Department of Hydrosciences, School of Earth Sciences and Engineering, Collaborative Innovation Center of South China Sea Studies, State Key Laboratory of Pollution Control and Resource Reuse, Nanjing University, Nanjing 210093 (China); Wang, Yuankun; Zeng, Xiankui [Key Laboratory of Surficial Geochemistry, Ministry of Education, Department of Hydrosciences, School of Earth Sciences and Engineering, Collaborative Innovation Center of South China Sea Studies, State Key Laboratory of Pollution Control and Resource Reuse, Nanjing University, Nanjing 210093 (China); Ge, Shanshan; Yan, Hengqian [Research Center of Ocean Environment Numerical Simulation, Institute of Meteorology and oceanography, PLA University of Science and Technology, Nanjing (China); Singh, Vijay P. [Department of Biological and Agricultural Engineering Zachry Department of Civil Engineering, Texas A & M University, College Station, TX 77843 (United States)

    2016-07-15

    In recent years, the phase-space reconstruction method has often been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs to be improved. Using the genetic algorithm to improve the phase-space reconstruction method, a new nonlinear model of monthly runoff is constructed. The new model does not rely heavily on embedding dimensions. Recognizing that the rainfall–runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, chaotic characteristics of the model are also analyzed, which shows the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that the medium- and long-term runoff is satisfactorily forecasted at the hydrological stations. Not only is the forecasting trend accurate, but the mean absolute percentage error is also no more than 15%. Moreover, the forecast results for wet years and dry years are both good, which means that the improved model can overcome the traditional "wet years and dry years predictability barrier," to some extent. The model forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides a new way of thinking about the association between monthly runoff and other hydrological factors, and also provides a new method for the prediction of monthly runoff. - Highlights: • The improved phase-space reconstruction model of monthly runoff is established. • Two variables (temperature and rainfall) are incorporated

  12. Short Term Evaluation of an Anatomically Shaped Polycarbonate Urethane Total Meniscus Replacement in a Goat Model.

    Directory of Open Access Journals (Sweden)

    A C T Vrancken

    Full Text Available Since the treatment options for symptomatic total meniscectomy patients are still limited, an anatomically shaped, polycarbonate urethane (PCU, total meniscus replacement was developed. This study evaluates the in vivo performance of the implant in a goat model, with a specific focus on the implant location in the joint, geometrical integrity of the implant and the effect of the implant on synovial membrane and articular cartilage histopathological condition.The right medial meniscus of seven Saanen goats was replaced by the implant. Sham surgery (transection of the MCL, arthrotomy and MCL suturing was performed in six animals. The contralateral knee joints of both groups served as control groups. After three months follow-up the following aspects of implant performance were evaluated: implant position, implant deformation and the histopathological condition of the synovium and cartilage.Implant geometry was well maintained during the three month implantation period. No signs of PCU wear were found and the implant did not induce an inflammatory response in the knee joint. In all animals, implant fixation was compromised due to suture breakage, wear or elongation, likely causing the increase in extrusion observed in the implant group. Both the femoral cartilage and tibial cartilage in direct contact with the implant showed increased damage compared to the sham and sham-control groups.This study demonstrates that the novel, anatomically shaped PCU total meniscal replacement is biocompatible and resistant to three months of physiological loading. Failure of the fixation sutures may have increased implant mobility, which probably induced implant extrusion and potentially stimulated cartilage degeneration. Evidently, redesigning the fixation method is necessary. Future animal studies should evaluate the improved fixation method and compare implant performance to current treatment standards, such as allografts.
  13. Effect of the forcing term in the pseudopotential lattice Boltzmann modeling of thermal flows.

    Science.gov (United States)

    Li, Qing; Luo, K H

    2014-05-01

    The pseudopotential lattice Boltzmann (LB) model is a popular model in the LB community for simulating multiphase flows. Recently, several thermal LB models, which are based on the pseudopotential LB model and constructed within the framework of the double-distribution-function LB method, were proposed to simulate thermal multiphase flows [G. Házi and A. Márkus, Phys. Rev. E 77, 026305 (2008); L. Biferale, P. Perlekar, M. Sbragaglia, and F. Toschi, Phys. Rev. Lett. 108, 104502 (2012); S. Gong and P. Cheng, Int. J. Heat Mass Transfer 55, 4923 (2012); M. R. Kamali et al., Phys. Rev. E 88, 033302 (2013)]. The objective of the present paper is to show that the effect of the forcing term on the temperature equation must be eliminated in the pseudopotential LB modeling of thermal flows. First, the effect of the forcing term on the temperature equation is shown via the Chapman-Enskog analysis. For comparison, alternative treatments that are free from the forcing-term effect are provided. Subsequently, numerical investigations are performed for two benchmark tests. The numerical results clearly show that the existence of the forcing-term effect will lead to significant numerical errors in the pseudopotential LB modeling of thermal flows.

  14. A RETROSPECTIVE OF EVALUATION MODELS ON INTELLECTUAL CAPITAL

    Directory of Open Access Journals (Sweden)

    Ienciu Nicoleta Maria

    2011-12-01

    In the classical theory of economics, capital is one of the three factors of production, alongside land and labor, and refers in particular to the buildings, equipment, machinery, etc. used for the production of other goods; the term physical capital is also used in the specialized literature (Bratianu and Jianu, 2006). The present study intends to bring to the forefront the main evaluation methods for intellectual capital, as proposed, supported and, at the same time, criticized by researchers and practitioners. The study offers responses to the following research questions: What are the advantages and disadvantages of the intellectual capital evaluation methods? And what are the main studies approaching the subject of intellectual capital evaluation at the international level? The collection and analysis of intellectual capital evaluation models and non-participative observation are the main instruments used to bring to the forefront the main existing international evaluation frameworks. The information sources for this research consist mainly of articles published in specialized journals in both the accounting and economics fields, specialized works relevant to the reference field, legislative documents, official documents, press releases and other documents issued by various national and international bodies. The most representative studies on the evaluation of intellectual capital are those elaborated by Mouritsen et al. (2001), Manea and Gorgan (2003), Tayles (2002) and Tayles et al. (2007). The presented approaches offer a general idea of the range of methods, disciplines and operational specializations existing for the evaluation of intellectual capital. Only one of them - the Balanced Scorecard - is widely used, while the rest of the methods remain too theoretical or too poorly developed to be universally accepted.
We believe that

  15. Renormalization of the nonlinear O(3) model with θ-term

    Energy Technology Data Exchange (ETDEWEB)

    Flore, Raphael, E-mail: raphael.flore@uni-jena.de [Theoretisch-Physikalisches Institut, Friedrich-Schiller-Universität Jena, Max-Wien-Platz 1, D-07743 Jena (Germany)

    2013-05-11

    The renormalization of the topological term in the two-dimensional nonlinear O(3) model is studied by means of the Functional Renormalization Group. By considering the topological charge as a limit of a more general operator, it is shown that a finite multiplicative renormalization occurs in the extreme infrared. In order to compute the effects of the zero modes, a specific representation of the Clifford algebra is developed which makes it possible to reformulate the bosonic problem in terms of Dirac operators and to employ the index theorem.
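
    For reference, the action and the topological charge in question take the standard form for a unit-vector field (|φ| = 1):

```latex
S[\vec{\phi}] \;=\; \frac{1}{2g}\int d^2x\;
  \partial_\mu \vec{\phi}\cdot\partial_\mu \vec{\phi}
  \;+\; i\,\theta\, Q[\vec{\phi}],
\qquad
Q[\vec{\phi}] \;=\; \frac{1}{8\pi}\int d^2x\;
  \epsilon^{\mu\nu}\,\vec{\phi}\cdot
  \bigl(\partial_\mu \vec{\phi}\times\partial_\nu \vec{\phi}\bigr)
```

    Q takes integer values on smooth field configurations, so the θ-term weights the topological sectors of the path integral by a phase exp(iθQ).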

  16. Long short-term memory neural network for air pollutant concentration predictions: Method development and evaluation.

    Science.gov (United States)

    Li, Xiang; Peng, Ling; Yao, Xiaojing; Cui, Shaolong; Hu, Yuan; You, Chengzeng; Chi, Tianhe

    2017-12-01

    Air pollutant concentration forecasting is an effective method of protecting public health by providing an early warning against harmful air pollutants. However, existing methods of air pollutant concentration prediction fail to effectively model long-term dependencies, and most neglect spatial correlations. In this paper, a novel long short-term memory neural network extended (LSTME) model that inherently considers spatiotemporal correlations is proposed for air pollutant concentration prediction. Long short-term memory (LSTM) layers were used to automatically extract inherent useful features from historical air pollutant data, and auxiliary data, including meteorological data and time stamp data, were merged into the proposed model to enhance the performance. Hourly PM 2.5 (particulate matter with an aerodynamic diameter less than or equal to 2.5 μm) concentration data collected at 12 air quality monitoring stations in Beijing City from Jan/01/2014 to May/28/2016 were used to validate the effectiveness of the proposed LSTME model. Experiments were performed using the spatiotemporal deep learning (STDL) model, the time delay neural network (TDNN) model, the autoregressive moving average (ARMA) model, the support vector regression (SVR) model, and the traditional LSTM NN model, and a comparison of the results demonstrated that the LSTME model is superior to the other statistics-based models. Additionally, the use of auxiliary data improved model performance. For the one-hour prediction tasks, the proposed model performed well and exhibited a mean absolute percentage error (MAPE) of 11.93%. In addition, we conducted multiscale predictions over different time spans and achieved satisfactory performance, even for 13-24 h prediction tasks (MAPE = 31.47%). Copyright © 2017 Elsevier Ltd. All rights reserved.
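
    The MAPE metric used to score the predictions above is straightforward to compute; the PM2.5 numbers below are made up for illustration:

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, in percent; actual values must be nonzero."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# Hypothetical hourly PM2.5 observations and forecasts (ug/m^3)
obs = [35.0, 50.0, 80.0, 40.0]
pred = [38.5, 45.0, 88.0, 42.0]
print(round(float(mape(obs, pred)), 2))  # -> 8.75
```
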

  17. Evaluation of load flow and grid expansion in a unit-commitment and expansion optimization model (SciGRID International Conference on Power Grid Modelling)

    Science.gov (United States)

    Senkpiel, Charlotte; Biener, Wolfgang; Shammugam, Shivenes; Längle, Sven

    2018-02-01

    Energy system models serve as a basis for long-term system planning. Joint optimization of electricity generating technologies, storage systems and the electricity grid leads to lower total system cost than an approach in which grid expansion follows a given technology portfolio and its distribution. Modelers often face the problem of finding a good trade-off between computational time and the level of detail that can be modeled. This paper analyses the differences between a transport model and a DC load flow model to evaluate, in terms of system reliability, the validity of using a simple but faster transport model within the system optimization model. The main findings are that a higher regional resolution of the system leads to better results than an approach in which regions are clustered, as more overloads can be detected. Aggregating the lines between two model regions, compared to a line-sharp representation, has little influence on grid expansion within a system optimizer. In a DC load flow model, overloads can be detected in the line-sharp case, which is therefore preferred. Overall, the regions that need grid reinforcement are identified within the system optimizer. Finally, the paper recommends using a load-flow model to test the validity of the model results.
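
    The difference between the two grid representations can be illustrated with a minimal DC load flow on a hypothetical 3-bus network: a transport model would only enforce nodal balance and line limits, whereas the DC approximation ties every line flow to the angle difference across it. All numbers are invented for illustration:

```python
import numpy as np

# Lines: (from bus, to bus, susceptance b) in per unit; bus 0 is the slack.
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 10.0)]
n = 3
P = np.array([0.0, 1.0, -1.0])   # net injections: bus 1 generates, bus 2 consumes

# Build the susceptance matrix for the DC approximation: P = B @ theta
B = np.zeros((n, n))
for i, j, b in lines:
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

# Fix the slack angle to 0 and solve the reduced system for the remaining angles
theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

# Line flows f_ij = b * (theta_i - theta_j), dictated by Kirchhoff's laws
flows = {(i, j): float(b * (theta[i] - theta[j])) for i, j, b in lines}
print({k: round(v, 3) for k, v in flows.items()})
```

    Here the injection splits 2/3 over the direct line and 1/3 over the detour, a pattern a pure transport model would not reproduce.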

  18. A Fuzzy Comprehensive Evaluation Model for Sustainability Risk Evaluation of PPP Projects

    Directory of Open Access Journals (Sweden)

    Libiao Bai

    2017-10-01

    Evaluating the sustainability risk level of public–private partnership (PPP) projects can reduce project risk incidents and achieve the sustainable development of the organization. However, existing studies of PPP project risk management mainly focus on exploring the impact of financial and revenue risks but ignore sustainability risks, causing the concept of "sustainability" to be missing when the risk level of PPP projects is evaluated. To evaluate the sustainability risk level and, most importantly, to provide a reference for the public and private sectors when making decisions on PPP project management, this paper constructs a factor system for the sustainability risk of PPP projects based on an extensive literature review, and develops a mathematical model combining the fuzzy comprehensive evaluation model (FCEM) with failure mode, effects and criticality analysis (FMECA) for evaluating the sustainability risk level of PPP projects. In addition, this paper conducts a computational experiment based on a questionnaire survey to verify the effectiveness and feasibility of the proposed model. The results suggest that this model is reasonable for evaluating the sustainability risk level of PPP projects. To our knowledge, this paper is the first study to evaluate the sustainability risk of PPP projects, which would not only enrich the theory of project risk management, but also serve as a reference for the public and private sectors in sustainable planning and development. Keywords: sustainability risk eva
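
    The FCEM aggregation step can be sketched in a few lines. The factors, rating scale, weights and membership degrees below are invented for illustration and are not taken from the paper's survey:

```python
import numpy as np

# Hypothetical FCEM sketch: 3 risk factors rated over 4 levels.
# R[i, j] = degree to which factor i belongs to rating j (rows sum to 1,
# e.g. the share of survey respondents choosing that rating).
R = np.array([
    [0.1, 0.4, 0.4, 0.1],   # financial sustainability risk
    [0.3, 0.5, 0.2, 0.0],   # environmental risk
    [0.2, 0.3, 0.3, 0.2],   # social/organizational risk
])
w = np.array([0.5, 0.2, 0.3])   # factor weights (sum to 1)

b = w @ R                       # composite membership over the rating levels
b = b / b.sum()                 # normalize
ratings = ["low", "medium", "high", "critical"]
print({r: float(round(v, 3)) for r, v in zip(ratings, b)},
      "->", ratings[int(b.argmax())])
```

    The maximum-membership principle then assigns the project the rating with the largest composite degree.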

  19. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash–Sutcliffe model performance statistic could not discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against a dataset independent of that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations.
(3) Model structural inadequacies, whereby model structure may inadequately represent
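
    The Nash–Sutcliffe statistic that the checklist flags as undiscriminating is simple to compute; a minimal sketch (the data values are invented for illustration):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
    does no better than predicting the mean of the observations."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

daily_obs = [1.0, 2.0, 3.0, 4.0]              # e.g. daily P concentrations (invented)
print(nash_sutcliffe(daily_obs, daily_obs))   # perfect match -> 1.0
print(nash_sutcliffe(daily_obs, [2.5] * 4))   # predicting the mean -> 0.0
```

    Because any simulation that tracks the observed mean scores near zero or above, very different parameterisations can score similarly, which is the discrimination problem the study describes.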

  20. Coarse Grid Modeling of Turbine Film Cooling Flows Using Volumetric Source Terms

    Science.gov (United States)

    Heidmann, James D.; Hunter, Scott D.

    2001-01-01

    The recent trend in numerical modeling of turbine film cooling flows has been toward higher fidelity grids and more complex geometries. This trend has been enabled by the rapid increase in computing power available to researchers. However, the turbine design community requires fast turnaround time in its design computations, rendering these comprehensive simulations ineffective in the design cycle. The present study describes a methodology for implementing a volumetric source term distribution in a coarse grid calculation that can model the small-scale and three-dimensional effects present in turbine film cooling flows. This model could be implemented in turbine design codes or in multistage turbomachinery codes such as APNASA, where the computational grid size may be larger than the film hole size. Detailed computations of a single row of 35 deg round holes on a flat plate have been obtained for blowing ratios of 0.5, 0.8, and 1.0, and density ratios of 1.0 and 2.0 using a multiblock grid system to resolve the flows on both sides of the plate as well as inside the hole itself. These detailed flow fields were spatially averaged to generate a field of volumetric source terms for each conservative flow variable. Solutions were also obtained using three coarse grids having streamwise and spanwise grid spacings of 3d, 1d, and d/3. These coarse grid solutions used the integrated hole exit mass, momentum, energy, and turbulence quantities from the detailed solutions as volumetric source terms. It is shown that a uniform source term addition over a distance from the wall on the order of the hole diameter is able to predict adiabatic film effectiveness better than a near-wall source term model, while strictly enforcing correct values of integrated boundary layer quantities.
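
    The averaging step described above, integrating a detailed hole-exit quantity and redistributing it as a uniform volumetric source over roughly one hole diameter from the wall, can be sketched as follows; every numerical value is invented for illustration and only the bookkeeping mirrors the abstract:

```python
import numpy as np

# Toy conversion of an integrated hole-exit mass flow into uniform volumetric
# sources on a coarse grid (all numbers are illustrative, not from the paper).
m_dot = 1.2e-4         # kg/s, integrated coolant mass flow through one hole
d = 5.0e-4             # m, film-hole diameter
cell_dz = 1.0e-4       # m, wall-normal spacing of the coarse grid
cell_volume = 1.0e-12  # m^3, volume of one coarse cell

n_cells = int(np.ceil(d / cell_dz))       # spread over ~one diameter from the wall
S_mass = m_dot / (n_cells * cell_volume)  # kg/(m^3 s) added to each cell

# The cell sources must integrate back to the hole's mass flow.
assert abs(S_mass * n_cells * cell_volume - m_dot) < 1e-12
print(n_cells, S_mass)
```

    The same bookkeeping would be repeated for the momentum, energy, and turbulence quantities so that the coarse grid strictly conserves the integrated boundary-layer quantities of the detailed solution.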

  1. Long-term modelling for estimation of man-induced environmental risks

    International Nuclear Information System (INIS)

    Mahura, A.; Baklanov, A.; Sorensen, J.H.; Tridvornov, A.

    2006-01-01

    Full text: As a part of the EU coordination action project ENVIRO-RISKS the long-term regional and transboundary atmospheric transport, dispersion, and deposition of atmospheric pollutants is investigated. Focus is on pollutants originating at nuclear and chemical risk sites of the NIS countries. Potential sources of atmospheric pollution include chemical and metallurgical enterprises and smelters, former testing polygons of nuclear weapons, and nuclear plants and facilities. These are situated within territories of Kazakhstan, Ukraine, and Russia (the Siberian, Ural, Krasnoyarsk, and Kola regions). The atmospheric pollutants considered are radionuclides such as Cs-137, I-131, and Sr-90 as well as sulphates and heavy metals. The Danish Emergency Response Model of the Atmosphere (DERMA) is employed for simulations using 3D meteorological fields from the European Centre for Medium-Range Weather Forecasts. The modeled concentration and deposition fields of atmospheric pollutants are used as input into further collaborative studies to estimate the man-induced environmental risks from regionally and remotely located potential sources with a focus on Siberian territories. The temporal and spatial variability of these fields and the probabilities and contribution of removal during atmospheric transport are evaluated. Possible approaches for further GIS integration and use of obtained results are discussed with respect to estimation of man-received doses and risks, impact on environment with a focus on forests, applicability for integrated systems for regional environmental monitoring and management, and others. (author)

  2. Evaluation of Life Sciences and Social Sciences Course Books in Term of Societal Sexuality

    Science.gov (United States)

    Aykac, Necdet

    2012-01-01

    This study aims to evaluate primary school Life Sciences (1st, 2nd, and 3rd grades) and Social Sciences (4th, 5th, and 6th grades) course books in terms of gender discrimination. This study is a descriptive study aiming to evaluate the primary school Life Sciences (1st, 2nd, 3rd grades) and Social Sciences (4th, 5th, and 6th grades) course books…

  3. The Schwinger term and the Berry phase in simple models

    International Nuclear Information System (INIS)

    Grosse, H.

    1989-01-01

    We discuss quantization of fermions interacting with external fields and observe the occurrence of equivalent as well as inequivalent representations of the canonical anticommutation relations. Implementability of gauge and axial gauge transformations leads to generators which fulfill an algebra of charges with Schwinger term. This term can be written as a cocycle and leads to the boson-fermion correspondence. During an adiabatic transport along closed loops in a parameter space we may pick up a nonintegrable phase factor, usually called the Berry phase. We study the occurrence of such a topological phase in a model and give the parallel transport for density matrices. After second quantization one may pick up both a Berry phase and a Schwinger term. 13 refs. (Author)

  4. Remembering over the short-term: the case against the standard model.

    Science.gov (United States)

    Nairne, James S

    2002-01-01

    Psychologists often assume that short-term storage is synonymous with activation, a mnemonic property that keeps information in an immediately accessible form. Permanent knowledge is activated, as a result of on-line cognitive processing, and an activity trace is established "in" short-term (or working) memory. Activation is assumed to decay spontaneously with the passage of time, so a refreshing process (rehearsal) is needed to maintain availability. Most of the phenomena of immediate retention, such as capacity limitations and word length effects, are assumed to arise from trade-offs between rehearsal and decay. This "standard model" of how we remember over the short-term still enjoys considerable popularity, although recent research questions most of its main assumptions. In this chapter I review the recent research and identify the empirical and conceptual problems that plague traditional conceptions of short-term memory. Increasingly, researchers are recognizing that short-term retention is cue driven, much like long-term memory, and that neither rehearsal nor decay is likely to explain the particulars of short-term forgetting.

  5. An Evaluation of Teachers' Attitudes and Beliefs Levels on Classroom Control in Terms of Teachers' Sense of Efficacy (The Sample of Biology Teachers in Turkey)

    Science.gov (United States)

    Kurt, Hakan

    2014-01-01

    The aim of this study is to evaluate biology teachers' attitudes and belief levels on classroom control in terms of teachers' sense of efficacy. The screening model was used in the study. The study group was comprised of 135 biology teachers. In this study, Teachers' Sense of Efficacy Scale (TSES) and The Attitudes and Beliefs on Classroom Control…

  6. Estimating Multivariate Exponential-Affine Term Structure Models from Coupon Bond Prices using Nonlinear Filtering

    DEFF Research Database (Denmark)

    Baadsgaard, Mikkel; Nielsen, Jan Nygaard; Madsen, Henrik

    2000-01-01

    An econometric analysis of continuous-time models of the term structure of interest rates is presented. A panel of coupon bond prices with different maturities is used to estimate the embedded parameters of a continuous-discrete state space model of unobserved state variables: the spot interest rate...... noise term should account for model errors. A nonlinear filtering method is used to compute estimates of the state variables, and the model parameters are estimated by a quasi-maximum likelihood method provided that some assumptions are imposed on the model residuals. Both Monte Carlo simulation results...
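
    A minimal sketch of the filtering idea, here reduced to an extended Kalman filter for a one-factor Vasicek short rate observed through a single zero-coupon bond price. The abstract's model is multivariate and estimated on coupon bonds, so this is only the one-dimensional skeleton, and all parameter values are illustrative:

```python
import numpy as np

# Illustrative Vasicek parameters and a 1-year zero-coupon bond observation.
kappa, theta, sigma, dt = 0.5, 0.03, 0.01, 1.0 / 252
tau = 1.0
B = (1 - np.exp(-kappa * tau)) / kappa
A = np.exp((theta - sigma**2 / (2 * kappa**2)) * (B - tau)
           - sigma**2 * B**2 / (4 * kappa))
R = 1e-6  # measurement-noise variance, standing in for model error

def ekf_step(r_est, P_est, price_obs):
    # Predict: exact discretization of the Ornstein-Uhlenbeck short rate.
    r_pred = theta + (r_est - theta) * np.exp(-kappa * dt)
    Q = sigma**2 * (1 - np.exp(-2 * kappa * dt)) / (2 * kappa)
    P_pred = np.exp(-2 * kappa * dt) * P_est + Q
    # Update: linearize the bond-price map h(r) = A * exp(-B * r) at r_pred.
    h = A * np.exp(-B * r_pred)
    H = -B * h                      # dh/dr
    K = P_pred * H / (H * P_pred * H + R)
    r_new = r_pred + K * (price_obs - h)
    P_new = (1 - K * H) * P_pred
    return r_new, P_new

r, P = 0.02, 1e-4                   # initial state estimate and variance
r, P = ekf_step(r, P, price_obs=0.97)
print(r, P)
```

    In a quasi-maximum likelihood estimation the innovations and their variances from each such step would be accumulated into the likelihood being maximized over the model parameters.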

  7. Advanced Monte Carlo procedure for the IFMIF d-Li neutron source term based on evaluated cross section data

    International Nuclear Information System (INIS)

    Simakov, S.P.; Fischer, U.; Moellendorff, U. von; Schmuck, I.; Konobeev, A.Yu.; Korovin, Yu.A.; Pereslavtsev, P.

    2002-01-01

    A newly developed computational procedure is presented for the generation of d-Li source neutrons in Monte Carlo transport calculations based on the use of evaluated double-differential d+⁶,⁷Li cross section data. A new code McDeLicious was developed as an extension to MCNP4C to enable neutronics design calculations for the d-Li based IFMIF neutron source making use of the evaluated deuteron data files. The McDeLicious code was checked against available experimental data and calculation results of McDeLi and MCNPX, both of which use built-in analytical models for the Li(d, xn) reaction. It is shown that McDeLicious along with newly evaluated d+⁶,⁷Li data is superior in predicting the characteristics of the d-Li neutron source. As this approach makes use of tabulated Li(d, xn) cross sections, the accuracy of the IFMIF d-Li neutron source term can be steadily improved with more advanced and validated data
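
    In essence, the tabulated approach samples source-neutron variables by inverse-CDF lookup in the evaluated tables rather than from a closed-form model. A toy sketch of such a sampler, with an invented energy grid and weights (not actual d+Li data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented tabulated spectrum: bin edges (MeV) and per-bin probabilities.
E_grid = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
pdf = np.array([0.10, 0.40, 0.35, 0.15])
cdf = np.concatenate([[0.0], np.cumsum(pdf)])

def sample_energy(n):
    """Inverse-CDF sampling: pick a bin, then interpolate linearly inside it."""
    u = rng.random(n)
    idx = np.searchsorted(cdf, u, side="right") - 1
    frac = (u - cdf[idx]) / pdf[idx]
    return E_grid[idx] + frac * (E_grid[idx + 1] - E_grid[idx])

samples = sample_energy(10000)
print(samples.mean())
```

    A real d-Li source term would sample energy and angle jointly from the double-differential tables, but the lookup mechanics are the same, which is why better-evaluated tables directly improve the source term.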

  8. Advanced Monte Carlo procedure for the IFMIF d-Li neutron source term based on evaluated cross section data

    CERN Document Server

    Simakov, S P; Moellendorff, U V; Schmuck, I; Konobeev, A Y; Korovin, Y A; Pereslavtsev, P

    2002-01-01

    A newly developed computational procedure is presented for the generation of d-Li source neutrons in Monte Carlo transport calculations based on the use of evaluated double-differential d+⁶,⁷Li cross section data. A new code McDeLicious was developed as an extension to MCNP4C to enable neutronics design calculations for the d-Li based IFMIF neutron source making use of the evaluated deuteron data files. The McDeLicious code was checked against available experimental data and calculation results of McDeLi and MCNPX, both of which use built-in analytical models for the Li(d, xn) reaction. It is shown that McDeLicious along with newly evaluated d+⁶,⁷Li data is superior in predicting the characteristics of the d-Li neutron source. As this approach makes use of tabulated Li(d, xn) cross sections, the accuracy of the IFMIF d-Li neutron source term can be steadily improved with more advanced and validated data.

  9. Code-switched English pronunciation modeling for Swahili spoken term detection

    CSIR Research Space (South Africa)

    Kleynhans, N

    2016-05-01

    Full Text Available Procedia Computer Science 81 (2016) 128–135; 5th Workshop on Spoken Language Technology for Under-resourced Languages (SLTU 2016), 9–12 May 2016, Yogyakarta, Indonesia. Code-switched English Pronunciation Modeling for Swahili Spoken Term Detection. Neil...

  10. Higher order spin-dependent terms in D0-brane scattering from the matrix model

    International Nuclear Information System (INIS)

    McArthur, I.N.

    1998-01-01

    The potential describing long-range interactions between D0-branes contains spin-dependent terms. In the matrix model, these should be reproduced by the one-loop effective action computed in the presence of a non-trivial fermionic background ψ. The v³ψ²/r⁸ term in the effective action has been computed by Kraus and shown to correspond to a spin-orbit interaction between D0-branes, and the ψ⁸/r¹¹ term in the static potential has been obtained by Barrio et al. In this paper, the v²ψ⁴/r⁹ term is computed in the matrix model and compared with the corresponding results of Morales et al. obtained using string theoretic methods. The technique employed is adapted to the underlying supersymmetry of the matrix model, and should be useful in the calculation of spin-dependent effects in more general Dp-brane scatterings. (orig.)

  11. Educational game models: conceptualization and evaluation ...

    African Journals Online (AJOL)

    Educational game models: conceptualization and evaluation. ... The Game Object Model (GOM), which marries educational theory and game design, forms the basis for the development of the Persona Outlining ...

  12. A logic model framework for evaluation and planning in a primary care practice-based research network (PBRN)

    Science.gov (United States)

    Hayes, Holly; Parchman, Michael L.; Howard, Ray

    2012-01-01

    Evaluating effective growth and development of a Practice-Based Research Network (PBRN) can be challenging. The purpose of this article is to describe the development of a logic model and how the framework has been used for planning and evaluation in a primary care PBRN. An evaluation team was formed consisting of the PBRN directors, staff and its board members. After the mission and the target audience were determined, facilitated meetings and discussions were held with stakeholders to identify the assumptions, inputs, activities, outputs, outcomes and outcome indicators. The long-term outcomes outlined in the final logic model are two-fold: 1.) Improved health outcomes of patients served by PBRN community clinicians; and 2.) Community clinicians are recognized leaders of quality research projects. The Logic Model proved useful in identifying stakeholder interests and dissemination activities as an area that required more attention in the PBRN. The logic model approach is a useful planning tool and project management resource that increases the probability that the PBRN mission will be successfully implemented. PMID:21900441

  13. Evaluation of a Mathematical Model of Rat Body Weight Regulation in Application to Caloric Restriction and Drug Treatment Studies.

    Science.gov (United States)

    Selimkhanov, Jangir; Thompson, W Clayton; Patterson, Terrell A; Hadcock, John R; Scott, Dennis O; Maurer, Tristan S; Musante, Cynthia J

    2016-01-01

    The purpose of this work is to develop a mathematical model of energy balance and body weight regulation that can predict species-specific response to common pre-clinical interventions. To this end, we evaluate the ability of a previously published mathematical model of mouse metabolism to describe changes in body weight and body composition in rats in response to two short-term interventions. First, we adapt the model to describe body weight and composition changes in Sprague-Dawley rats by fitting to data previously collected from a 26-day caloric restriction study. The calibrated model is subsequently used to describe changes in rat body weight and composition in a 23-day cannabinoid receptor 1 antagonist (CB1Ra) study. While the model describes body weight data well, it fails to replicate body composition changes with CB1Ra treatment. Evaluation of a key model assumption about deposition of fat and fat-free masses shows a limitation of the model in short-term studies due to the constraint placed on the relative change in body composition components. We demonstrate that the model can be modified to overcome this limitation, and propose additional measurements to further test the proposed model predictions. These findings illustrate how mathematical models can be used to support drug discovery and development by identifying key knowledge gaps and aiding in the design of additional experiments to further our understanding of disease-relevant and species-specific physiology.

  14. Evaluation of a Mathematical Model of Rat Body Weight Regulation in Application to Caloric Restriction and Drug Treatment Studies.

    Directory of Open Access Journals (Sweden)

    Jangir Selimkhanov

    Full Text Available The purpose of this work is to develop a mathematical model of energy balance and body weight regulation that can predict species-specific response to common pre-clinical interventions. To this end, we evaluate the ability of a previously published mathematical model of mouse metabolism to describe changes in body weight and body composition in rats in response to two short-term interventions. First, we adapt the model to describe body weight and composition changes in Sprague-Dawley rats by fitting to data previously collected from a 26-day caloric restriction study. The calibrated model is subsequently used to describe changes in rat body weight and composition in a 23-day cannabinoid receptor 1 antagonist (CB1Ra) study. While the model describes body weight data well, it fails to replicate body composition changes with CB1Ra treatment. Evaluation of a key model assumption about deposition of fat and fat-free masses shows a limitation of the model in short-term studies due to the constraint placed on the relative change in body composition components. We demonstrate that the model can be modified to overcome this limitation, and propose additional measurements to further test the proposed model predictions. These findings illustrate how mathematical models can be used to support drug discovery and development by identifying key knowledge gaps and aiding in the design of additional experiments to further our understanding of disease-relevant and species-specific physiology.
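
    The constraining assumption the authors identify, a fixed partition of any energy imbalance between fat and fat-free mass, can be illustrated with a toy two-compartment daily step. The partition fraction and body-composition numbers below are assumed for illustration; the tissue energy densities are commonly used literature values, not taken from this paper:

```python
# Toy two-compartment energy-balance model (all values illustrative).
rho_f, rho_ffm = 39.5e3, 7.6e3   # kJ/kg, fat and fat-free tissue energy densities
p = 0.2                          # fixed fraction of imbalance going to fat-free mass

def step(F, FFM, intake_kj, expenditure_kj):
    """Advance fat mass F and fat-free mass FFM (kg) by one day."""
    imbalance = intake_kj - expenditure_kj
    return F + (1 - p) * imbalance / rho_f, FFM + p * imbalance / rho_ffm

F, FFM = 0.05, 0.25              # kg, illustrative rat body composition
for _ in range(23):              # 23 days with a constant 50 kJ/day deficit
    F, FFM = step(F, FFM, intake_kj=250.0, expenditure_kj=300.0)
print(F, FFM)
```

    With p fixed, the two compartments can only change in a locked ratio, which is exactly why such a model cannot reproduce a treatment that shifts body composition without a matching weight change.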

  15. A Spiking Working Memory Model Based on Hebbian Short-Term Potentiation

    Science.gov (United States)

    Fiebig, Florian

    2017-01-01

    A dominant theory of working memory (WM), referred to as the persistent activity hypothesis, holds that recurrently connected neural networks, presumably located in the prefrontal cortex, encode and maintain WM items through sustained elevated activity. Reexamination of experimental data has shown that prefrontal cortex activity in single units during delay periods is much more variable than predicted by such a theory and associated computational models. Alternative models of WM maintenance based on synaptic plasticity, such as short-term nonassociative (non-Hebbian) synaptic facilitation, have been suggested but cannot account for encoding of novel associations. Here we test the hypothesis that a recently identified fast-expressing form of Hebbian synaptic plasticity (associative short-term potentiation) is a possible mechanism for WM encoding and maintenance. Our simulations using a spiking neural network model of cortex reproduce a range of cognitive memory effects in the classical multi-item WM task of encoding and immediate free recall of word lists. Memory reactivation in the model occurs in discrete oscillatory bursts rather than as sustained activity. We relate dynamic network activity as well as key synaptic characteristics to electrophysiological measurements. Our findings support the hypothesis that fast Hebbian short-term potentiation is a key WM mechanism. SIGNIFICANCE STATEMENT Working memory (WM) is a key component of cognition. Hypotheses about the neural mechanism behind WM are currently under revision. Reflecting recent findings of fast Hebbian synaptic plasticity in cortex, we test whether a cortical spiking neural network model with such a mechanism can learn a multi-item WM task (word list learning). We show that our model can reproduce human cognitive phenomena and achieve comparable memory performance in both free and cued recall while being simultaneously compatible with experimental data on structure, connectivity, and

  16. The regional climate model as a tool for long-term planning of Quebec water resources

    International Nuclear Information System (INIS)

    Frigon, A.

    2008-01-01

    'Full text': In recent years, important progress has been made in downscaling GCM (Global Climate Model) projections to a resolution at which hydrological studies become feasible. Climate change simulations performed with RCMs (Regional Climate Models) have reached a level of confidence that allows us to take advantage of this information in long-term planning of water resources. The RCMs' main advantage lies in their construction on balanced water and energy budgets for both land and atmosphere, and in their inclusion of feedbacks between the surface and the atmosphere. Such models therefore generate sequences of weather events, providing long, internally consistent time series of hydro-climatic variables that allow the analysis of hydrologic regimes. At OURANOS, special attention is placed on the hydrological cycle, given its key role in socioeconomic activities. The Canadian Regional Climate Model (CRCM) was developed as a potential tool to provide climate projections at the watershed scale. Various analyses performed over small basins in Quebec provide information on the level of confidence we have in the CRCM for use in hydrological studies. Even though this approach is not free of uncertainty, some water resource managers have found it useful, and hence this information should be considered. One of the keys to retaining usefulness despite the associated uncertainties is to use more than a single regional climate projection, which allows the evaluation of the climate change signal and its associated level of confidence. Such a methodology is already applied by Hydro-Quebec in the long-term planning of its water resources for hydroelectric generation over the Quebec territory. (author)

  17. Numeric simulation model for long-term orthodontic tooth movement with contact boundary conditions using the finite element method.

    Science.gov (United States)

    Hamanaka, Ryo; Yamaoka, Satoshi; Anh, Tuan Nguyen; Tominaga, Jun-Ya; Koga, Yoshiyuki; Yoshida, Noriaki

    2017-11-01

    Although many attempts have been made to simulate orthodontic tooth movement using the finite element method, most were limited to analyses of the initial displacement in the periodontal ligament and were insufficient to evaluate the effect of orthodontic appliances on long-term tooth movement. Numeric simulation of long-term tooth movement was performed in some studies; however, neither the play between the brackets and archwire nor the interproximal contact forces were considered. The objectives of this study were to simulate long-term orthodontic tooth movement with the edgewise appliance by incorporating those contact conditions into the finite element model and to determine the force system when the space is closed with sliding mechanics. We constructed a 3-dimensional model of maxillary dentition with 0.022-in brackets and 0.019 × 0.025-in archwire. Forces of 100 cN simulating sliding mechanics were applied. The simulation was accomplished on the assumption that bone remodeling correlates with the initial tooth displacement. This method could successfully represent the changes in the moment-to-force ratio and the tooth movement pattern during space closure. We developed a novel method that could simulate the long-term orthodontic tooth movement and accurately determine the force system in the course of time by incorporating contact boundary conditions into finite element analysis. It was also suggested that friction is progressively increased during space closure in sliding mechanics. Copyright © 2017. Published by Elsevier Inc.
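
    The remodeling assumption, each increment of long-term movement taken proportional to the instantaneous elastic displacement that the finite element solve returns, can be caricatured by replacing the FE solve with a single linear spring; every number below is illustrative:

```python
# Toy iterative remodeling loop (the FE solve is replaced by a linear spring).
k = 20.0        # N/mm, toy stiffness standing in for the periodontal FE model
force = 1.0     # N, i.e. the 100 cN sliding-mechanics force from the abstract
alpha = 0.5     # remodeling gain per iteration (assumed)

position = 0.0  # accumulated long-term movement, mm
for _ in range(30):
    u_initial = force / k           # "initial displacement" from the toy solve
    position += alpha * u_initial   # bone remodels in proportion to it
print(position)
```

    In the actual study each iteration is a full contact FE solve, so the effective stiffness, and hence the increment, changes as brackets, archwire, and neighboring teeth engage, which is how the evolving moment-to-force ratio emerges.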

  18. Construction method and application of 3D velocity model for evaluation of strong seismic motion and its cost performance

    International Nuclear Information System (INIS)

    Matsuyama, Hisanori; Fujiwara, Hiroyuki

    2014-01-01

    Based on experience gained in constructing subsurface structure models for evaluation of strong seismic motion, the advantages and disadvantages of several modeling methods were reported in terms of convenience and cost. In detail, gravity and micro-tremor surveys were considered highly effective on both counts. However, accurate 3-D subsurface structure models also require stratigraphy and seismic velocity structure, which call for methods that directly examine the subsurface ground or that use controlled tremor sources (at high cost). It was therefore concluded that a modeling plan should combine both types of methods, matched to the intended purpose and budget. (authors)

  19. Evaluating strategies for sustainable intensification of US agriculture through the Long-Term Agroecosystem Research network

    Science.gov (United States)

    Spiegal, S.; Bestelmeyer, B. T.; Archer, D. W.; Augustine, D. J.; Boughton, E. H.; Boughton, R. K.; Cavigelli, M. A.; Clark, P. E.; Derner, J. D.; Duncan, E. W.; Hapeman, C. J.; Harmel, R. D.; Heilman, P.; Holly, M. A.; Huggins, D. R.; King, K.; Kleinman, P. J. A.; Liebig, M. A.; Locke, M. A.; McCarty, G. W.; Millar, N.; Mirsky, S. B.; Moorman, T. B.; Pierson, F. B.; Rigby, J. R.; Robertson, G. P.; Steiner, J. L.; Strickland, T. C.; Swain, H. M.; Wienhold, B. J.; Wulfhorst, J. D.; Yost, M. A.; Walthall, C. L.

    2018-03-01

    Sustainable intensification is an emerging model for agriculture designed to reconcile accelerating global demand for agricultural products with long-term environmental stewardship. Defined here as increasing agricultural production while maintaining or improving environmental quality, sustainable intensification hinges upon decision-making by agricultural producers, consumers, and policy-makers. The Long-Term Agroecosystem Research (LTAR) network was established to inform these decisions. Here we introduce the LTAR Common Experiment, through which scientists and partnering producers in US croplands, rangelands, and pasturelands are conducting 21 independent but coordinated experiments. Each local effort compares the outcomes of a predominant, conventional production system in the region (‘business as usual’) with a system hypothesized to advance sustainable intensification (‘aspirational’). Following the logic of a conceptual model of interactions between agriculture, economics, society, and the environment, we identified commonalities among the 21 experiments in terms of (a) concerns about business-as-usual production, (b) ‘aspirational outcomes’ motivating research into alternatives, (c) strategies for achieving the outcomes, (d) practices that support the strategies, and (e) relationships between practice outreach and adoption. Network-wide, concerns about business as usual include the costs of inputs, opportunities lost to uniform management approaches, and vulnerability to accelerating environmental changes. Motivated by environmental, economic, and societal outcomes, scientists and partnering producers are investigating 15 practices in aspirational treatments to sustainably intensify agriculture, from crop diversification to ecological restoration. Collectively, the aspirational treatments reveal four general strategies for sustainable intensification: (1) reducing reliance on inputs through ecological intensification, (2) diversifying management

  20. Time-Dependent Global Sensitivity Analysis for Long-Term Degeneracy Model Using Polynomial Chaos

    Directory of Open Access Journals (Sweden)

    Jianbin Guo

    2014-07-01

    Full Text Available Global sensitivity is used to quantify the influence of uncertain model inputs on the output variability of static models in general. However, very few approaches can be applied for the sensitivity analysis of long-term degeneracy models, as far as time-dependent reliability is concerned. The reason is that the static sensitivity may not reflect the complete sensitivity over the entire life cycle. This paper presents time-dependent global sensitivity analysis for long-term degeneracy models based on polynomial chaos expansion (PCE). Sobol’ indices are employed as the time-dependent global sensitivity measure since they provide accurate information on the selected uncertain inputs. In order to compute Sobol’ indices more efficiently, this paper proposes a moving least squares (MLS) method to obtain the time-dependent PCE coefficients with acceptable simulation effort. Then Sobol’ indices can be calculated analytically as a postprocessing of the time-dependent PCE coefficients with almost no additional cost. A test case is used to show how to conduct the proposed method, then this approach is applied to an engineering case, and the time-dependent global sensitivity is obtained for the long-term degeneracy mechanism model.
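
    The "almost no additional cost" post-processing rests on a standard identity: for an orthonormal PCE basis, variance contributions are sums of squared coefficients. A sketch with invented coefficients and multi-indices (two input variables):

```python
import numpy as np

# Each row of multi_index gives the polynomial degree per input variable for
# one PCE term; coeffs would come from the least-squares (or MLS) fit.
multi_index = np.array([[0, 0], [1, 0], [0, 1], [2, 0], [1, 1]])
coeffs = np.array([1.0, 0.8, 0.3, 0.1, 0.05])   # invented for illustration

var_total = np.sum(coeffs[1:] ** 2)   # total variance excludes the mean term

S = []
for i in range(multi_index.shape[1]):
    # First-order Sobol' index: terms depending on variable i alone.
    only_i = (multi_index[:, i] > 0) & (multi_index.sum(axis=1) == multi_index[:, i])
    S.append(np.sum(coeffs[only_i] ** 2) / var_total)
print(S)
```

    Repeating this at each time point with the time-dependent coefficients yields the time-dependent Sobol' indices directly, with no extra model runs.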

  1. A mixed multiscale model better accounting for the cross term of the subgrid-scale stress and for backscatter

    Science.gov (United States)

    Thiry, Olivier; Winckelmans, Grégoire

    2016-02-01

    In the large-eddy simulation (LES) of turbulent flows, models are used to account for the subgrid-scale (SGS) stress. We here consider LES with "truncation filtering only" (i.e., that due to the LES grid), thus without regular explicit filtering added. The SGS stress tensor is then composed of two terms: the cross term that accounts for interactions between resolved scales and unresolved scales, and the Reynolds term that accounts for interactions between unresolved scales. Both terms provide forward- (dissipation) and backward (production, also called backscatter) energy transfer. Purely dissipative, eddy-viscosity type, SGS models are widely used: Smagorinsky-type models, or more advanced multiscale-type models. Dynamic versions have also been developed, where the model coefficient is determined using a dynamic procedure. Being dissipative by nature, those models do not provide backscatter. Even when using the dynamic version with local averaging, one typically uses clipping to forbid negative values of the model coefficient and hence ensure the stability of the simulation, thus removing the backscatter produced by the dynamic procedure. More advanced SGS models are thus desirable: models that better conform to the physics of the true SGS stress while remaining stable. We here investigate, in decaying homogeneous isotropic turbulence, and using a de-aliased pseudo-spectral method, the behavior of the cross term and of the Reynolds term: in terms of dissipation spectra, and in terms of probability density function (pdf) of dissipation in physical space: positive and negative (backscatter). We then develop a new mixed model that better accounts for the physics of the SGS stress and for the backscatter. It has a cross term part which is built using a scale-similarity argument, further combined with a correction for Galilean invariance using a pseudo-Leonard term: this is the term that also does backscatter. It also has an eddy-viscosity multiscale model part that

  2. Development of comprehensive long-term-dry stored Spent Fuel INtegrity EvaLuator [SFINEL] - I

    International Nuclear Information System (INIS)

    Kwon, H. M.; Yang, Y. S.; Kim, Y. S.; You, K. S.; Min, D. K.; No, S. K.

    1999-01-01

    Safe management of spent nuclear fuels is socially, technically, and economically very important in terms of environmental protection and utilization of recyclable resources. One of the most critical parts in the management is to establish a comprehensive monitoring system which can maintain and confirm the integrity of the spent fuels, whenever necessary, until final policy is determined on their treatment and disposal. Especially in the first stage of maturing the system, it is essential to secure a computing tool or code which can evaluate the integrity of the fuel cladding based on its power history and cladding degradation mechanisms. SFINEL code, an integrated computer program for predicting the spent fuel rod integrity based on burn-up history and major degradation mechanisms, has been developed in this research. This code can sufficiently simulate the power history of a fuel rod during the reactor operation and estimate the degree of deterioration of spent fuel cladding using the recently-developed models on the degradation mechanisms

  3. A generalized one-factor term structure model and pricing of interest rate derivative securities

    NARCIS (Netherlands)

    Jiang, George J.

    1997-01-01

    The purpose of this paper is to propose a nonparametric interest rate term structure model and investigate its implications on term structure dynamics and prices of interest rate derivative securities. The nonparametric spot interest rate process is estimated from the observed short-term interest

  4. [Contract focused short-term group therapy--results of an evaluation].

    Science.gov (United States)

    Hirschberg, Rainer; Meyer, Birgit

    2010-01-01

    A short description outlines the development of contract-focused short-term group therapy (AFoG) for children and adolescents. Subsequently the generic principles of psychotherapy are applied to AFoG in order to underline the basic assumptions of this variation of systemic group therapy. Behavioural changes arising in different contexts (school, family, group therapy) show the need for an appropriate flexibility of group therapy techniques. The evaluation was accomplished using the Child Behaviour Checklist (CBCL 4-18) at the beginning and 3 months after the end of the group therapy. The results show positive effects, which are finally discussed critically.

  5. Introduction of a high-throughput double-stent animal model for the evaluation of biodegradable vascular stents.

    Science.gov (United States)

    Borinski, Mauricio; Flege, Christian; Schreiber, Fabian; Krott, Nicole; Gries, Thomas; Liehn, Elisa; Blindt, Rüdiger; Marx, Nikolaus; Vogt, Felix

    2012-11-01

    Current stent system efficacy for the treatment of coronary artery disease is hampered by in-stent restenosis (ISR) rates of up to 20% in certain high-risk settings and by the risk of stent thrombosis, which is characterized by a high mortality rate. In theory, biodegradable vascular devices exhibit crucial advantages. Most absorbable implant materials are based on poly-L-lactic acid (PLLA) owing to its mechanical properties; however, PLLA might induce an inflammatory reaction in the vessel wall. Evaluation of biodegradable implant efficacy includes a long-term examination of tissue response; therefore, a simple in vivo tool for thorough biocompatibility and biodegradation evaluation would facilitate future stent system development. Rats have been used for the study of in vivo degradation processes, and stent implantation into the abdominal aorta of rats is a proven model for stent evaluation. Here, we report the transformation of the porcine double-stent animal model into the high-throughput rat abdominal aorta model. As genetic manipulation of rats was introduced recently, this novel method presents a powerful tool for future in vivo biodegradable candidate stent biocompatibility and biodegradation characterization in a reliable simple model of coronary ISR. Copyright © 2012 Wiley Periodicals, Inc.

  6. Multi-criteria evaluation of hydrological models

    Science.gov (United States)

    Rakovec, Oldrich; Clark, Martyn; Weerts, Albrecht; Hill, Mary; Teuling, Ryan; Uijlenhoet, Remko

    2013-04-01

    Over the last years, there has been a tendency in the hydrological community to move from simple conceptual models towards more complex, physically/process-based hydrological models. This is because conceptual models often fail to simulate the dynamics of the observations. However, there is little agreement on how much complexity needs to be considered within the complex process-based models. One way to proceed is to improve understanding of what is important and unimportant in the models considered. The aim of this ongoing study is to evaluate structural model adequacy using alternative conceptual and process-based models of hydrological systems, with an emphasis on understanding how model complexity relates to observed hydrological processes. Some of the models require considerable execution time, and computationally frugal sensitivity analysis, model calibration and uncertainty quantification methods are well suited to providing important insights for models with lengthy execution times. The current experiment evaluates two versions of the Framework for Understanding Structural Errors (FUSE), both of which enable running model inter-comparison experiments. One supports computationally efficient conceptual models, and the second supports more process-based models that tend to have longer execution times. The conceptual FUSE combines components of 4 existing conceptual hydrological models. The process-based framework consists of different forms of Richards' equation, numerical solutions, groundwater parameterizations and hydraulic conductivity distributions. The hydrological analysis of the model processes has evolved from focusing only on simulated runoff (final model output) to also including other criteria such as soil moisture and groundwater levels. Parameter importance and associated structural importance are evaluated using different types of sensitivity analysis techniques, making use of both robust global methods (e.g. Sobol') as well as several
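
    The global (Sobol'-type) sensitivity analysis mentioned above can be illustrated on a toy model. This sketch applies the Saltelli first-order estimator to an invented two-parameter "leaky bucket" runoff model; the model, parameter ranges, and forcing are assumptions for illustration, not part of FUSE:

```python
import numpy as np

# Hedged sketch: brute-force first-order Sobol indices (Saltelli estimator)
# for a toy 2-parameter bucket model. Everything here is illustrative.
rng = np.random.default_rng(0)
P = rng.gamma(2.0, 2.0, size=60)            # synthetic daily precipitation

def bucket(k, smax):
    """Peak daily runoff: storage spills above smax, drains at rate k."""
    s, qmax = 0.0, 0.0
    for p in P:
        s += p
        spill = max(s - smax, 0.0)
        s -= spill
        drain = k * s
        s -= drain
        qmax = max(qmax, spill + drain)
    return qmax

N = 2048
low, high = [0.05, 5.0], [0.5, 50.0]        # assumed ranges for (k, smax)
A = rng.uniform(low, high, size=(N, 2))
B = rng.uniform(low, high, size=(N, 2))
fA = np.array([bucket(*row) for row in A])
fB = np.array([bucket(*row) for row in B])

S1 = []
for i in range(2):                           # Saltelli estimator for S_i
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    fABi = np.array([bucket(*row) for row in ABi])
    S1.append(np.mean(fB * (fABi - fA)) / np.var(np.concatenate([fA, fB])))
print([round(s, 2) for s in S1])             # first-order index of k, smax
```

A large first-order index flags a parameter (and, by extension, the structural component it controls) as important; frugal variants of this idea are what make such analyses feasible for models with long execution times.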

  7. Short-Term and Medium-Term Reliability Evaluation for Power Systems With High Penetration of Wind Power

    DEFF Research Database (Denmark)

    Ding, Yi; Singh, Chanan; Goel, Lalit

    2014-01-01

    The expanding share of the fluctuating and less predictable wind power generation can introduce complexities in power system reliability evaluation and management. This entails a need for the system operator to assess the system status more accurately for securing real-time balancing. The existing reliability evaluation techniques for power systems are well developed. These techniques are more focused on steady-state (time-independent) reliability evaluation and have been successfully applied in power system planning and expansion. In the operational phase, however, they may be too rough an approximation of the time-varying behavior of power systems with high penetration of wind power. This paper proposes a time-varying reliability assessment technique. Time-varying reliability models for wind farms, conventional generating units, and rapid start-up generating units are developed and represented...
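
    A simple building block for the time-varying reliability models described above is the two-state (up/down) Markov model of a single generating unit. This is a hedged sketch with assumed failure and repair rates, showing why steady-state indices can be a rough approximation over a short operational horizon:

```python
import numpy as np

# Hedged sketch: time-varying availability of one conventional unit as a
# two-state Markov model. Failure/repair rates are illustrative assumptions.
lam = 0.001   # failures per hour
mu = 0.02     # repairs per hour

def availability(t, a0=1.0):
    """P(unit is up at time t), starting from availability a0 at t = 0."""
    a_ss = mu / (lam + mu)                       # steady-state availability
    return a_ss + (a0 - a_ss) * np.exp(-(lam + mu) * t)

t = np.linspace(0, 72, 7)                        # a 3-day operational horizon
a = availability(t)
# Short-term availability starts at 1 (unit known to be up) and only relaxes
# toward the steady-state value -- the gap between a and a_ss is what a
# time-independent evaluation misses in the operational phase.
print(bool(np.all(np.diff(a) < 0)))
```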

  8. Short-term spheroid culture of primary colorectal cancer cells as an in vitro model for personalizing cancer medicine

    DEFF Research Database (Denmark)

    Jeppesen, Maria; Hagel, Grith; Glenthoj, Anders

    2017-01-01

    Chemotherapy treatment of cancer remains a challenge due to the molecular and functional heterogeneity displayed by tumours originating from the same cell type. The pronounced heterogeneity makes it difficult for oncologists to devise an effective therapeutic strategy for the patient. One approach for increasing treatment efficacy is to test the chemosensitivity of cancer cells obtained from the patient's tumour. 3D culture represents a promising method for modelling patient tumours in vitro. The aim of this study was therefore to evaluate how closely short-term spheroid cultures of primary colorectal cancer cells resemble the original tumour. Colorectal cancer cells were isolated from human tumour tissue and cultured as spheroids. Spheroid cultures were established with a high success rate and remained viable for at least 10 days. The spheroids exhibited significant growth over a period of 7 days ... and combinations most commonly used for treatment of colorectal cancer. In summary, short-term spheroid culture of primary colorectal adenocarcinoma cells represents a promising in vitro model for use in personalized medicine.

  9. Management of remanent lifetime. Short-term benefits of the maintenance evaluation and improvement programme

    International Nuclear Information System (INIS)

    Sainero Garcia, J.

    1993-01-01

    Remanent Lifetime Management, which is scientifically based on knowing the degradatory phenomena associated with aging, today allows us to optimize plant life through a long-term maintenance strategy combining preventive maintenance and condition monitoring programmes. Within a project for Remanent Lifetime Management (RLM), the determination of methods of control and mitigation of degradations due to aging depends on the programme of Maintenance Evaluation and Improvement (MEI). This programme, underpinned by the analysis of degradatory phenomena to which plant components are subjected, evaluates current maintenance practices and defines the complementary actions which would facilitate establishment of a long-term strategy to control aging. Together with this main objective of the RLM project, the MEI programme achieves short-term benefits since, right from the beginning, it offers solutions to mitigate and guard against degradations in crucial plant components, and generally sets out a programme to control aging. The MEI programme further serves as a tool to reach the final objectives of the new 10CFR50.65 rule, 'Requirements for Maintenance Programs for NPPs'. The MEI always offers the option should the Utility Owner decide to extend plant life. (author)

  10. Characterization, Long-Term Behavior Evaluation and Thermomechanical Properties of Untreated and Treated Flax Fiber-Reinforced Composites

    Science.gov (United States)

    Amiri, Ali

    In recent years there has been a resurgence of interest in the use of natural fiber reinforced composites in more advanced structural applications. Consequently, the need has arisen for improving their mechanical properties as well as for service-life and long-term behavior modeling and prediction. In a step towards further development of these materials, in this study, two newly developed biobased resins derived from soybean oil, methacrylated epoxidized sucrose soyate and double methacrylated epoxidized sucrose soyate, are combined with untreated and alkaline-treated flax fiber to produce novel biocomposites. Vinyl ester reinforced with flax fiber is used as control, in addition to comparing the properties of biobased composites against commercial pultruded composites. Effects on mechanical properties of alkaline treatment of flax fiber, as well as of the addition of 1% acrylic resin to vinyl ester and to the two mentioned biobased resins, are studied. Properties are evaluated in the short term and also after exposure to accelerated weathering (i.e., UV and moisture). Moreover, the long-term creep of these novel biobased composites and the effect of fiber and matrix treatment on viscoelastic behavior are investigated using the time-temperature superposition (TTS) principle. Based on the results of this study, TTS provides an accelerated method for evaluation of the mechanical properties of biobased composites, and satisfactory master curves are achieved by use of this principle. Also, fiber and matrix treatments were effective in increasing the short-term mechanical properties of the biobased composites, and treatments delayed the creep response and slowed the creep process in the steady-state region for the composites under study. Overall, the results of this study reveal the successful production of biocomposites having properties that meet or exceed those of conventional pultruded members while maintaining high biocontent. Composites using treated flax fiber and newly developed resins showed less

  11. Rock mechanics models evaluation report: Draft report

    International Nuclear Information System (INIS)

    1985-10-01

    This report documents the evaluation of the thermal and thermomechanical models and codes for repository subsurface design and for design constraint analysis. The evaluation was based on a survey of the thermal and thermomechanical codes and models that are applicable to subsurface design, followed by a Kepner-Tregoe (KT) structured decision analysis of the codes and models. The end result of the KT analysis is a balanced, documented recommendation of the codes and models which are best suited to conceptual subsurface design for the salt repository. The various laws for modeling the creep of rock salt are also reviewed in this report. 37 refs., 1 fig., 7 tabs

  12. Damage Analysis and Evaluation of High Strength Concrete Frame Based on Deformation-Energy Damage Model

    Directory of Open Access Journals (Sweden)

    Huang-bin Lin

    2015-01-01

    Full Text Available A new method of characterizing the damage of high strength concrete structures is presented, which is based on a double-parameter deformation-energy damage model and incorporates both main forms of earthquake damage: first-time exceedance damage and cumulative energy consumption. Firstly, test data of high strength reinforced concrete (RC) columns were evaluated. Then, the relationship between stiffness degradation, strength degradation, and ductility performance was obtained, and an expression for damage in terms of the model parameters was determined, as well as the critical input data for the restoring force model to be used in analytical damage evaluation. Experimentally, the unloading stiffness was found to be related to the cycle number; a correction for this variation was then applied to better describe the unloading phenomenon and compensate for the shortcomings of structural elastic-plastic time-history analysis. The above algorithm was embedded into an IDARC program. Finally, a case study of high strength RC multistory frames was presented. Under various seismic wave inputs, the structural damages were predicted. The damage model and the stiffness-unloading correction algorithm proved suitable and applicable for engineering design and damage evaluation of high strength concrete structures.

  13. The capital-asset pricing model reconsidered: tests in real terms on ...

    African Journals Online (AJOL)

    This paper extends previous work of the authors to reconsider the capital-asset pricing model (CAPM) in South Africa in real terms. As in that work, the main question this study aimed to answer remains: Can the CAPM be accepted in the South African market for the purposes of the stochastic modelling of investment returns ...

  14. Theoretical evaluation of maximum electric field approximation of direct band-to-band tunneling Kane model for low bandgap semiconductors

    Science.gov (United States)

    Dang Chien, Nguyen; Shih, Chun-Hsing; Hoa, Phu Chi; Minh, Nguyen Hong; Thi Thanh Hien, Duong; Nhung, Le Hong

    2016-06-01

    The two-band Kane model has been popularly used to calculate the band-to-band tunneling (BTBT) current in the tunnel field-effect transistor (TFET), which is currently considered a promising candidate for low power applications. This study theoretically clarifies the maximum electric field approximation (MEFA) of the direct BTBT Kane model and evaluates its appropriateness for low bandgap semiconductors. By analysing the physical origin of each electric field term in the Kane model, it is elucidated that in the MEFA the local electric field term must be retained while the nonlocal electric field terms are replaced by the maximum value of the electric field at the tunnel junction. Mathematical investigations have shown that the MEFA is more appropriate for low bandgap semiconductors than for high bandgap materials because of the enhanced tunneling probability in low field regions. The appropriateness of the MEFA makes it very useful in practice for quickly estimating the direct BTBT current in low bandgap TFET devices.
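
    The structural point of the MEFA can be sketched numerically. The Kane prefactor, the exponential factor, and the field profile below are illustrative assumptions (the record gives no numbers); what the sketch shows is the approximation itself: one local field factor is kept, while the nonlocal field dependence is evaluated at the junction's maximum field:

```python
import numpy as np

# Hedged sketch of a Kane-type direct BTBT generation rate and its
# maximum-electric-field approximation (MEFA). A_K, B_K and the field
# profile are assumed, low-bandgap-like values for illustration only.
A_K = 4e14        # cm^-3 s^-1 (V/cm)^-2, assumed Kane prefactor
B_K = 1.9e6       # V/cm, assumed Kane exponential factor

def g_kane(E_local, E_nonlocal):
    """Kane-type rate G = A * E_loc * E_nl * exp(-B / E_nl)."""
    return A_K * E_local * E_nonlocal * np.exp(-B_K / E_nonlocal)

# Toy tunnel-junction field profile: peaks at the junction, decays away.
x = np.linspace(-50e-7, 50e-7, 501)            # position (cm)
E = 2.0e6 * np.exp(-(x / 20e-7) ** 2)          # field (V/cm)
E_max = E.max()

G_local_only = g_kane(E, E)     # every field factor evaluated locally
G_mefa = g_kane(E, E_max)       # nonlocal factors pinned at E_max

# MEFA raises the rate in low-field regions (weaker exponential suppression),
# which is why it is more defensible for low-bandgap materials where that
# suppression is mild to begin with.
print(bool(np.all(G_mefa >= G_local_only)))
```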

  15. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  16. Short-term treatment with flumazenil restores long-term object memory in a mouse model of Down syndrome.

    Science.gov (United States)

    Colas, Damien; Chuluun, Bayarsaikhan; Garner, Craig C; Heller, H Craig

    2017-04-01

    Down syndrome (DS) is a common genetic cause of intellectual disability, yet no pro-cognitive drug therapies are approved for human use. Mechanistic studies in a mouse model of DS (Ts65Dn mice) demonstrate that impaired cognitive function is due to excessive neuronal inhibitory tone. These deficits are normalized by chronic, short-term low doses of GABA(A) receptor (GABA(A)R) antagonists in adult animals, but none of the compounds investigated are approved for human use. We explored the therapeutic potential of flumazenil (FLUM), an FDA-approved GABA(A)R antagonist acting at the benzodiazepine binding site. Long-term memory was assessed by Novel Object Recognition (NOR) testing in Ts65Dn mice after acute or short-term chronic treatment with FLUM. Short-term, low-dose chronic regimens of FLUM elicit long-lasting (>1 week) normalization of cognitive function in both young and aged mice. FLUM at low dosages produces long-lasting cognitive improvements and has the potential of fulfilling an unmet therapeutic need in DS. Copyright © 2017. Published by Elsevier Inc.

  17. Evaluating Sustainability Models for Interoperability through Brokering Software

    Science.gov (United States)

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  18. Nuclear safety culture evaluation model based on SSE-CMM

    International Nuclear Information System (INIS)

    Yang Xiaohua; Liu Zhenghai; Liu Zhiming; Wan Yaping; Peng Guojian

    2012-01-01

    Safety culture, which is of great significance for establishing safety objectives, characterizes the level of enterprise safety production and development. Traditional safety culture evaluation models emphasize the thinking and behavior of individuals and organizations, and pay attention to evaluation results while ignoring the process; moreover, the determination of evaluation indicators lacks objective evidence. A novel multidimensional safety culture evaluation model, which is scientific and complete, is addressed by building a preliminary mapping between safety culture and the process areas and generic practices of the SSE-CMM (Systems Security Engineering Capability Maturity Model). The model focuses on evaluating the enterprise's system security engineering process and provides new ideas and scientific evidence for the study of safety culture. (authors)

  19. Estimating the Term Structure With a Semiparametric Bayesian Hierarchical Model: An Application to Corporate Bonds1

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Ensor, Katherine B.; Rosner, Gary L.

    2011-01-01

    The term structure of interest rates is used to price defaultable bonds and credit derivatives, as well as to infer the quality of bonds for risk management purposes. We introduce a model that jointly estimates term structures by means of a Bayesian hierarchical model with a prior probability model based on Dirichlet process mixtures. The modeling methodology borrows strength across term structures for purposes of estimation. The main advantage of our framework is its ability to produce reliable estimators at the company level even when there are only a few bonds per company. After describing the proposed model, we discuss an empirical application in which the term structure of 197 individual companies is estimated. The sample of 197 consists of 143 companies with only one or two bonds. In-sample and out-of-sample tests are used to quantify the improvement in accuracy that results from approximating the term structure of corporate bonds with estimators by company rather than by credit rating, the latter being a popular choice in the financial literature. A complete description of a Markov chain Monte Carlo (MCMC) scheme for the proposed model is available as Supplementary Material. PMID:21765566
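
    The paper's Dirichlet-process hierarchical model is beyond a few lines, but the "borrowing strength" idea it relies on can be sketched with a much simpler normal-normal (James-Stein-style) shrinkage of per-company mean yield spreads toward the pooled mean. The data and parameter values below are synthetic assumptions; the point is that companies with only one or two bonds get pulled hardest toward the pool:

```python
import numpy as np

# Hedged sketch: empirical-Bayes shrinkage of per-company mean spreads,
# a simplified stand-in for the paper's hierarchical Bayesian model.
rng = np.random.default_rng(1)
n_bonds = np.array([1, 2, 2, 8, 30])             # bonds per company (assumed)
true_mu = rng.normal(2.0, 0.5, size=5)           # true mean spreads (%)
sigma = 0.4                                      # within-company noise (assumed)
obs = [rng.normal(m, sigma, size=n) for m, n in zip(true_mu, n_bonds)]

ybar = np.array([y.mean() for y in obs])         # raw per-company estimate
grand = np.concatenate(obs).mean()               # pooled estimate
# Method-of-moments estimate of between-company variance, floored at > 0.
tau2 = max(ybar.var() - sigma**2 * np.mean(1.0 / n_bonds), 1e-6)

# Posterior-mean shrinkage: the weight on a company's own data grows with
# its bond count, so sparse companies borrow the most strength.
w = tau2 / (tau2 + sigma**2 / n_bonds)
mu_shrunk = w * ybar + (1 - w) * grand
print(np.round(w, 2))
```

Replacing the single pooled mean with a Dirichlet process mixture, as the paper does, lets companies cluster around several centers instead of one, but the shrinkage mechanism sketched here is the same.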

  20. A conceptual framework for a long-term economic model for the treatment of attention-deficit/hyperactivity disorder.

    Science.gov (United States)

    Nagy, Balázs; Setyawan, Juliana; Coghill, David; Soroncz-Szabó, Tamás; Kaló, Zoltán; Doshi, Jalpa A

    2017-06-01

    Models incorporating long-term outcomes (LTOs) are not available to assess the health economic impact of attention-deficit/hyperactivity disorder (ADHD). Develop a conceptual modelling framework capable of assessing long-term economic impact of ADHD therapies. Literature was reviewed; a conceptual structure for the long-term model was outlined with attention to disease characteristics and potential impact of treatment strategies. The proposed model has four layers: i) multi-state short-term framework to differentiate between ADHD treatments; ii) multiple states being merged into three core health states associated with LTOs; iii) series of sub-models in which particular LTOs are depicted; iv) outcomes collected to be either used directly for economic analyses or translated into other relevant measures. This conceptual model provides a framework to assess relationships between short- and long-term outcomes of the disease and its treatment, and to estimate the economic impact of ADHD treatments throughout the course of the disease.

  1. Pathologic evaluation of normal and perfused term placental tissue

    DEFF Research Database (Denmark)

    Maroun, Lisa Leth; Mathiesen, Line; Hedegaard, Morten

    2014-01-01

    This study reports for the 1st time the incidence and interobserver variation of morphologic findings in a series of 34 term placentas from pregnancies with normal outcome used for perfusion studies. Histologic evaluation of placental tissue is challenging, especially when it comes to defining "normal tissue" versus "pathologic lesions." A scoring system for registration of abnormal morphologic findings was developed. Light microscopic examination was performed independently by 2 pathologists, and interobserver variation was analyzed. Findings in normal and perfused tissue were compared, and selected findings were tested against success parameters from the perfusions. Finally, the criteria for frequent lesions with fair to poor interobserver variation in the nonperfused tissue were revised and reanalyzed. In the perfused tissue, the perfusion artefact "trophoblastic vacuolization," which...

  2. Long Term Validity of Monetary Exchange Rate Model: Evidence from Turkey

    Directory of Open Access Journals (Sweden)

    Ugur Ahmet

    2014-03-01

    Full Text Available In this study, it was analyzed whether there is a long-term relationship between the nominal exchange rate and monetary fundamentals over the period 1998:1-2011:2 in Turkey. This relationship has been analysed using a structural VAR (SVAR) model. Besides, the Granger causality test and the Dolado-Lütkepohl Granger causality test were used to determine whether there is a causal relationship between the nominal exchange rate and monetary fundamentals. The SVAR model found no long-term relationship among the nominal exchange rate and money supply, GDP, and the interest rate in Turkey, and the causality tests likewise found no causal relationship between the nominal exchange rate and monetary fundamentals.

  3. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States); Gnanapragasam, Emmanuel [Argonne National Lab. (ANL), Argonne, IL (United States); Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Shih-Yew [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
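
    The three release options described above can be sketched for a single radionuclide on annual time steps. The parameter values (decay constant, leach rate, Kd, soil properties, release duration) are illustrative assumptions, not RESRAD-OFFSITE defaults; the sketch only shows the structure of the three options:

```python
import numpy as np

# Hedged sketch of the three RESRAD-OFFSITE Version 3 source-term release
# options, reduced to one radionuclide. All parameter values are assumed.
lam = 0.01                # radioactive decay constant, 1/yr (assumed)
years = np.arange(0, 100)

def first_order_release(inv0, leach_rate):
    """Option 1: release proportional to the remaining inventory."""
    inv = inv0 * np.exp(-(leach_rate + lam) * years)
    return leach_rate * inv                       # annual release rate

def equilibrium_desorption(inv0, kd, theta=0.3, rho_b=1.5, infil=0.2):
    """Option 2: solid/aqueous partitioning via Kd, flushed by infiltration."""
    f_aq = theta / (theta + rho_b * kd)           # fraction in pore water
    rate = infil * f_aq                           # effective 1/yr release rate
    inv = inv0 * np.exp(-(rate + lam) * years)
    return rate * inv

def uniform_release(inv0, duration):
    """Option 3: constant fraction of the initial material per year."""
    r = np.where(years < duration, inv0 / duration, 0.0)
    return r * np.exp(-lam * years)               # decay-correct the release

for rel in (first_order_release(1.0, leach_rate=0.05),
            equilibrium_desorption(1.0, kd=10.0),
            uniform_release(1.0, duration=40)):
    print(round(float(rel.sum()), 3))             # cumulative fraction released
```

In each option the cumulative release stays below the initial inventory, the remainder being lost to decay (or, for option 3, left after the release window); the options differ in what the user specifies: a leach rate, a distribution coefficient, or a release duration.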

  4. Detecting robust signals of interannual variability of gross primary productivity in Asia from multiple terrestrial carbon cycle models and long-term satellite-based vegetation data

    Science.gov (United States)

    Ichii, K.; Kondo, M.; Ueyama, M.; Kato, T.; Ito, A.; Sasai, T.; Sato, H.; Kobayashi, H.; Saigusa, N.

    2014-12-01

    Long-term records of satellite-based terrestrial vegetation are important to evaluate terrestrial carbon cycle models. In this study, we demonstrate how multiple satellite observations can be used to evaluate past changes in gross primary productivity (GPP) and to detect robust anomalies in the terrestrial carbon cycle in Asia through our model-data synthesis analysis, Asia-MIP. We focused on two different temporal coverages: long-term (30 years; 1982-2011) and decadal (10 years; 2001-2011; the data-intensive period) scales. We used the NOAA/AVHRR NDVI record for the long-term analysis, multiple satellite data and products (e.g. Terra-MODIS, SPOT-VEGETATION) as historical satellite data, and multiple terrestrial carbon cycle models (e.g. BEAMS, Biome-BGC, ORCHIDEE, SEIB-DGVM, and VISIT). As a result of the long-term (30-year) trend analysis, the satellite-based time-series data showed that approximately 40% of the area has experienced a significant increase in the NDVI, while only a few areas have experienced a significant decreasing trend over the last 30 years. The increases in the NDVI were dominant in the sub-continental regions of Siberia, East Asia, and India. Simulations using the terrestrial biosphere models also showed significant increases in GPP, similar to the results for the NDVI, in boreal and temperate regions. A modeled sensitivity analysis showed that the increases in GPP are explained by increased temperature and precipitation in Siberia. Precipitation, solar radiation, CO2 fertilization and land cover changes are important factors in the tropical regions. However, the relative contributions of each factor to GPP changes differ among the models. Year-to-year variations of terrestrial GPP were overall consistently captured by the satellite data and terrestrial carbon cycle models when the anomalies are large (e.g. 2003 summer GPP anomalies in East Asia and 2002 spring GPP anomalies in mid to high latitudes). The underlying mechanisms can be consistently

  5. Development of a comprehensive source term model for the Subsurface Disposal Area at the Idaho National Engineering and Environmental Laboratory

    International Nuclear Information System (INIS)

    1997-01-01

    The first detailed comprehensive simulation study to evaluate fate and transport of wastes disposed in the Subsurface Disposal Area (SDA), at the Radioactive Waste Management Complex (RWMC), Idaho National Engineering and Environmental Laboratory (INEEL) has recently been conducted. One of the most crucial parts of this modeling was the source term or release model. The current study used information collected over the last five years defining contaminant specific information including: the amount disposed, the waste form (physical and chemical properties) and the type of container used for each contaminant disposed. This information was used to simulate the release of contaminants disposed in the shallow subsurface at the SDA. The DUST-MS model was used to simulate the release. Modifications were made to allow the yearly disposal information to be incorporated. The modeling includes unique container and release rate information for each of the 42 years of disposal. The results from this simulation effort are used for both a groundwater and a biotic uptake evaluation. As part of this modeling exercise, inadequacies in the available data relating to the release of contaminants have been identified. The results from this modeling study have been used to guide additional data collection activities at the SDA for purposes of increasing confidence in the appropriateness of model predictions

  6. CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE

    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo

    2015-06-01

    Full Text Available The evaluation model that the LOMCE projects is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation at the center of the teaching-learning processes. Its planning is deficient, since the theory that justifies the model is not developed into coherent proposals; there is an excessive concern for excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.

  7. Statistical modeling for visualization evaluation through data fusion.

    Science.gov (United States)

    Chen, Xiaoyu; Jin, Ran

    2017-11-01

    There is high demand for data visualization that provides insights to users in various applications. However, a consistent, online visualization evaluation method to quantify mental workload or user preference is lacking, which leads to an inefficient visualization and user interface design process. Recently, the advancement of interactive and sensing technologies has made electroencephalogram (EEG) signals, eye movements, and visualization logs available in user-centered evaluation. This paper proposes a data fusion model and the application procedure for quantitative and online visualization evaluation. 15 participants joined the study based on three different visualization designs. The results provide a regularized regression model which can accurately predict the user's evaluation of task complexity, and indicate the significance of all three types of sensing data sets for visualization evaluation. This model can be widely applied to data visualization evaluation, other user-centered design evaluation, and data analysis in human factors and ergonomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
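
    The data-fusion-plus-regularized-regression idea can be sketched concisely. The sketch below fuses three synthetic sensing streams by feature concatenation and fits a ridge regression; the feature names, dimensions, regularization strength, and generating model are all invented for illustration and are not taken from the study:

```python
import numpy as np

# Hedged sketch: ridge regression fusing EEG, eye-movement, and
# interaction-log features into one predictor of rated task complexity.
rng = np.random.default_rng(42)
n = 15 * 3                                  # 15 participants x 3 designs
X_eeg = rng.normal(size=(n, 4))             # e.g. band-power features (assumed)
X_eye = rng.normal(size=(n, 3))             # e.g. fixations, saccades, dwell
X_log = rng.normal(size=(n, 2))             # e.g. clicks, task time
X = np.hstack([X_eeg, X_eye, X_log])        # data fusion = feature concatenation

w_true = rng.normal(size=X.shape[1])
y = X @ w_true + rng.normal(scale=0.1, size=n)   # synthetic complexity ratings

alpha = 1.0                                 # regularization strength (assumed)
w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)
r2 = 1 - np.sum((y - X @ w) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(float(r2), 3))
```

Inspecting the magnitudes of the fitted coefficients within each block is one simple way to ask the study's question of whether all three sensing streams contribute to the prediction.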

  8. US Department of Energy Approach to Probabilistic Evaluation of Long-Term Safety for a Potential Yucca Mountain Repository

    International Nuclear Information System (INIS)

    Dr. R. Dyer; Dr. R. Andrews; Dr. A. Van Luik

    2005-01-01

    Regulatory requirements being addressed in the US geological repository program for spent nuclear fuel and high-level waste disposal specify probabilistically defined mean-value dose limits. These dose limits reflect acceptable levels of risk. The probabilistic approach mandated by regulation calculates a "risk of a dose": the risk of a given potential dose to a hypothetical person at a specific time in the future. The mean value of the time-dependent performance measure needs to remain below an acceptable level defined by regulation. Because there are uncertain parameters that are important to system performance, the regulation mandates an analysis focused on the mean value of the performance measure, but one that also explores the "full range of defensible and reasonable parameter distributions"; system performance evaluations should not be unduly influenced by "extreme physical situations and parameter values". Challenges in this approach lie in defending the scientific basis for the models selected and for the data and distributions sampled. A significant challenge lies in showing that uncertainties are properly identified and evaluated. A single-value parameter has no uncertainty, and where such values are used they need to be supported by scientific information showing that the selected value is appropriate. Uncertainties are inherent in data, but they are also introduced by creating parameter distributions from data sets, selecting models from among alternative models, abstracting models for use in probabilistic analysis, and selecting the range of initiating-event probabilities for unlikely events. The goal of the assessment currently in progress is to evaluate the level of risk inherent in moving ahead to the next phase of repository development: construction. During the construction phase, more will be learned to inform a new long-term risk evaluation to support moving to the next phase: accepting waste. Therefore, though there was sufficient confidence of safety
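    The mean-value performance measure described above is, in essence, a Monte Carlo average over sampled parameter distributions. A toy sketch of that machinery, with a placeholder dose model and illustrative distributions rather than actual repository models:

```python
import random

# Sketch of the probabilistic performance measure: sample uncertain
# parameters from their distributions, compute a dose for each
# realization, and track the mean across realizations. The dose model
# and the distributions are illustrative placeholders, not repository
# models or licensed parameter values.

def dose_at_time(t, release_rate, dilution):
    return release_rate * t / dilution   # toy dose model

rng = random.Random(7)
doses = []
for _ in range(5000):
    release_rate = rng.lognormvariate(-2.0, 0.5)   # uncertain parameter
    dilution = rng.uniform(50.0, 150.0)            # uncertain parameter
    doses.append(dose_at_time(1000.0, release_rate, dilution))
mean_dose = sum(doses) / len(doses)
```

    The regulatory comparison is then between `mean_dose` (and its evolution over time) and the prescribed limit; the sampled realizations also expose how much of the spread comes from each uncertain input.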

  9. Evaluation Model of the Entrepreneurial Character in EU Countries

    Directory of Open Access Journals (Sweden)

    Sebastian Madalin Munteanu

    2015-02-01

    Full Text Available The evidence of entrepreneurship development as a factor of sustainable growth at the national and regional level frequently draws the interest of theorists and practitioners to identifying and outlining the best conditions and essential economic prerequisites for supporting entrepreneurial initiatives over the long term. In this context, the objective of the present research is to analyse and measure the entrepreneurial character of the European Union member countries in an integrated manner, by developing an innovative model for proposing specific action lines and objectively evaluating entrepreneurship development in the investigated states. Our model is based on a synthesis variable of the entrepreneurial national character, developed by sequential application of principal component analysis, while the initial variables come from secondary sources with good conceptual representativeness. Based on the objective relevance of the three model components (cultural; economic and administrative; and entrepreneurial education), the achieved results confirm the importance of a favourable cultural, economic and administrative background for entrepreneurship development, and they reiterate the inefficiency of isolated entrepreneurial education unless supported by a good entrepreneurial culture or adequate economic and administrative infrastructure. The case of Romania, in relation to the European Union member countries, is presented in detail.
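    The "synthesis variable" above is built by principal component analysis. A minimal sketch of that idea, projecting standardized country indicators onto the first principal component (two illustrative indicators only, not the study's variables):

```python
from math import atan2, cos, sin, sqrt

# Sketch: build a one-dimensional composite index from two standardized
# indicators by projecting onto the leading eigenvector of their 2x2
# covariance matrix. The indicators are illustrative, not the study's.

def standardize(xs):
    m = sum(xs) / len(xs)
    s = sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
    return [(x - m) / s for x in xs]

def first_pc_scores(a, b):
    a, b = standardize(a), standardize(b)
    n = len(a)
    caa = sum(x * x for x in a) / n          # = 1 after standardizing
    cbb = sum(x * x for x in b) / n          # = 1
    cab = sum(x * y for x, y in zip(a, b)) / n
    # Leading eigenvector angle of a symmetric 2x2 covariance matrix.
    theta = 0.5 * atan2(2 * cab, caa - cbb)
    u = (cos(theta), sin(theta))
    return [u[0] * x + u[1] * y for x, y in zip(a, b)]

culture = [1.0, 2.0, 3.0, 4.0]    # illustrative indicator 1
economy = [1.1, 1.9, 3.2, 3.8]    # illustrative indicator 2 (correlated)
scores = first_pc_scores(culture, economy)
```

    With more indicators per component, the same projection is done on the full covariance matrix; "sequential application" then means extracting a component score per pillar before combining them.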

  10. Intermediate-term emotional bookkeeping is necessary for long-term reciprocal grooming partner preferences in an agent-based model of macaque groups

    Directory of Open Access Journals (Sweden)

    Ellen Evers

    2016-01-01

    Full Text Available Whether and how primates are able to maintain long-term affiliative relationships is still under debate. Emotional bookkeeping (EB), the partner-specific accumulation of emotional responses to earlier interactions, is a candidate mechanism that does not require high cognitive abilities. EB is difficult to study in real animals, due to the complexity of primate social life. Therefore, we developed an agent-based model based on macaque behavior, the EMO-model, that implements arousal and two emotional dimensions, anxiety-FEAR and satisfaction-LIKE, which regulate social behavior. To implement EB, model individuals assign dynamic LIKE attitudes towards their group members, integrating partner-specific emotional responses to earlier received grooming episodes. Two key parameters in the model were varied to explore their effects on long-term affiliative relationships: (1) the timeframe over which earlier affiliation is accumulated into the LIKE attitudes; and (2) the degree of partner selectivity. EB over short and long timeframes gave rise to low variation in LIKE attitudes, and grooming partner preferences were only maintained over one to two months. Only EB over intermediate-term timeframes resulted in enough variation in LIKE attitudes, which, in combination with high partner selectivity, enables individuals to differentiate between regular and incidental grooming partners. These specific settings resulted in a strong feedback between differentiated LIKE attitudes and the distribution of grooming, giving rise to strongly reciprocated partner preferences that could be maintained for longer periods, occasionally up to one or two years. Moreover, at these settings the individual's internal, socio-emotional memory of earlier affiliative episodes (LIKE attitudes) corresponded best to observable behavior (grooming partner preferences). In sum, our model suggests that intermediate-term LIKE dynamics and high partner selectivity seem most plausible for primates relying on
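    The EMO-model itself is far richer than its abstract, but the core bookkeeping idea, a partner-specific LIKE value that decays over a chosen timeframe and is bumped by each received grooming episode, can be sketched as follows (the timeframe and increment sizes are illustrative assumptions):

```python
from math import exp

# Sketch of emotional bookkeeping: each individual keeps a per-partner
# LIKE value that decays exponentially over a chosen timeframe and is
# incremented whenever grooming is received from that partner.
# Timeframe and increment size are illustrative assumptions.

class Bookkeeper:
    def __init__(self, timeframe_days, increment=1.0):
        # Decay rate such that a contribution halves every `timeframe_days`.
        self.rate = 0.693 / timeframe_days
        self.increment = increment
        self.like = {}  # partner id -> accumulated LIKE value

    def step(self, dt_days=1.0):
        for p in self.like:
            self.like[p] *= exp(-self.rate * dt_days)

    def groomed_by(self, partner):
        self.like[partner] = self.like.get(partner, 0.0) + self.increment

    def preferred(self):
        return max(self.like, key=self.like.get) if self.like else None

# Regular partner "A" grooms every 5 days; "B" groomed once, long ago.
bk = Bookkeeper(timeframe_days=30)   # intermediate-term memory
bk.groomed_by("B")
for day in range(60):
    bk.step()
    if day % 5 == 0:
        bk.groomed_by("A")
pref = bk.preferred()
```

    The timeframe parameter is the knob the study varies: a very short half-life forgets regular partners between episodes, a very long one never separates regular from incidental groomers, and intermediate values let the A-versus-B contrast above persist.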

  13. End-to-side neurorraphy: a long-term study of neural regeneration in a rat model.

    Science.gov (United States)

    Tarasidis, G; Watanabe, O; Mackinnon, S E; Strasberg, S R; Haughey, B H; Hunter, D A

    1998-10-01

    This study evaluated long-term reinnervation of an end-to-side neurorraphy and the resultant functional recovery in a rat model. The divided distal posterior tibial nerve was repaired to the side of an intact peroneal nerve. Control groups included a cut-and-repair of the posterior tibial nerve and an end-to-end repair of the peroneal nerve to the posterior tibial nerve. Evaluations included walking-track analysis, nerve conduction studies, muscle mass measurements, retrograde nerve tracing, and histologic evaluation. Walking tracks indicated poor recovery of posterior tibial nerve function in the experimental group. No significant difference in nerve conduction velocities was seen between the experimental and control groups. Gastrocnemius muscle mass measurements revealed no functional recovery in the experimental group. Similarly, retrograde nerve tracing revealed minimal motor neuron staining in the experimental group. However, some sensory staining was seen within the dorsal root ganglia of the end-to-side group. Histologic study revealed minimal myelinated axonal regeneration in the experimental group as compared with findings in the other groups. These results suggest that predominantly sensory regeneration occurs in an end-to-side neurorraphy at an end point of 6 months.

  14. A Convergent Participation Model for Evaluation of Learning Objects

    Directory of Open Access Journals (Sweden)

    John Nesbit

    2002-10-01

    Full Text Available The properties that distinguish learning objects from other forms of educational software - global accessibility, metadata standards, finer granularity and reusability - have implications for evaluation. This article proposes a convergent participation model for learning object evaluation, in which representatives from stakeholder groups (e.g., students, instructors, subject matter experts, instructional designers, and media developers) converge toward more similar descriptions and ratings through a two-stage process supported by online collaboration tools. The article reviews evaluation models that have been applied to educational software and media, considers models for gathering and meta-evaluating individual user reviews that have recently emerged on the Web, and describes the peer review model adopted for the MERLOT repository. The convergent participation model is assessed in relation to other models and with respect to its support for eight goals of learning object evaluation: (1) aid for searching and selecting, (2) guidance for use, (3) formative evaluation, (4) influence on design practices, (5) professional development and student learning, (6) community building, (7) social recognition, and (8) economic exchange.

  15. Biology learning evaluation model in Senior High Schools

    Directory of Open Access Journals (Sweden)

    Sri Utari

    2017-06-01

    Full Text Available The study was to develop a Biology learning evaluation model in senior high schools, referring to the research and development model by Borg & Gall and the logic model. The evaluation model included the components of input, activities, output and outcomes. The development procedures involved a preliminary study in the form of observation and theoretical review regarding Biology learning evaluation in senior high schools. Product development was carried out by designing an evaluation model, designing an instrument, trialling the instrument and performing implementation. The instrument trial involved teachers and Grade XII students from senior high schools located in the City of Yogyakarta. For the data-gathering technique and instruments, the researchers implemented an observation sheet, a questionnaire and a test. The questionnaire was applied in order to attain information regarding teacher performance, learning performance, classroom atmosphere and scientific attitude; the test, on the other hand, was applied in order to attain information regarding Biology concept mastery. Then, for the analysis of the instrument construct, the researchers performed confirmatory factor analysis by means of Lisrel 0.80 software, and the results of this analysis showed that the evaluation instrument was valid and reliable. The construct validity was between 0.43-0.79, while the reliability of the measurement model was between 0.88-0.94. Finally, the model feasibility test showed that the theoretical model was supported by the empirical data.

  16. Exploring the predictive power of interaction terms in a sophisticated risk equalization model using regression trees.

    Science.gov (United States)

    van Veen, S H C M; van Kleef, R C; van de Ven, W P M M; van Vliet, R C J A

    2018-02-01

    This study explores the predictive power of interaction terms between the risk adjusters in the Dutch risk equalization (RE) model of 2014. Due to the sophistication of this RE-model and the complexity of the associations in the dataset (N = ~16.7 million), there are theoretically more than a million interaction terms. We used regression tree modelling, which has been applied rarely within the field of RE, to identify interaction terms that statistically significantly explain variation in observed expenses that is not already explained by the risk adjusters in this RE-model. The interaction terms identified were used as additional risk adjusters in the RE-model. We found evidence that interaction terms can improve the prediction of expenses overall and for specific groups in the population. However, the prediction of expenses for some other selective groups may deteriorate. Thus, interactions can reduce financial incentives for risk selection for some groups but may increase them for others. Furthermore, because regression trees are not robust, additional criteria are needed to decide which interaction terms should be used in practice. These criteria could be the right incentive structure for risk selection and efficiency or the opinion of medical experts. Copyright © 2017 John Wiley & Sons, Ltd.
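    The tree-based search for interactions can be illustrated with a minimal sketch: fit the residuals of a main-effects model with a depth-2 regression tree, and a split on one risk adjuster followed by a split on another flags their interaction. The data and adjusters below are illustrative, not the Dutch RE data.

```python
# Sketch: use a tiny regression tree (depth 2) on the residuals of a
# main-effects model to surface an interaction between two risk
# adjusters. Data and adjuster indices are illustrative.

def best_split(rows, resid):
    """Exhaustive search for the (feature, threshold) split minimizing SSE."""
    best = None
    for j in range(len(rows[0])):
        for thr in sorted({r[j] for r in rows}):
            left = [i for i, r in enumerate(rows) if r[j] <= thr]
            right = [i for i, r in enumerate(rows) if r[j] > thr]
            if not left or not right:
                continue
            def sse(idx):
                m = sum(resid[i] for i in idx) / len(idx)
                return sum((resid[i] - m) ** 2 for i in idx)
            score = sse(left) + sse(right)
            if best is None or score < best[0]:
                best = (score, j, thr, left, right)
    return best

# Two binary adjusters; residuals are large only when BOTH are 1,
# i.e., a pure interaction that a main-effects model missed.
X = [(0, 0), (0, 1), (1, 0), (1, 1), (1, 1), (0, 0)]
residual = [0.0, 0.1, -0.1, 2.0, 1.9, 0.0]
_, j1, thr1, left, right = best_split(X, residual)
# Split the higher-residual child again on the remaining adjuster.
child_rows = [X[i] for i in right]
child_res = [residual[i] for i in right]
_, j2, thr2, _, _ = best_split(child_rows, child_res)
interaction = (j1, j2)   # pair of adjusters whose combination matters
```

    The candidate pair can then be added as an explicit interaction term to the RE-model; the instability the authors note arises because small data perturbations can change which split wins at the root.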

  17. Modeling of long-term energy system of Japan

    International Nuclear Information System (INIS)

    Gotoh, Yoshitaka; Sato, Osamu; Tadokoro, Yoshihiro

    1999-07-01

    In order to analyze the future potential for reducing carbon dioxide emissions, the long-term energy system of Japan was modeled following the framework of the MARKAL model, and a database of energy-technology characteristics was developed. First, a reference energy system was built by incorporating all important energy sources and technologies that will be available until the year 2050. This system consists of 25 primary energy sources, 33 technologies for electric power generation and/or low-temperature heat production, 97 technologies for energy transformation, storage, and distribution, and 170 end-use technologies. Second, a database was developed for the characteristics of the individual technologies in the system. The characteristic data consist of the input and output energy carriers, efficiency, availability, lifetime, investment cost, operation and maintenance cost, CO2 emission coefficient, and others. Since a large number of technologies are included in the system, this report focuses on modeling the supply side, and covers the database of energy technologies other than those for end-use purposes. (author)

  18. Development and validation of a Markov microsimulation model for the economic evaluation of treatments in osteoporosis.

    Science.gov (United States)

    Hiligsmann, Mickaël; Ethgen, Olivier; Bruyère, Olivier; Richy, Florent; Gathon, Henry-Jean; Reginster, Jean-Yves

    2009-01-01

    Markov models are increasingly used in economic evaluations of treatments for osteoporosis. Most of the existing evaluations are cohort-based Markov models lacking comprehensive memory management and versatility. In this article, we describe and validate an original Markov microsimulation model to accurately assess the cost-effectiveness of the prevention and treatment of osteoporosis. We developed a Markov microsimulation model with a lifetime horizon and a direct health-care cost perspective. The patient history was recorded and used in calculations of transition probabilities, utilities, and costs. To test the internal consistency of the model, we carried out an example calculation for alendronate therapy. Then, external consistency was investigated by comparing absolute lifetime risk of fracture estimates with epidemiologic data. For women at age 70 years with a twofold increase in the fracture risk of the average population, the costs per quality-adjusted life-year gained for alendronate therapy versus no treatment were estimated at €9105 and €15,325, respectively, under full and realistic adherence assumptions. All the sensitivity analyses in terms of model parameters and modeling assumptions were coherent with expected conclusions, and absolute lifetime risk of fracture estimates were within the range of previous estimates, which confirmed both the internal and external consistency of the model. Microsimulation models present some major advantages over cohort-based models, increasing the reliability of the results and being largely compatible with the existing state-of-the-art, evidence-based literature. The developed model appears to be a valid model for use in economic evaluations in osteoporosis.
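    The defining feature of a microsimulation, as opposed to a cohort Markov model, is that each simulated patient's history feeds back into later transition probabilities. A toy sketch of that mechanism, with illustrative (uncalibrated) probabilities, costs, and utilities:

```python
import random

# Sketch of a Markov *microsimulation*: patients are simulated one at a
# time, and each patient's recorded history (here, a prior fracture)
# modifies later transition probabilities -- the "memory" that
# cohort-based Markov models lack. All probabilities, costs, and
# utilities are illustrative assumptions, not calibrated inputs.

def simulate_patient(rng, years=20, p_fracture=0.02, p_death=0.01):
    qalys, cost, had_fracture = 0.0, 0.0, False
    for _ in range(years):
        if rng.random() < p_death:
            break
        pf = p_fracture * (2.0 if had_fracture else 1.0)  # history effect
        if rng.random() < pf:
            had_fracture = True
            cost += 5000.0        # illustrative fracture-year cost
            qalys += 0.6          # reduced utility in fracture year
        else:
            qalys += 0.8          # illustrative baseline utility
    return qalys, cost, had_fracture

rng = random.Random(42)
results = [simulate_patient(rng) for _ in range(10000)]
mean_qalys = sum(r[0] for r in results) / len(results)
frac_risk = sum(r[2] for r in results) / len(results)
```

    Running the same population with and without a treatment effect on `p_fracture`, and dividing the cost difference by the QALY difference, gives the cost-per-QALY figure reported in evaluations like the one above; `frac_risk` is the absolute lifetime fracture risk used for external validation.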

  19. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).
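    The summary above does not list MPESA's metrics; the following are the standard time-series goodness-of-fit measures used when scoring hydrologic model output against observations, shown here as a generic sketch.

```python
from math import sqrt

# Generic time-series goodness-of-fit metrics of the kind used to score
# hydrologic model output against observations. (The exact metrics in
# MPESA are not listed in the summary above; these are standard ones.)

def rmse(obs, sim):
    return sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nash_sutcliffe(obs, sim):
    m = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - m) ** 2 for o in obs)
    return 1.0 - num / den   # 1.0 = perfect; <= 0 = no better than the mean

def percent_bias(obs, sim):
    return 100.0 * sum(s - o for o, s in zip(obs, sim)) / sum(obs)

observed  = [1.0, 3.0, 5.0, 4.0, 2.0]   # e.g., observed daily flows
simulated = [1.2, 2.8, 4.9, 4.3, 1.8]   # e.g., HSPF or SWMM output
nse = nash_sutcliffe(observed, simulated)
pb = percent_bias(observed, simulated)
```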

  20. A Literature Survey and Experimental Evaluation of the State-of-the-Art in Uplift Modeling: A Stepping Stone Toward the Development of Prescriptive Analytics.

    Science.gov (United States)

    Devriendt, Floris; Moldovan, Darie; Verbeke, Wouter

    2018-03-01

    Prescriptive analytics extends predictive analytics by estimating an outcome as a function of control variables, making it possible to establish the level of control variables required to realize a desired outcome. Uplift modeling is at the heart of prescriptive analytics and aims at estimating the net difference in an outcome resulting from a specific action or treatment that is applied. In this article, a structured and detailed literature survey on uplift modeling is provided by identifying and contrasting various groups of approaches. In addition, evaluation metrics for assessing the performance of uplift models are reviewed. An experimental evaluation on four real-world data sets provides further insight into their use. Uplift random forests are found to be consistently among the best-performing techniques in terms of the Qini and Gini measures, although considerable variability in performance across the various data sets of the experiments is observed. In addition, uplift models are frequently observed to be unstable and to display strong variability in performance across different folds in the cross-validation experimental setup. This potentially threatens their actual use for business applications. Moreover, it is found that the available evaluation metrics do not provide an intuitively understandable indication of the actual use and performance of a model. Specifically, existing evaluation metrics do not facilitate a comparison of uplift models and predictive models, and they evaluate performance either at an arbitrary cutoff or over the full spectrum of potential cutoffs. In conclusion, we highlight the instability of uplift models and the need for an application-oriented approach to assessing uplift models as prime topics for further research.
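    The Qini measure mentioned above is built from a cumulative-incremental-gains curve: rank customers by predicted uplift, then count treated responders minus control responders (rescaled to the treated group's size) among the top-ranked fraction. A minimal sketch with an illustrative toy data set:

```python
# Sketch of the Qini-curve idea used to score uplift models: rank
# customers by predicted uplift, then measure the incremental (treated
# minus rescaled control) responses among the top-ranked customers.
# The tiny data set below is illustrative.

def qini_points(scores, treated, outcome):
    """Cumulative incremental gains, customers ranked by predicted uplift."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    rt = rc = nt = nc = 0
    points = [0.0]
    for i in order:
        if treated[i]:
            nt += 1
            rt += outcome[i]
        else:
            nc += 1
            rc += outcome[i]
        # Treated responders minus control responders, with the control
        # count rescaled to the treated group's size.
        points.append(rt - rc * (nt / nc) if nc else float(rt))
    return points

scores  = [0.9, 0.8, 0.3, 0.2, 0.7, 0.1]   # predicted uplift per customer
treated = [1,   0,   1,   0,   1,   0]
outcome = [1,   0,   0,   0,   1,   0]
curve = qini_points(scores, treated, outcome)
```

    A good uplift model pushes true responders to the front, so the curve rises early; the Qini coefficient itself is the area between this curve and the diagonal of a random ranking, which is what makes it hard to compare directly against metrics for ordinary predictive models.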