WorldWideScience

Sample records for part ii uncertainty

  1. Uncertainty estimation with a small number of measurements, part II: a redefinition of uncertainty and an estimator method

    Science.gov (United States)

    Huang, Hening

    2018-01-01

This paper is the second (Part II) in a series of two papers (Part I and Part II). Part I quantitatively discussed the fundamental limitations of the t-interval method for uncertainty estimation with a small number of measurements. This paper (Part II) reveals that the t-interval is an ‘exact’ answer to a wrong question; it is actually misused in uncertainty estimation. This paper proposes a redefinition of uncertainty, based on the classical theory of errors and the theory of point estimation, and a modification of the conventional approach to estimating measurement uncertainty. It also presents an asymptotic procedure for estimating the z-interval. The proposed modification replaces the t-based uncertainty with an uncertainty estimator (mean- or median-unbiased). The uncertainty estimator method is an approximate answer to the right question in uncertainty estimation. The modified approach provides realistic estimates of uncertainty, regardless of whether the population standard deviation is known or unknown and whether the sample size is small or large. As an application example of the modified approach, this paper presents a resolution to the Du-Yang paradox (i.e. Paradox 2), one of the three paradoxes caused by the misuse of the t-interval in uncertainty estimation.
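The discrepancy the abstract describes can be sketched numerically: for a small sample, the average half-width of the classical 95% t-interval is considerably wider than the z-based half-width built from the true standard error of the mean. The setup below (n = 4 draws from N(0, 1)) is an illustrative assumption, not an example from the paper.

```python
import random
import statistics

random.seed(42)
N_TRIALS = 20_000
n = 4
T_CRIT = 3.182  # t_{0.975} for df = 3 (standard table value)
Z_CRIT = 1.960  # z_{0.975}

t_half_widths = []
for _ in range(N_TRIALS):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    s = statistics.stdev(sample)              # sample standard deviation
    t_half_widths.append(T_CRIT * s / n ** 0.5)

avg_t_half = statistics.fmean(t_half_widths)
z_half = Z_CRIT * 1.0 / n ** 0.5              # true sigma = 1 assumed known

print(f"average t-interval half-width: {avg_t_half:.3f}")
print(f"z-interval half-width:         {z_half:.3f}")
```

With these assumptions the t-based half-width averages roughly half as large again as the z-based one, which is the kind of overstatement of uncertainty at small n that motivates the paper's estimator approach.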

  2. Modeling multibody systems with uncertainties. Part II: Numerical applications

    Energy Technology Data Exchange (ETDEWEB)

Sandu, Corina, E-mail: csandu@vt.edu; Sandu, Adrian; Ahmadian, Mehdi [Virginia Polytechnic Institute and State University, Mechanical Engineering Department (United States)]

    2006-04-15

This study applies generalized polynomial chaos theory to model complex nonlinear multibody dynamic systems operating in the presence of parametric and external uncertainty. Theoretical and computational aspects of this methodology are discussed in the companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part I: Theoretical and Computational Aspects'. In this paper we illustrate the methodology on selected test cases. The combined effects of parametric and forcing uncertainties are studied for a quarter car model. The uncertainty distributions in the system response in both time and frequency domains are validated against Monte Carlo simulations. Results indicate that polynomial chaos is more efficient than Monte Carlo and more accurate than statistical linearization. The results of the direct collocation approach are similar to the ones obtained with the Galerkin approach. A stochastic terrain model is constructed using a truncated Karhunen-Loeve expansion. The application of polynomial chaos to differential-algebraic systems is illustrated using the constrained pendulum problem. Limitations of the polynomial chaos approach are studied on two different test problems, one with multiple attractor points, and the second with a chaotic evolution and a nonlinear attractor set. The overall conclusion is that, despite its limitations, generalized polynomial chaos is a powerful approach for the simulation of multibody dynamic systems with uncertainties.
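The efficiency claim (polynomial chaos cheaper than Monte Carlo) can be sketched with a minimal stochastic-collocation example: a 3-point Gauss-Hermite rule reproduces the mean of a quadratic response exactly with three model evaluations, where Monte Carlo needs thousands. The toy response f(k) = k² with uncertain stiffness k = MU + SIGMA·Z is an assumed stand-in, not the paper's quarter-car model.

```python
import math
import random

MU, SIGMA = 2.0, 0.3

def f(k):
    return k * k  # toy system response

# 3-point probabilists' Gauss-Hermite rule for a standard normal Z:
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

colloc_mean = sum(w * f(MU + SIGMA * z) for w, z in zip(weights, nodes))
exact_mean = MU ** 2 + SIGMA ** 2          # analytic E[f(k)]

random.seed(0)
mc_mean = sum(f(MU + SIGMA * random.gauss(0, 1)) for _ in range(10_000)) / 10_000

print(f"collocation (3 evals):   {colloc_mean:.6f}")
print(f"Monte Carlo (10k evals): {mc_mean:.6f}")
print(f"exact:                   {exact_mean:.6f}")
```

The quadrature answer is exact here because the rule integrates polynomials up to degree 5; the Monte Carlo estimate still carries sampling noise after 10,000 evaluations.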

  3. Modeling multibody systems with uncertainties. Part II: Numerical applications

    International Nuclear Information System (INIS)

    Sandu, Corina; Sandu, Adrian; Ahmadian, Mehdi

    2006-01-01

This study applies generalized polynomial chaos theory to model complex nonlinear multibody dynamic systems operating in the presence of parametric and external uncertainty. Theoretical and computational aspects of this methodology are discussed in the companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part I: Theoretical and Computational Aspects'. In this paper we illustrate the methodology on selected test cases. The combined effects of parametric and forcing uncertainties are studied for a quarter car model. The uncertainty distributions in the system response in both time and frequency domains are validated against Monte Carlo simulations. Results indicate that polynomial chaos is more efficient than Monte Carlo and more accurate than statistical linearization. The results of the direct collocation approach are similar to the ones obtained with the Galerkin approach. A stochastic terrain model is constructed using a truncated Karhunen-Loeve expansion. The application of polynomial chaos to differential-algebraic systems is illustrated using the constrained pendulum problem. Limitations of the polynomial chaos approach are studied on two different test problems, one with multiple attractor points, and the second with a chaotic evolution and a nonlinear attractor set. The overall conclusion is that, despite its limitations, generalized polynomial chaos is a powerful approach for the simulation of multibody dynamic systems with uncertainties.

  4. Operational hydrological forecasting in Bavaria. Part I: Forecast uncertainty

    Science.gov (United States)

    Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.

    2009-04-01

In Bavaria, operational flood forecasting has been in place since the disastrous flood of 1999. Nowadays, forecasts based on rainfall information from about 700 raingauges and 600 rivergauges are calculated and issued for nearly 100 rivergauges. With the added experience of the 2002 and 2005 floods, awareness grew that the standard deterministic forecast, which neglects the uncertainty associated with each forecast, is misleading, creating a false sense of unambiguousness. As a consequence, a system to identify, quantify and communicate the sources and magnitude of forecast uncertainty has been developed; it is presented in Part I of this study. In this system, the use of ensemble meteorological forecasts plays a key role, which is presented in Part II. In developing the system, several constraints stemming from the range of hydrological regimes and from operational requirements had to be met. Firstly, operational time constraints rule out varying all components of the modeling chain, as would be done in a full Monte Carlo simulation. Therefore, an approach was chosen in which only the most relevant sources of uncertainty are considered dynamically, while the others are jointly accounted for by static error distributions from offline analysis. Secondly, the dominant sources of uncertainty vary over the wide range of forecasted catchments: in alpine headwater catchments, typically a few hundred square kilometers in size, rainfall forecast uncertainty is the key factor for forecast uncertainty, with a magnitude that changes dynamically with the prevailing predictability of the atmosphere. In lowland catchments encompassing several thousand square kilometers, forecast uncertainty over the desired lead time (usually up to two days) depends mainly on upstream gauge observation quality, routing, and unpredictable human impacts such as reservoir operation. The determination of forecast uncertainty comprised the following steps: a) From comparison of gauge

  5. Uncertainty for Part Density Determination: An Update

    Energy Technology Data Exchange (ETDEWEB)

    Valdez, Mario Orlando [Los Alamos National Laboratory

    2016-12-14

Accurate and precise density measurements by hydrostatic weighing require the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these liquid media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying the principles and guidelines of the Guide to the Expression of Uncertainty in Measurement (GUM) to determine the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided; the original example is revised with the updated derivations; and an appendix is provided, devoted solely to uncertainty evaluations using Monte Carlo techniques, specifically the NIST Uncertainty Machine, as a viable alternative method.
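A Monte Carlo evaluation in the spirit of the report's appendix (and of the NIST Uncertainty Machine) can be sketched with the simplified Archimedes relation ρ_part = ρ_water · W_air / (W_air − W_water); air buoyancy and the report's other corrections are omitted, and all numeric values are assumed for illustration, not taken from the report.

```python
import random
import statistics

def density(w_air, w_water, rho_water):
    """Part density from weighings in air and in water (simplified)."""
    return rho_water * w_air / (w_air - w_water)

# Nominal inputs (grams, g/cm^3) and assumed standard uncertainties:
W_AIR, U_W_AIR = 100.0, 0.002
W_WATER, U_W_WATER = 90.0, 0.002
RHO_W, U_RHO_W = 0.9982, 0.0001

random.seed(1)
draws = [
    density(
        random.gauss(W_AIR, U_W_AIR),
        random.gauss(W_WATER, U_W_WATER),
        random.gauss(RHO_W, U_RHO_W),
    )
    for _ in range(50_000)
]

print(f"density: {density(W_AIR, W_WATER, RHO_W):.4f} g/cm^3")
print(f"standard uncertainty (Monte Carlo): {statistics.stdev(draws):.4f} g/cm^3")
```

The spread of the sampled densities is the Monte Carlo counterpart of the GUM sensitivity-analysis result; with these inputs the weighing terms dominate because of the small weight difference in the denominator.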

  6. Modeling Multibody Systems with Uncertainties. Part I: Theoretical and Computational Aspects

    International Nuclear Information System (INIS)

    Sandu, Adrian; Sandu, Corina; Ahmadian, Mehdi

    2006-01-01

This study explores the use of generalized polynomial chaos theory for modeling complex nonlinear multibody dynamic systems in the presence of parametric and external uncertainty. The polynomial chaos framework has been chosen because it offers an efficient computational approach for the large, nonlinear multibody models of engineering systems of interest, where the number of uncertain parameters is relatively small, while the magnitude of uncertainties can be very large (e.g., vehicle-soil interaction). The proposed methodology allows the quantification of uncertainty distributions in both time and frequency domains, and enables the simulations of multibody systems to produce results with 'error bars'. The first part of this study presents the theoretical and computational aspects of the polynomial chaos methodology. Both unconstrained and constrained formulations of multibody dynamics are considered. Direct stochastic collocation is proposed as a less expensive alternative to the traditional Galerkin approach. It is established that stochastic collocation is equivalent to a stochastic response surface approach. We show that multi-dimensional basis functions are constructed as tensor products of one-dimensional basis functions and discuss the treatment of polynomial and trigonometric nonlinearities. Parametric uncertainties are modeled by finite-support probability densities. Stochastic forcings are discretized using truncated Karhunen-Loeve expansions. The companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part II: Numerical Applications' illustrates the use of the proposed methodology on a selected set of test problems. The overall conclusion is that despite its limitations, polynomial chaos is a powerful approach for the simulation of multibody systems with uncertainties.
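The tensor-product construction mentioned in the abstract can be sketched concretely with probabilists' Hermite polynomials (He0 = 1, He1 = x, He2 = x² − 1, standard definitions rather than anything paper-specific): a 2-D basis function is simply a product of 1-D ones, and orthogonality under the Gaussian measure can be checked by quadrature.

```python
import math

def he(n, x):
    """Probabilists' Hermite polynomial He_n(x) via the three-term recurrence."""
    if n == 0:
        return 1.0
    if n == 1:
        return x
    # He_n(x) = x * He_{n-1}(x) - (n - 1) * He_{n-2}(x)
    return x * he(n - 1, x) - (n - 1) * he(n - 2, x)

def psi(i, j, x, y):
    """2-D polynomial chaos basis function as a tensor product of 1-D ones."""
    return he(i, x) * he(j, y)

# Orthogonality check with a 3-point Gauss-Hermite rule (exact to degree 5):
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def expect(g):
    """E[g(Z)] for Z ~ N(0, 1), computed by quadrature."""
    return sum(w * g(z) for w, z in zip(weights, nodes))

print(expect(lambda z: he(1, z) * he(2, z)))  # distinct orders: orthogonal
print(expect(lambda z: he(2, z) ** 2))        # E[He2^2] = 2! = 2
```

The same recurrence and product structure extend to any number of uncertain dimensions, which is what keeps the basis construction tractable when the number of uncertain parameters is small.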

  7. Uncertainty and sensitivity studies supporting the interpretation of the results of TVO I/II PRA

    International Nuclear Information System (INIS)

    Holmberg, J.

    1992-01-01

A comprehensive Level 1 probabilistic risk assessment (PRA) has been performed for the TVO I/II nuclear power units. As part of the PRA project, uncertainties in the risk models and methods were systematically studied in order to describe them and to demonstrate their impact on the results. The uncertainty study was divided into two phases: a qualitative and a quantitative study. The qualitative study comprised the identification of uncertainties and qualitative assessments of their importance. The PRA was introduced, and the identified assumptions and uncertainties behind the models were documented. The most significant uncertainties were selected, by importance measures or other judgements, for further quantitative study. The quantitative study included sensitivity studies and the propagation of uncertainty ranges. In the sensitivity studies, uncertain assumptions or parameters were varied in order to illustrate the sensitivity of the models. The propagation of the uncertainty ranges demonstrated the impact of the statistical uncertainties of the parameter values; the Monte Carlo method was used as the propagation method. The most significant uncertainties were those involved in modelling human interactions, dependences and common cause failures (CCFs), loss of coolant accident (LOCA) frequencies, and pressure suppression. The qualitative mapping of the uncertainty factors proved useful in planning the quantitative studies; it also served as an internal review of the assumptions made in the PRA. The sensitivity studies were perhaps the most advantageous part of the quantitative study because they allowed individual analysis of the significance of each identified uncertainty source. The uncertainty study was found to be a reasonable way of systematically and critically assessing uncertainties in a risk analysis.
The usefulness of this study depends on the decision maker (power company), since uncertainty studies are primarily carried out to support decision making when uncertainties are
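The propagation step described above can be sketched generically: parameter uncertainties are pushed through a risk model by Monte Carlo sampling. The tiny fault tree below, TOP = (A AND B) OR C with lognormally distributed basic-event probabilities, is an assumed illustration, not the TVO I/II model.

```python
import math
import random

def top_event(pa, pb, pc):
    """TOP = (A AND B) OR C for independent basic events."""
    p_and = pa * pb
    return p_and + pc - p_and * pc  # inclusion-exclusion for the OR gate

random.seed(7)
draws = []
for _ in range(20_000):
    # assumed lognormal parameter uncertainties on the basic-event probabilities
    pa = random.lognormvariate(math.log(1e-2), 0.5)
    pb = random.lognormvariate(math.log(5e-3), 0.5)
    pc = random.lognormvariate(math.log(1e-4), 0.7)
    draws.append(top_event(pa, pb, pc))

draws.sort()
print(f"point estimate:  {top_event(1e-2, 5e-3, 1e-4):.3e}")
print(f"median:          {draws[len(draws) // 2]:.3e}")
print(f"95th percentile: {draws[int(0.95 * len(draws))]:.3e}")
```

Comparing the point estimate with the sampled percentiles is the kind of result the uncertainty-range propagation in the study produces, at much larger scale.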

  8. Guidelines for uncertainty analysis developed for the participants in the BIOMOVS II study

    International Nuclear Information System (INIS)

    Baeverstam, U.; Davis, P.; Garcia-Olivares, A.; Henrich, E.; Koch, J.

    1993-07-01

This report has been produced to provide guidelines for uncertainty analysis for use by participants in the BIOMOVS II study. It is hoped that others with an interest in modelling contamination in the biosphere will also find it useful. The report has been prepared by members of the Uncertainty and Validation Working Group and has been reviewed by other BIOMOVS II participants. The opinions expressed are those of the authors and should not be taken to represent the views of the BIOMOVS II sponsors or other BIOMOVS II participating organisations.

  9. Guidelines for uncertainty analysis developed for the participants in the BIOMOVS II study

    Energy Technology Data Exchange (ETDEWEB)

    Baeverstam, U; Davis, P; Garcia-Olivares, A; Henrich, E; Koch, J

    1993-07-01

This report has been produced to provide guidelines for uncertainty analysis for use by participants in the BIOMOVS II study. It is hoped that others with an interest in modelling contamination in the biosphere will also find it useful. The report has been prepared by members of the Uncertainty and Validation Working Group and has been reviewed by other BIOMOVS II participants. The opinions expressed are those of the authors and should not be taken to represent the views of the BIOMOVS II sponsors or other BIOMOVS II participating organisations.

  10. Uncertainty in Measurement: Procedures for Determining Uncertainty With Application to Clinical Laboratory Calculations.

    Science.gov (United States)

    Frenkel, Robert B; Farrance, Ian

    2018-01-01

    The "Guide to the Expression of Uncertainty in Measurement" (GUM) is the foundational document of metrology. Its recommendations apply to all areas of metrology including metrology associated with the biomedical sciences. When the output of a measurement process depends on the measurement of several inputs through a measurement equation or functional relationship, the propagation of uncertainties in the inputs to the uncertainty in the output demands a level of understanding of the differential calculus. This review is intended as an elementary guide to the differential calculus and its application to uncertainty in measurement. The review is in two parts. In Part I, Section 3, we consider the case of a single input and introduce the concepts of error and uncertainty. Next we discuss, in the following sections in Part I, such notions as derivatives and differentials, and the sensitivity of an output to errors in the input. The derivatives of functions are obtained using very elementary mathematics. The overall purpose of this review, here in Part I and subsequently in Part II, is to present the differential calculus for those in the medical sciences who wish to gain a quick but accurate understanding of the propagation of uncertainties. © 2018 Elsevier Inc. All rights reserved.
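The propagation the review builds toward is the GUM law u(y)² = Σᵢ (∂f/∂xᵢ)² u(xᵢ)² for uncorrelated inputs. A minimal sketch with numerically evaluated sensitivity coefficients is below; the two-input product f = x₁·x₂ is a generic stand-in, not one of the review's clinical examples.

```python
import math

def propagate(f, x, u, h=1e-6):
    """Combined standard uncertainty of y = f(x) via central differences."""
    total = 0.0
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        dfdx = (f(xp) - f(xm)) / (2 * h)   # sensitivity coefficient df/dx_i
        total += (dfdx * u[i]) ** 2
    return math.sqrt(total)

f = lambda v: v[0] * v[1]   # example measurement equation: y = x1 * x2
x = [10.0, 3.0]             # input estimates
u = [0.1, 0.05]             # their standard uncertainties

u_y = propagate(f, x, u)
# analytic check for a product: u(y)^2 = (x2*u1)^2 + (x1*u2)^2
u_exact = math.sqrt((3.0 * 0.1) ** 2 + (10.0 * 0.05) ** 2)
print(f"numeric : {u_y:.6f}")
print(f"analytic: {u_exact:.6f}")
```

The numerical derivatives make the same routine usable for any measurement equation, which is useful when the calculus the review teaches is impractical to carry out by hand.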

  11. Accounting for Uncertainties in Strengths of SiC MEMS Parts

    Science.gov (United States)

    Nemeth, Noel; Evans, Laura; Beheim, Glen; Trapp, Mark; Jadaan, Osama; Sharpe, William N., Jr.

    2007-01-01

A methodology has been devised for accounting for uncertainties in the strengths of silicon carbide structural components of microelectromechanical systems (MEMS). The methodology enables prediction of the probabilistic strengths of complexly shaped MEMS parts using data from tests of simple specimens. This methodology is intended to serve as a part of a rational basis for designing SiC MEMS, supplementing methodologies that have been borrowed from the art of designing macroscopic brittle material structures. The need for this or a similar methodology arises as a consequence of the fundamental nature of MEMS and the brittle silicon-based materials of which they are typically fabricated. When tested to fracture, MEMS and structural components thereof show wide part-to-part scatter in strength. The methodology involves the use of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) software in conjunction with the ANSYS Probabilistic Design System (PDS) software to simulate or predict the strength responses of brittle material components while simultaneously accounting for the effects of variability of geometrical features on the strength responses. As such, the methodology involves the use of an extended version of the ANSYS/CARES/PDS software system described in Probabilistic Prediction of Lifetimes of Ceramic Parts (LEW-17682-1/4-1), Software Tech Briefs supplement to NASA Tech Briefs, Vol. 30, No. 9 (September 2006), page 10. The ANSYS PDS software enables the ANSYS finite-element-analysis program to account for uncertainty in the design-and-analysis process. The ANSYS PDS software accounts for uncertainty in material properties, dimensions, and loading by assigning probabilistic distributions to user-specified model parameters and performing simulations using various sampling techniques.
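The part-to-part strength scatter of brittle materials is conventionally described by weakest-link Weibull statistics, which is also the basis of CARES/Life's far more detailed models. A minimal sketch follows; the parameter values (Weibull modulus m, characteristic strength, areas) are assumed for illustration only.

```python
import math

def failure_probability(stress, area, m=10.0, sigma0=400.0, area0=1.0):
    """Two-parameter Weibull with area scaling (stress in MPa, area in mm^2).

    P_f = 1 - exp( -(area/area0) * (stress/sigma0)^m )
    """
    return 1.0 - math.exp(-(area / area0) * (stress / sigma0) ** m)

# Size effect: at the same stress, a larger stressed area is more likely to
# sample a critical flaw, so its failure probability is higher.
print(failure_probability(350.0, 1.0))
print(failure_probability(350.0, 10.0))
```

This size effect is why data from simple small specimens cannot be applied to complexly shaped parts without a model of how stressed area (or volume) scales the strength distribution.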

  12. Development of electrical efficiency measurement techniques for 10 kW-class SOFC system: Part II. Uncertainty estimation

    International Nuclear Information System (INIS)

    Tanaka, Yohei; Momma, Akihiko; Kato, Ken; Negishi, Akira; Takano, Kiyonami; Nozaki, Ken; Kato, Tohru

    2009-01-01

Uncertainty of electrical efficiency measurement was investigated for a 10 kW-class SOFC system using town gas. The uncertainty of the heating value measured by gas chromatography on a mole basis was estimated as ±0.12% at the 95% level of confidence. Micro gas chromatography with or without CH4 quantification may be able to reduce the measurement uncertainty. Calibration and uncertainty estimation methods are proposed for flow-rate measurement of town gas with thermal mass-flow meters or controllers. With adequate calibration of the flowmeters, the flow rate of town gas or natural gas at 35 standard liters per minute can be measured within a relative uncertainty of ±1.0% at the 95% level of confidence. The uncertainty of power measurement can be as low as ±0.14% when a precise wattmeter is used and calibrated properly. It is clarified that the electrical efficiency of non-pressurized 10 kW-class SOFC systems can be measured within ±1.0% relative uncertainty at the 95% level of confidence with the developed techniques when the SOFC systems are operated relatively stably.
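Since the efficiency is a ratio, η = P / (q · HV), the quoted relative uncertainties combine (to first order, and assuming they are uncorrelated) by root-sum-of-squares. The sketch below uses the abstract's figures; treating the 95%-level values as directly combinable in quadrature is a simplification of the paper's analysis.

```python
import math

# Relative uncertainties quoted in the abstract (%, at 95% confidence):
u_heating = 0.12   # heating value by gas chromatography
u_flow = 1.0       # town-gas flow rate with calibrated mass-flow meters
u_power = 0.14     # electrical power with a calibrated precision wattmeter

# eta = P / (q * HV)  =>  relative uncertainties add in quadrature:
u_eta = math.sqrt(u_heating ** 2 + u_flow ** 2 + u_power ** 2)
print(f"combined relative uncertainty: {u_eta:.2f}%")
```

The combined value is dominated almost entirely by the flow-rate term, which is consistent with the abstract's ±1.0% overall figure: improving the heating-value or power measurement further would barely change the result.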

  13. Uncertainty Communication. Issues and good practice

    International Nuclear Information System (INIS)

    Kloprogge, P.; Van der Sluijs, J.; Wardekker, A.

    2007-12-01

policy debates. Finally, in Part II assistance is offered in customising the (uncertainty) information of an assessment for communication and reporting purposes. Part 3 contains practical information on how to communicate uncertainties: it addresses aspects of uncertainty that might be important in a specific situation, do's and don'ts, pitfalls to be avoided, and hints on how to communicate this uncertainty information.

  14. Cosmological parameter uncertainties from SALT-II type Ia supernova light curve models

    International Nuclear Information System (INIS)

    Mosher, J.; Sako, M.; Guy, J.; Astier, P.; Betoule, M.; El-Hage, P.; Pain, R.; Regnault, N.; Kessler, R.; Frieman, J. A.; Marriner, J.; Biswas, R.; Kuhlmann, S.; Schneider, D. P.

    2014-01-01

We use simulated type Ia supernova (SN Ia) samples, including both photometry and spectra, to perform the first direct validation of cosmology analysis using the SALT-II light curve model. This validation includes residuals from the light curve training process, systematic biases in SN Ia distance measurements, and a bias on the dark energy equation of state parameter w. Using the SN-analysis package SNANA, we simulate and analyze realistic samples corresponding to the data samples used in the SNLS3 analysis: ∼120 low-redshift (z < 0.1) SNe Ia, ∼255 Sloan Digital Sky Survey SNe Ia (z < 0.4), and ∼290 SNLS SNe Ia (z ≤ 1). To probe systematic uncertainties in detail, we vary the input spectral model, the model of intrinsic scatter, and the smoothing (i.e., regularization) parameters used during the SALT-II model training. Using realistic intrinsic scatter models results in a slight bias in the ultraviolet portion of the trained SALT-II model, and w biases (w_input – w_recovered) ranging from –0.005 ± 0.012 to –0.024 ± 0.010. These biases are indistinguishable from each other within the uncertainty; the average bias on w is –0.014 ± 0.007.

  15. Cosmological Parameter Uncertainties from SALT-II Type Ia Supernova Light Curve Models

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, J. [Pennsylvania U.; Guy, J. [LBL, Berkeley; Kessler, R. [Chicago U., KICP; Astier, P. [Paris U., VI-VII; Marriner, J. [Fermilab; Betoule, M. [Paris U., VI-VII; Sako, M. [Pennsylvania U.; El-Hage, P. [Paris U., VI-VII; Biswas, R. [Argonne; Pain, R. [Paris U., VI-VII; Kuhlmann, S. [Argonne; Regnault, N. [Paris U., VI-VII; Frieman, J. A. [Fermilab; Schneider, D. P. [Penn State U.

    2014-08-29

We use simulated type Ia supernova (SN Ia) samples, including both photometry and spectra, to perform the first direct validation of cosmology analysis using the SALT-II light curve model. This validation includes residuals from the light curve training process, systematic biases in SN Ia distance measurements, and a bias on the dark energy equation of state parameter w. Using the SN-analysis package SNANA, we simulate and analyze realistic samples corresponding to the data samples used in the SNLS3 analysis: ~120 low-redshift (z < 0.1) SNe Ia, ~255 Sloan Digital Sky Survey SNe Ia (z < 0.4), and ~290 SNLS SNe Ia (z ≤ 1). To probe systematic uncertainties in detail, we vary the input spectral model, the model of intrinsic scatter, and the smoothing (i.e., regularization) parameters used during the SALT-II model training. Using realistic intrinsic scatter models results in a slight bias in the ultraviolet portion of the trained SALT-II model, and w biases (w_input – w_recovered) ranging from –0.005 ± 0.012 to –0.024 ± 0.010. These biases are indistinguishable from each other within the uncertainty; the average bias on w is –0.014 ± 0.007.

  16. Propagation of uncertainty in nasal spray in vitro performance models using Monte Carlo simulation: Part II. Error propagation during product performance modeling.

    Science.gov (United States)

    Guo, Changning; Doub, William H; Kauffman, John F

    2010-08-01

Monte Carlo simulations were applied in the first part of this study to investigate how uncertainty in both the input variables and the response measurements propagates into model predictions for nasal spray product performance design-of-experiment (DOE) models, under the initial assumption that the models perfectly represent the relationship between input variables and measured responses. In this article, we discard that assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in the DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that error estimates based on Monte Carlo simulation yield smaller model coefficient standard deviations than those from regression methods, suggesting that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution for understanding the propagation of uncertainty in complex DOE models, so that a design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
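The kind of experiment the article describes can be sketched as follows: perturb both the input variable (actual settings differ from nominal) and the measured response, refit a simple model many times, and inspect the spread of the fitted coefficients. The one-factor linear model and the noise levels are assumptions for illustration, not the nasal-spray DOE.

```python
import random
import statistics

def fit_slope(xs, ys):
    """Ordinary least-squares slope for a single factor."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

TRUE_SLOPE, TRUE_INTERCEPT = 2.0, 1.0
design = [0.0, 1.0, 2.0, 3.0, 4.0]   # nominal factor settings

random.seed(3)
slopes = []
for _ in range(5_000):
    # actual inputs deviate from nominal; responses carry measurement noise
    xs = [x + random.gauss(0, 0.05) for x in design]
    ys = [TRUE_INTERCEPT + TRUE_SLOPE * x + random.gauss(0, 0.1) for x in xs]
    # the analyst fits against the NOMINAL design, so both noise sources
    # propagate into the coefficient estimate
    slopes.append(fit_slope(design, ys))

print(f"mean fitted slope:        {statistics.fmean(slopes):.3f}")
print(f"slope std (Monte Carlo):  {statistics.stdev(slopes):.4f}")
```

The Monte Carlo spread of the coefficient can then be compared with the standard error a single regression fit reports, which is the comparison at the heart of the article.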

  17. Workshop 96. Part II

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    Part II of the seminar proceedings contains contributions in various areas of science and technology, among them materials science in mechanical engineering, materials science in electrical, chemical and civil engineering, and electronics, measuring and communication engineering. In those areas, 6 contributions have been selected for INIS. (P.A.).

  18. Workshop 96. Part II

    International Nuclear Information System (INIS)

    1995-12-01

    Part II of the seminar proceedings contains contributions in various areas of science and technology, among them materials science in mechanical engineering, materials science in electrical, chemical and civil engineering, and electronics, measuring and communication engineering. In those areas, 6 contributions have been selected for INIS. (P.A.)

  19. Sensitivity Analysis of Uncertainty Parameter based on MARS-LMR Code on SHRT-45R of EBR II

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seok-Ju; Kang, Doo-Hyuk; Seo, Jae-Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Bae, Sung-Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeong, Hae-Yong [Sejong University, Seoul (Korea, Republic of)

    2016-10-15

In order to assess the uncertainty quantification capability of the MARS-LMR code, the code has been improved by modifying the source code to accommodate the calculation process required for uncertainty quantification. In the present study, an Unprotected Loss of Flow (ULOF) transient is selected as a typical case of an Anticipated Transient without Scram (ATWS), which belongs to the DEC category. The MARS-LMR input generation for EBR-II SHRT-45R and the execution work are performed using the PAPIRUS program. The sensitivity analysis is carried out for the uncertainty parameters of the MARS-LMR code for EBR-II SHRT-45R. Based on the results of the sensitivity analysis, dominant parameters with large sensitivity to the FoM are picked out. The selected dominant parameters are closely related to the progression of the ULOF event.

  20. Handling uncertainty and networked structure in robot control

    CERN Document Server

    Tamás, Levente

    2015-01-01

    This book focuses on two challenges posed in robot control by the increasing adoption of robots in the everyday human environment: uncertainty and networked communication. Part I of the book describes learning control to address environmental uncertainty. Part II discusses state estimation, active sensing, and complex scenario perception to tackle sensing uncertainty. Part III completes the book with control of networked robots and multi-robot teams. Each chapter features in-depth technical coverage and case studies highlighting the applicability of the techniques, with real robots or in simulation. Platforms include mobile ground, aerial, and underwater robots, as well as humanoid robots and robot arms. Source code and experimental data are available at http://extras.springer.com. The text gathers contributions from academic and industry experts, and offers a valuable resource for researchers or graduate students in robot control and perception. It also benefits researchers in related areas, such as computer...

  1. Seismic fragility of RC shear walls in nuclear power plant Part 1: Characterization of uncertainty in concrete constitutive model

    International Nuclear Information System (INIS)

    Syed, Sammiuddin; Gupta, Abhinav

    2015-01-01

Highlights: • A framework is proposed for seismic fragility assessment of reinforced concrete structures. • Experimentally validated finite element models are used to conduct nonlinear simulations. • Critical parameters in the concrete constitutive model are identified to conduct nonlinear simulations. • Uncertainties in the parameters of the concrete damage plasticity model are characterized. • Closed-form expressions are used to compute the damage variables and plasticity. - Abstract: This two-part manuscript proposes a framework for seismic fragility assessment of reinforced concrete structures in nuclear energy facilities. The novelty of the proposed approach lies in the characterization of uncertainties in the parameters of the material constitutive model. Concrete constitutive models that comprehensively address different damage states such as tensile cracking, compression failure, stiffness degradation, and recovery of degraded stiffness due to closing of previously formed cracks under dynamic loading are generally defined in terms of a large number of variables to characterize the plasticity and damage at material level. Over the past several years, many different studies have been presented on evaluation of fragility for reinforced concrete structures using nonlinear time history simulations. However, almost none of these studies considers uncertainties in the parameters of a comprehensive constitutive model. Part-I of this two-part manuscript presents a study that is used to identify uncertainties associated with the critical parameters in the nonlinear concrete damage plasticity model proposed by Lubliner et al. (1989. Int. J. Solids Struct., 25(3), 299) and later modified by Lee and Fenves (1998a. J. Eng. Mech., ASCE, 124(8), 892) and Lee and Fenves (1998b. Earthquake Eng. Struct. Dyn., 27(9), 937) for the purpose of seismic fragility assessment. The limitations in implementation of the damage plasticity model within a finite element framework and
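For context on what a fragility assessment produces: seismic PRA conventionally expresses fragility as a lognormal curve, P_f(a) = Φ(ln(a/A_m)/β), with median capacity A_m and composite logarithmic standard deviation β. This standard form and the parameter values below are assumed for illustration, not taken from the paper.

```python
import math
from statistics import NormalDist

def fragility(a, a_m=0.9, beta=0.4):
    """Probability of failure at ground-motion level a (e.g., PGA in g)."""
    return NormalDist().cdf(math.log(a / a_m) / beta)

print(fragility(0.9))               # at the median capacity, P_f = 0.5
print(fragility(0.3), fragility(1.5))  # curve rises monotonically with demand
```

Characterizing the uncertainty in the constitutive-model parameters, as Part I does, feeds directly into β: more parameter uncertainty flattens the curve and widens the range of ground motions over which failure is plausible.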

  2. Seismic fragility of RC shear walls in nuclear power plant Part 1: Characterization of uncertainty in concrete constitutive model

    Energy Technology Data Exchange (ETDEWEB)

    Syed, Sammiuddin [Department of Civil, Construction, and Environmental Engineering, North Carolina State University, 426 Mann Hall, Campus Box 7908, Raleigh, NC 27695-7908 (United States); Gupta, Abhinav, E-mail: agupta1@ncsu.edu [Department of Civil, Construction, and Environmental Engineering, North Carolina State University, 413 Mann Hall, Campus Box 7908, Raleigh, NC 27695-7908 (United States)

    2015-12-15

Highlights: • A framework is proposed for seismic fragility assessment of reinforced concrete structures. • Experimentally validated finite element models are used to conduct nonlinear simulations. • Critical parameters in the concrete constitutive model are identified to conduct nonlinear simulations. • Uncertainties in the parameters of the concrete damage plasticity model are characterized. • Closed-form expressions are used to compute the damage variables and plasticity. - Abstract: This two-part manuscript proposes a framework for seismic fragility assessment of reinforced concrete structures in nuclear energy facilities. The novelty of the proposed approach lies in the characterization of uncertainties in the parameters of the material constitutive model. Concrete constitutive models that comprehensively address different damage states such as tensile cracking, compression failure, stiffness degradation, and recovery of degraded stiffness due to closing of previously formed cracks under dynamic loading are generally defined in terms of a large number of variables to characterize the plasticity and damage at material level. Over the past several years, many different studies have been presented on evaluation of fragility for reinforced concrete structures using nonlinear time history simulations. However, almost none of these studies considers uncertainties in the parameters of a comprehensive constitutive model. Part-I of this two-part manuscript presents a study that is used to identify uncertainties associated with the critical parameters in the nonlinear concrete damage plasticity model proposed by Lubliner et al. (1989. Int. J. Solids Struct., 25(3), 299) and later modified by Lee and Fenves (1998a. J. Eng. Mech., ASCE, 124(8), 892) and Lee and Fenves (1998b. Earthquake Eng. Struct. Dyn., 27(9), 937) for the purpose of seismic fragility assessment. The limitations in implementation of the damage plasticity model within a finite element framework and

  3. Unlearning Established Organizational Routines--Part II

    Science.gov (United States)

    Fiol, C. Marlena; O'Connor, Edward J.

    2017-01-01

    Purpose: The purpose of Part II of this two-part paper is to uncover important differences in the nature of the three unlearning subprocesses, which call for different leadership interventions to motivate people to move through them. Design/methodology/approach: The paper draws on research in behavioral medicine and psychology to demonstrate that…

  4. Forecast communication through the newspaper Part 2: perceptions of uncertainty

    Science.gov (United States)

    Harris, Andrew J. L.

    2015-04-01

    In the first part of this review, I defined the media filter and how it can operate to frame and blame the forecaster for losses incurred during an environmental disaster. In this second part, I explore the meaning and role of uncertainty when a forecast, and its basis, is communicated through the response and decision-making chain to the newspaper, especially during a rapidly evolving natural disaster which has far-reaching business, political, and societal impacts. Within the media-based communication system, there remains a fundamental disconnect between stakeholders in the definition of uncertainty and in the interpretation of the delivered forecast. The definition and use of uncertainty differ especially between scientific, media, business, and political stakeholders. This is a serious problem for the scientific community when delivering forecasts to the public through the press. As reviewed in Part 1, the media filter can result in a negative frame, which is itself a result of bias, slant, spin, and agenda setting introduced during passage of the forecast and its uncertainty through the media filter. The result is invariably one of anger and fury, which causes loss of credibility and blaming of the forecaster. Generation of a negative frame can be aided by opacity of the decision-making process that the forecast is used to support. The impact of the forecast will be determined during passage through the decision-making chain, where the precautionary principle and cost-benefit analysis, for example, will likely be applied. Choice of forecast delivery format, vehicle of communication, syntax of delivery, and lack of follow-up measures can further contribute to causing the forecast and its role to be misrepresented. Follow-up measures to negative frames may include appropriately worded press releases and conferences that target forecast misrepresentation or misinterpretation in an attempt to swing the slant back in favor of the forecaster. Review of

  5. SFR inverse modelling Part 2. Uncertainty factors of predicted flow in deposition tunnels and uncertainty in distribution of flow paths from deposition tunnels

    International Nuclear Information System (INIS)

    Holmen, Johan

    2007-10-01

    The Swedish Nuclear Fuel and Waste Management Co (SKB) is operating the SFR repository for low- and intermediate-level nuclear waste. An update of the safety analysis of SFR was carried out by SKB as the SAFE project (Safety Assessment of Final Disposal of Operational Radioactive Waste). The aim of the project was to update the safety analysis and to produce a safety report. The safety report has been submitted to the Swedish authorities. This study is a continuation of the SAFE project and concerns the hydrogeological modelling of the SFR repository, which was carried out as part of the SAFE project; it describes the uncertainty in the tunnel flow and the distributions of flow paths from the storage tunnels. Uncertainty factors are produced for two different flow situations, corresponding to 2,000 AD (the sea covers the repository) and 4,000 AD (the sea has retreated from the repository area). Uncertainty factors are produced for the different deposition tunnels. The uncertainty factors are discussed in Chapter 2, and two lists (matrices) of uncertainty factors have been delivered as a part of this study. Flow paths are produced for two different flow situations, corresponding to 2,000 AD (the sea covers the repository) and 5,000 AD (the sea has retreated from the repository area). Flow paths from the different deposition tunnels have been simulated, considering the above-discussed base case and the 60 realisations that passed all tests of this base case. The flow paths are presented and discussed in Chapter 3, and files presenting the results of the flow path analyses have been delivered as part of this study. The uncertainty factors (see Chapter 2) are not independent from the flow path data (see Chapter 3). When stochastic calculations are performed using a transport model with the data presented in this study as input, the corresponding uncertainty factors and flow path data should be used. 
This study also includes a brief discussion of

  6. Nuclear medicine and thyroid disease - part II

    International Nuclear Information System (INIS)

    Chatterton, B.E.

    2005-01-01

    Part 1 of this article discussed the anatomy, physiology and basic pathology of the thyroid gland. Techniques of thyroid scanning and a few clinical examples are shown in Part II. Copyright (2005) The Australian and New Zealand Society of Nuclear Medicine Inc

  7. Recent Economic Perspectives on Political Economy, Part II*

    Science.gov (United States)

    Dewan, Torun; Shepsle, Kenneth A.

    2013-01-01

    In recent years some of the best theoretical work on the political economy of political institutions and processes has begun surfacing outside the political science mainstream in high quality economics journals. This two-part paper surveys these contributions from a recent five-year period. In Part I, the focus is on elections, voting and information aggregation, followed by treatments of parties, candidates, and coalitions. In Part II, papers on economic performance and redistribution, constitutional design, and incentives, institutions, and the quality of political elites are discussed. Part II concludes with a discussion of the methodological bases common to economics and political science, the way economists have used political science research, and some new themes and arbitrage opportunities. PMID:23606754

  8. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Royer, E.; Gillford, J.

    2013-01-01

    The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of the best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of the activities of EGUAM/NSC, the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactors (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting the uncertainty propagation in stand-alone neutronics calculations, while Phases II and III are focused on uncertainty analysis of the reactor core and system, respectively. This paper discusses the progress made in the Phase I calculations, the specifications for Phase II, and the incoming challenges in defining the Phase III exercises. The challenges of applying uncertainty quantification to complex code systems, in particular time-dependent coupled physics models, are the large computational burden and the utilization of non-linear models (expected due to the physics coupling). (authors)

  9. Reliability of a new biokinetic model of zirconium in internal dosimetry: part I, parameter uncertainty analysis.

    Science.gov (United States)

    Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph

    2011-12-01

    The reliability of biokinetic models is essential in internal dose assessments and radiation risk analysis for the public, occupational workers, and patients exposed to radionuclides. In this paper, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. The paper is divided into two parts. In the first part of the study, published here, the uncertainty sources of the model parameters for zirconium (Zr), developed by the International Commission on Radiological Protection (ICRP), were identified and analyzed. Furthermore, the uncertainty of the biokinetic experimental measurements performed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU) for developing a new biokinetic model of Zr was analyzed according to the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. The confidence intervals and distributions of the model parameters of the ICRP and HMGU Zr biokinetic models were evaluated. From the biokinetic model computations, the mean, standard uncertainty, and confidence interval of the model predictions, calculated on the basis of the model parameter uncertainty, were presented and compared to the plasma clearance and urinary excretion measured after intravenous administration. It was shown that for the most important compartment, the plasma, the uncertainty evaluated for the HMGU model was much smaller than that for the ICRP model; the same phenomenon was observed for other organs and tissues as well. The uncertainty of the integral of the radioactivity of Zr up to 50 y calculated by the HMGU model after ingestion by adult members of the public was shown to be smaller by a factor of two than that of the ICRP model. It was also shown that the distribution type of the model parameters strongly influences the model prediction, and that the correlation of the model input parameters affects the model prediction to a
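The kind of parameter-uncertainty propagation this record describes can be sketched with a minimal Monte Carlo example. The mono-exponential retention function and all parameter values below are illustrative assumptions, not the actual ICRP or HMGU Zr models:

```python
import math
import random

def plasma_retention(t, k):
    """Illustrative mono-exponential retention R(t) = exp(-k*t);
    real biokinetic models have many coupled compartments."""
    return math.exp(-k * t)

def propagate(n_samples=10000, t=5.0, k_mean=0.3, k_sd=0.05, seed=42):
    """Sample the clearance rate k from an assumed normal distribution and
    return the mean and standard uncertainty of the model prediction."""
    rng = random.Random(seed)
    preds = [plasma_retention(t, rng.gauss(k_mean, k_sd))
             for _ in range(n_samples)]
    mean = sum(preds) / n_samples
    var = sum((p - mean) ** 2 for p in preds) / (n_samples - 1)
    return mean, math.sqrt(var)

mean, u = propagate()
```

The spread of the sampled predictions plays the role of the standard uncertainty of the model prediction discussed in the abstract; replacing the normal distribution with another distribution type changes the result, which is exactly the sensitivity the authors report.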

  10. Risk, probability and uncertainty in the calculations of gas cooled reactor of PBMR type. Part 2

    International Nuclear Information System (INIS)

    Serbanescu, Dan

    2004-01-01

    The paper presents the main conclusions of the insights into a gas cooled reactor from the perspective of the following notions: probability, uncertainty, entropy and risk. Some results of the ongoing comparison between the insights obtained from three models and approaches are presented. The approaches consider the Pebble Bed Modular Reactor (PBMR) NPP as a thermodynamic installation and as a hierarchical system, with or without considering the information exchange between its various levels. The existing model was the basis for a PRA being carried out in phases for the PBMR. Results from phase II of this PRA were presented in the first part of this paper. Further activities, in preparation for the phase II PRA and for the development of a specific application of PRA during the design phases of the PBMR, are under way, with some preliminary results and conclusions. However, for the purposes of this paper and the comparative review of various models, part two presents the risk model (model B), based on the assumptions and ideas laid down as the basis of the future inter-comparison of this model with other plant models. The assumptions concern: the uncertainties in the quantification of frequencies; the list of initiating events; interfaces with the deterministic calculations; the integrated evaluation of all the plant states; the risk of the release of radionuclides; the balance between the number and function of the active systems and the passive systems; system interdependencies in the PBMR PRA; and the use of PRA for the evaluation of the impact of various design changes on plant risk. Model B basically allows evaluating the level of risk of the plant by calculating it as a result of the acceptance challenge to the plant. Using this model, the departure from a reference state is given by the variation in the risk metric adopted for the study. The paper also presents the synergetic model (model C). The evaluation of risk in model C also considers the information process. The

  11. Reservoir souring: Problems, uncertainties and modelling. Part I: Problems and uncertainty involved in prediction. Part II: Preliminary investigations of a computational model

    International Nuclear Information System (INIS)

    Paulsen, J.E.; Read, P.A.; Thompson, C.P.; Jelley, C.; Lezeau, P.

    1996-01-01

    The paper relates to improved oil recovery (IOR) techniques by mathematical modelling. The uncertainty involved in the modelling of reservoir souring is discussed. IOR processes are speculated to influence a souring process in a positive direction. Most models do not take into account pH in reservoir fluids, and thus do not account for the partitioning behaviour of sulfide. Also, sulfide is antagonistic to bacterial metabolism and impedes the sulfate reduction rate; this may be an important factor in modelling. Biofilms are thought to play a crucial role in a reservoir souring process. Biofilm in a reservoir matrix is different from biofilm in open systems. This has a major impact on microbial transport and behaviour. Studies on microbial activity in reservoir matrices must be carried out with model cores in order to mimic a realistic situation. Sufficient data do not exist today. The main conclusion is that a model does not reflect a true situation before the nature of these elements is understood. A simplified version of a Norwegian-developed biofilm model is discussed. The model incorporates all the important physical phenomena studied in the above references, such as bacterial growth limited by nutrients and/or energy sources and hydrogen sulfide adsorption. 18 refs., 8 figs., 1 tab
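As a rough illustration of the kind of biofilm souring model the abstract describes (bacterial growth limited by a nutrient via Monod kinetics, hydrogen sulfide production, and adsorption), here is a minimal forward-Euler sketch; every rate constant below is an invented placeholder, not a value from the Norwegian model:

```python
# Minimal forward-Euler sketch of souring dynamics: sulfate-reducing
# bacteria grow on a limiting nutrient (Monod kinetics), produce H2S in
# proportion to growth, and part of the free H2S adsorbs to the rock
# matrix. All parameter values are invented placeholders.

def simulate_souring(steps=1000, dt=0.01,
                     mu_max=1.0, k_s=0.5,   # Monod growth parameters
                     yield_h2s=0.4,         # H2S produced per unit growth
                     k_ads=0.2):            # first-order adsorption rate
    biomass, nutrient = 0.1, 10.0
    h2s_free, h2s_adsorbed = 0.0, 0.0
    for _ in range(steps):
        mu = mu_max * nutrient / (k_s + nutrient)  # Monod specific growth rate
        growth = mu * biomass * dt
        adsorbed = k_ads * h2s_free * dt
        biomass += growth
        nutrient -= growth                 # growth consumes the nutrient
        h2s_free += yield_h2s * growth - adsorbed
        h2s_adsorbed += adsorbed
    return biomass, nutrient, h2s_free, h2s_adsorbed

biomass, nutrient, h2s_free, h2s_adsorbed = simulate_souring()
```

The sketch deliberately omits pH and sulfide partitioning, the very effects the abstract flags as missing from most souring models; adding them would change the free/adsorbed split.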

  12. Reservoir souring: Problems, uncertainties and modelling. Part I: Problems and uncertainty involved in prediction. Part II: Preliminary investigations of a computational model

    Energy Technology Data Exchange (ETDEWEB)

    Paulsen, J.E. [Rogalandsforskning, Stavanger (Norway); Read, P.A.; Thompson, C.P.; Jelley, C.; Lezeau, P.

    1996-12-31

    The paper relates to improved oil recovery (IOR) techniques by mathematical modelling. The uncertainty involved in the modelling of reservoir souring is discussed. IOR processes are speculated to influence a souring process in a positive direction. Most models do not take into account pH in reservoir fluids, and thus do not account for the partitioning behaviour of sulfide. Also, sulfide is antagonistic to bacterial metabolism and impedes the sulfate reduction rate; this may be an important factor in modelling. Biofilms are thought to play a crucial role in a reservoir souring process. Biofilm in a reservoir matrix is different from biofilm in open systems. This has a major impact on microbial transport and behaviour. Studies on microbial activity in reservoir matrices must be carried out with model cores in order to mimic a realistic situation. Sufficient data do not exist today. The main conclusion is that a model does not reflect a true situation before the nature of these elements is understood. A simplified version of a Norwegian-developed biofilm model is discussed. The model incorporates all the important physical phenomena studied in the above references, such as bacterial growth limited by nutrients and/or energy sources and hydrogen sulfide adsorption. 18 refs., 8 figs., 1 tab.

  13. PIO I-II tendencies. Part 2. Improving the pilot modeling

    Directory of Open Access Journals (Sweden)

    Ioan URSU

    2011-03-01

    The study is conceived in two parts and aims to contribute to the problem of PIO aircraft susceptibility analysis. Part I, previously published in this journal, highlighted the main steps of deriving a complex model of the human pilot. The current Part II of the paper considers a proper procedure for the synthesis of the human pilot mathematical model in order to analyze PIO II type susceptibility of a VTOL-type aircraft, related to the presence of a position- and rate-limited actuator. The mathematical tools are those of the semi-global stability theory developed in recent works.

  14. Metatarsalgia caused by synovitis and instability of the metatarsophalangeal joint of the second toe

    International Nuclear Information System (INIS)

    Gerstner G, Juan Bernardo

    2002-01-01

    Synovitis and instability of the metatarsophalangeal (MP) joint of the second toe are the most frequent causes of metatarsalgia localized in this joint of the foot, and they are frequently misdiagnosed and poorly managed by the general orthopedist. The natural history encompasses stages as early as synovitis without alteration of the peri-articular structures, passing through frank instability, and ending with angular deformities and complete luxation of the MP joint. A meticulous and directed history, a precise physical examination and classification of the diagnosis are the keys to successful management of this pathology. Surgical correction of this condition should always be combined with correction of associated deformities such as hallux valgus and claw toes

  15. Quantifying type I and type II errors in decision-making under uncertainty : The case of GM crops

    NARCIS (Netherlands)

    Ansink, Erik; Wesseler, Justus

    2009-01-01

    In a recent paper, Hennessy and Moschini (American Journal of Agricultural Economics 88(2): 308-323, 2006) analyse the interactions between scientific uncertainty and costly regulatory actions. We use their model to analyse the costs of making type I and type II errors, in the context of the

  16. Quantifying type I and type II errors in decision-making under uncertainty: the case of GM crops

    NARCIS (Netherlands)

    Ansink, E.J.H.; Wesseler, J.H.H.

    2009-01-01

    In a recent paper, Hennessy and Moschini (American Journal of Agricultural Economics 88(2): 308-323, 2006) analyse the interactions between scientific uncertainty and costly regulatory actions. We use their model to analyse the costs of making type I and type II errors, in the context of the

  17. The Combined ASTER MODIS Emissivity over Land (CAMEL) Part 2: Uncertainty and Validation

    Directory of Open Access Journals (Sweden)

    Michelle Feltz

    2018-04-01

    Under the National Aeronautics and Space Administration’s (NASA) Making Earth System Data Records for Use in Research Environments (MEaSUREs) Land Surface Temperature and Emissivity project, a new global land surface emissivity dataset has been produced by the University of Wisconsin–Madison Space Science and Engineering Center and NASA’s Jet Propulsion Laboratory (JPL). This new dataset, termed the Combined ASTER MODIS Emissivity over Land (CAMEL), is created by the merging of the UW–Madison MODIS baseline-fit emissivity dataset (UWIREMIS) and JPL’s ASTER Global Emissivity Dataset v4 (GEDv4). CAMEL consists of a monthly, 0.05° resolution emissivity for 13 hinge points within the 3.6–14.3 µm region and is extended to 417 infrared spectral channels using a principal component regression approach. An uncertainty product is provided for the 13 hinge-point emissivities by combining temporal, spatial, and algorithm variability as part of a total uncertainty estimate. Part 1 of this paper series describes the methodology for creating the CAMEL emissivity product and the corresponding high spectral resolution algorithm. This paper, Part 2 of the series, details the methodology of the CAMEL uncertainty calculation and provides an assessment of the CAMEL emissivity product through comparisons with (1) ground site lab measurements; (2) a long-term Infrared Atmospheric Sounding Interferometer (IASI) emissivity dataset derived from 8 years of data; and (3) forward-modeled IASI brightness temperatures using the Radiative Transfer for TOVS (RTTOV) radiative transfer model. Global monthly results are shown for different seasons and International Geosphere-Biosphere Programme land classifications, and case study examples are shown for locations with different land surface types.
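The abstract states that temporal, spatial, and algorithm variability are combined into a total uncertainty estimate. A common way to combine independent components, shown here as an assumption about the combination rule rather than a quotation of the CAMEL algorithm, is addition in quadrature:

```python
import math

def total_uncertainty(u_temporal, u_spatial, u_algorithm):
    """Combine independent uncertainty components in quadrature
    (root-sum-of-squares). Whether CAMEL uses exactly this rule is an
    assumption; the abstract only names the three components."""
    return math.sqrt(u_temporal**2 + u_spatial**2 + u_algorithm**2)

# Illustrative emissivity uncertainties at a single hinge point
u_tot = total_uncertainty(0.005, 0.003, 0.004)
```

Quadrature is the standard GUM-style rule when the components are uncorrelated; correlated components would require covariance terms.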

  18. Validating the standard for the National Board Dental Examination Part II.

    Science.gov (United States)

    Tsai, Tsung-Hsun; Neumann, Laura M; Littlefield, John H

    2012-05-01

    As part of the overall exam validation process, the Joint Commission on National Dental Examinations periodically reviews and validates the pass/fail standard for the National Board Dental Examination (NBDE), Parts I and II. The most recent standard-setting activities for NBDE Part II used the Objective Standard Setting method. This report describes the process used to set the pass/fail standard for the 2009 exam. The failure rate on the NBDE Part II increased from 5.3 percent in 2008 to 13.7 percent in 2009 and then decreased to 10 percent in 2010. This article describes the Objective Standard Setting method and presents the estimated probabilities of classification errors based on the beta binomial mathematical model. The results show that the probability of correct classification of candidate performance is very high (0.97) and that the probabilities of false negative and false positive errors are very small (0.03 and <0.001, respectively). The low probability of classification errors supports the conclusion that the pass/fail score on the NBDE Part II is a valid guide for making decisions about candidates for dental licensure.
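The classification-error idea can be illustrated with a simplified binomial sketch (the actual NBDE analysis uses a beta binomial model, which this does not reproduce; the item count, cut score, and proficiency values below are invented):

```python
from math import comb

def pass_probability(p_true, n_items, cut_score):
    """P(observed score >= cut_score) for a candidate whose true chance of
    answering any single item correctly is p_true, assuming independent
    items (a plain binomial; the NBDE analysis itself uses a beta
    binomial model)."""
    return sum(comb(n_items, k) * p_true**k * (1.0 - p_true)**(n_items - k)
               for k in range(cut_score, n_items + 1))

# Invented exam: 200 items, cut score 150 (75%).
# A truly proficient candidate who fails is a false negative;
# a below-standard candidate who passes is a false positive.
false_negative = 1.0 - pass_probability(0.80, 200, 150)
false_positive = pass_probability(0.70, 200, 150)
```

Even this crude sketch shows why both error probabilities shrink as the exam length grows: the observed score concentrates around the candidate's true proficiency.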

  19. Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting

    Science.gov (United States)

    Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.

    2009-04-01

    In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty were introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests, where rainfall observations were used instead of forecasts. The 'model error' can, especially in upstream catchments where forecast uncertainty is strongly dependent on the current predictability of the atmosphere, be superimposed on the spread of a hydrometeorological ensemble forecast. In Bavaria, two meteorological ensemble prediction systems are currently tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria, and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty depends on the catchment characteristics: 1. Upstream catchment with high influence of the weather forecast: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on the hydrological case and lead time, is added to each forecast timestep of each ensemble member. d) For each forecast timestep, the overall (i.e. over all 'model error' distributions of each ensemble member) error distribution is calculated. e) From this distribution, the uncertainty range at a desired level (here: the 10% and 90% percentiles) is extracted and drawn as a forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed the 'lead forecast' is chosen and shown in addition to the uncertainty bounds. 
This can be
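Steps a) through e) of the upstream-catchment procedure above can be sketched as follows; the Gaussian form of the 'model error', the member values, and all numbers are hypothetical placeholders:

```python
import random

def forecast_envelope(members, model_error_sd, n_samples=2000,
                      lower=0.10, upper=0.90, seed=1):
    """Steps c)-e): superimpose a 'model error' distribution (here a plain
    Gaussian -- an assumption) on every ensemble member, pool the samples,
    and extract percentiles as the forecast envelope for one timestep."""
    rng = random.Random(seed)
    pooled = sorted(member + rng.gauss(0.0, model_error_sd)
                    for member in members
                    for _ in range(n_samples))
    return pooled[int(lower * len(pooled))], pooled[int(upper * len(pooled))]

# 16 hypothetical ensemble discharge forecasts (m3/s) at one timestep
members = [120, 135, 128, 140, 122, 131, 138, 126,
           133, 129, 141, 124, 136, 130, 127, 132]
lo, hi = forecast_envelope(members, model_error_sd=10.0)
```

In the operational system the 'model error' parameters would be conditioned on the hydrological case and lead time, as step c) notes, rather than held fixed as here.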

  20. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and in doing so assess the applicability of traditional sensitivity analysis techniques.

  1. Globalization in the pharmaceutical industry, Part II.

    Science.gov (United States)

    Casadio Tarabusi, C; Vickery, G

    1998-01-01

    This is the second of a two-part report on the pharmaceutical industry. Part II begins with a discussion of foreign direct investment and inter-firm networks, which covers international mergers, acquisitions, and minority participation; market shares of foreign-controlled firms; international collaboration agreements (with a special note on agreements in biotechnology); and licensing agreements. The final section of the report covers governmental policies on health and safety regulation, price regulation, industry and technology, trade, foreign investment, protection of intellectual property, and competition.

  2. 29 CFR Appendix II to Part 1918 - Tables for Selected Miscellaneous Auxiliary Gear (Mandatory)

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Tables for Selected Miscellaneous Auxiliary Gear (Mandatory) II Appendix II to Part 1918 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND.... 1918, App. II Appendix II to Part 1918—Tables for Selected Miscellaneous Auxiliary Gear (Mandatory...

  3. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    Science.gov (United States)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the “wobble” and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by the four-part prompts consisting of multiple-choice claim, open-ended explanation, Likert-scale uncertainty rating, and open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of uncertainty. We found that (1) only 18% of the students' uncertainty rationale involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationale reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of task such as the use of noisy data or the framing of

  4. Minimizing measurement uncertainties of coniferous needle-leaf optical properties, part II: experimental set-up and error analysis

    NARCIS (Netherlands)

    Yanez Rausell, L.; Malenovsky, Z.; Clevers, J.G.P.W.; Schaepman, M.E.

    2014-01-01

    We present uncertainties associated with the measurement of coniferous needle-leaf optical properties (OPs) with an integrating sphere using an optimized gap-fraction (GF) correction method, where GF refers to the air gaps appearing between the needles of a measured sample. We used an optically

  5. 46 CFR Table II to Part 150 - Grouping of Cargoes

    Science.gov (United States)

    2010-10-01

    ... solution Potassium oleate Potassium salt of polyolefin acid Propyl acetate Propylene carbonate Propylene... lignosulfonate solution Sodium polyacrylate solution 2 Sodium salt of Ferric hydroxyethylethylenediamine... 46 Shipping 5 2010-10-01 2010-10-01 false Grouping of Cargoes II Table II to Part 150 Shipping...

  6. 40 CFR Appendix II to Part 600 - Sample Fuel Economy Calculations

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Sample Fuel Economy Calculations II... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Pt. 600, App. II Appendix II to Part 600—Sample Fuel Economy Calculations (a) This sample fuel economy calculation is applicable to...
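The record's text is elided, but the style of calculation it refers to, combining city and highway fuel economy harmonically rather than arithmetically, can be sketched as follows. The 0.55/0.45 weighting is the familiar label-value convention and is an assumption here, not a quotation of the appendix:

```python
def combined_fuel_economy(city_mpg, highway_mpg, city_weight=0.55):
    """Harmonic (not arithmetic) weighting of city and highway fuel
    economy, the style of combination used in 40 CFR part 600 sample
    calculations. The 0.55/0.45 split is assumed, not quoted."""
    return 1.0 / (city_weight / city_mpg + (1.0 - city_weight) / highway_mpg)

mpg = combined_fuel_economy(20.0, 30.0)
```

The harmonic form is used because fuel consumed per mile (the reciprocal of mpg), not mpg itself, is what averages linearly over driving.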

  7. WE-B-19A-01: SRT II: Uncertainties in SRT

    International Nuclear Information System (INIS)

    Dieterich, S; Schlesinger, D; Geneser, S

    2014-01-01

    SRS delivery has undergone major technical changes in the last decade, transitioning from predominantly frame-based treatment delivery to image-guided, frameless SRS. It is important for medical physicists working in SRS to understand the magnitude and sources of uncertainty involved in delivering SRS treatments for a multitude of technologies (Gamma Knife, CyberKnife, linac-based SRS and protons). Sources of SRS planning and delivery uncertainty include dose calculation, dose fusion, and intra- and inter-fraction motion. Dose calculations for small fields are particularly difficult because of the lack of electronic equilibrium and the greater effect of inhomogeneities within and near the PTV. Going frameless introduces greater setup uncertainty and allows for potentially increased intra- and inter-fraction motion. The increased use of multiple imaging modalities to determine the tumor volume necessitates (deformable) image and contour fusion, and the resulting uncertainties introduced in the image registration process further contribute to the overall treatment planning uncertainty. Each of these uncertainties must be quantified and their impact on treatment delivery accuracy understood. If necessary, the uncertainties may then be accounted for during treatment planning, either through techniques that make the uncertainty explicit or by the appropriate addition of PTV margins. Further complicating matters, the statistics of 1-5 fraction SRS treatments differ from traditional margin recipes relying on Poisson statistics. In this session, we will discuss uncertainties introduced during each step of the SRS treatment planning and delivery process and present margin recipes to appropriately account for such uncertainties. Learning Objectives: To understand the major contributors to the total delivery uncertainty in SRS for Gamma Knife, CyberKnife, and linac-based SRS. 
Learn the various uncertainties introduced by image fusion, deformable image registration, and contouring
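When independent uncertainty sources like these are combined into a planning margin, a common first step is a root-sum-square (quadrature) combination of the per-source standard deviations. A minimal sketch, with hypothetical magnitudes that are illustrative only and not values from this session:

```python
import math

def rss(components):
    """Root-sum-square combination of independent 1-sigma uncertainties (mm)."""
    return math.sqrt(sum(c * c for c in components))

# Hypothetical 1-sigma magnitudes (mm) for a frameless SRS workflow.
sources = {
    "image_registration": 0.5,
    "intrafraction_motion": 0.8,
    "small_field_dosimetry": 0.3,
}
total = rss(sources.values())
print(f"combined 1-sigma uncertainty: {total:.2f} mm")  # prints: combined 1-sigma uncertainty: 0.99 mm
```

Classic margin recipes then scale combined systematic and random terms separately; as the abstract notes, recipes derived for conventionally fractionated treatments need rethinking for 1-5 fraction SRS.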

  8. Typewriting Syllabus: Part II: Modules. 1976 Revision.

    Science.gov (United States)

    New York State Education Dept., Albany. Bureau of Occupational and Career Curriculum Development.

    The document is the second of a two-part set on typewriting and focuses on the nine modules of instruction. The nine modules are: (1) keyboard mastery and skill development, (2) basic typewriting competencies, (2a) personal use typewriting, (3) introduction to office typewriting I, (4) introduction to office typewriting II, (5) intermediate office…

  9. 10 CFR Appendix II to Part 504 - Fuel Price Computation

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING POWERPLANTS Pt. 504, App. II Appendix II to Part... effects of future real price increases for each fuel. The delivered price of an alternate fuel used to calculate delivered fuel expenses must reflect the petitioner's delivered price of the alternate fuel and...

  10. 46 CFR Appendix II to Part 150 - Explanation of Figure 1

    Science.gov (United States)

    2010-10-01

    ... COMPATIBILITY OF CARGOES Pt. 150, App. II Appendix II to Part 150—Explanation of Figure 1 Definition of a..., aromatic hydrocarbons or paraffins. Others will form hazardous combinations with many groups: For example...

  11. 40 CFR Appendix II to Part 1042 - Steady-State Duty Cycles

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Steady-State Duty Cycles II Appendix..., App. II Appendix II to Part 1042—Steady-State Duty Cycles (a) The following duty cycles apply as specified in § 1042.505(b)(1): (1) The following duty cycle applies for discrete-mode testing: E3 mode No...

  12. Kick, Glide, Pole! Cross-Country Skiing Fun (Part II)

    Science.gov (United States)

    Duoos, Bridget A.

    2012-01-01

    Part I of Kick, Glide, Pole! Cross-Country Skiing Fun, which was published in last issue, discussed how to select cross-country ski equipment, dress for the activity and the biomechanics of the diagonal stride. Part II focuses on teaching the diagonal stride technique and begins with a progression of indoor activities. Incorporating this fun,…

  13. Inverse modeling and uncertainty analysis of potential groundwater recharge to the confined semi-fossil Ohangwena II Aquifer, Namibia

    Science.gov (United States)

    Wallner, Markus; Houben, Georg; Lohe, Christoph; Quinger, Martin; Himmelsbach, Thomas

    2017-12-01

    The identification of potential recharge areas and estimation of recharge rates to the confined semi-fossil Ohangwena II Aquifer (KOH-2) is crucial for its future sustainable use. The KOH-2 is located within the endorheic transboundary Cuvelai-Etosha-Basin (CEB), shared by Angola and Namibia. The main objective was the development of a strategy to tackle the problem of data scarcity, which is a well-known problem in semi-arid regions. In a first step, conceptual geological cross sections were created to illustrate the possible geological setting of the system. Furthermore, groundwater travel times were estimated by simple hydraulic calculations. A two-dimensional numerical groundwater model was set up to analyze flow patterns and potential recharge zones. The model was optimized against local observations of hydraulic heads and groundwater age. The sensitivity of the model against different boundary conditions and internal structures was tested. Parameter uncertainty and recharge rates were estimated. Results indicate that groundwater recharge to the KOH-2 mainly occurs from the Angolan Highlands in the northeastern part of the CEB. The sensitivity of the groundwater model to different internal structures is relatively small in comparison to changing boundary conditions in the form of influent or effluent streams. Uncertainty analysis underlined previous results, indicating groundwater recharge originating from the Angolan Highlands. The estimated recharge rates are less than 1% of mean yearly precipitation, which are reasonable for semi-arid regions.

  14. PREREM: an interactive data preprocessing code for INREM II. Part I: user's manual. Part II: code structure

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, M.T.; Fields, D.E.

    1981-05-01

PREREM is an interactive computer code developed as a data preprocessor for the INREM-II (Killough, Dunning, and Pleasant, 1978a) internal dose program. PREREM is intended to provide easy access to current and self-consistent nuclear decay and radionuclide-specific metabolic data sets. Provision is made for revision of metabolic data, and the code is intended for both production and research applications. Documentation for the code is in two parts. Part I is a user's manual which emphasizes interpretation of program prompts and choice of user input. Part II stresses internal structure and flow of program control and is intended to assist the researcher who wishes to revise or modify the code or add to its capabilities. PREREM is written for execution on a Digital Equipment Corporation PDP-10 System and much of the code will require revision before it can be run on other machines. The source program length is 950 lines (116 blocks) and computer core required for execution is 212 K bytes. The user must also have sufficient file space for metabolic and S-factor data sets. Further, 64 100-K-byte blocks of computer storage space are required for the nuclear decay data file. Computer storage space must also be available for any output files produced during the PREREM execution. 9 refs., 8 tabs.

  15. The Intolerance of Uncertainty Inventory: Validity and Comparison of Scoring Methods to Assess Individuals Screening Positive for Anxiety and Depression.

    Science.gov (United States)

    Lauriola, Marco; Mosca, Oriana; Trentini, Cristina; Foschi, Renato; Tambelli, Renata; Carleton, R Nicholas

    2018-01-01

Intolerance of Uncertainty is a fundamental transdiagnostic personality construct hierarchically organized with a core general factor underlying diverse clinical manifestations. The current study evaluated the construct validity of the Intolerance of Uncertainty Inventory, a two-part scale separately assessing a unitary Intolerance of Uncertainty disposition to consider uncertainties to be unacceptable and threatening (Part A) and the consequences of such a disposition, regarding experiential avoidance, chronic doubt, overestimation of threat, worrying, control of uncertain situations, and seeking reassurance (Part B). Community members (N = 1046; mean age = 36.69 ± 12.31 years; 61% females) completed the Intolerance of Uncertainty Inventory with the Beck Depression Inventory-II and the State-Trait Anxiety Inventory. Part A demonstrated a robust unidimensional structure and excellent convergent validity with Part B. A bifactor model was the best fitting model for Part B. Based on these results, we compared hierarchical factor scores and summated rating scores in clinical proxy groups reporting anxiety and depression symptoms. Summated rating scores were associated with both depression and anxiety and increased proportionally with the co-occurrence of depressive and anxious symptoms. By contrast, hierarchical scores were useful for detecting which facets best separated the depression and anxiety groups. In sum, Part A was a reliable and valid transdiagnostic measure of Intolerance of Uncertainty. Part B was arguably more useful for assessing clinical manifestations of Intolerance of Uncertainty for specific disorders, provided that hierarchical scores are used. Overall, our study suggests that clinical assessments might need to shift toward hierarchical factor scores.

  16. The Intolerance of Uncertainty Inventory: Validity and Comparison of Scoring Methods to Assess Individuals Screening Positive for Anxiety and Depression

    Directory of Open Access Journals (Sweden)

    Marco Lauriola

    2018-03-01

Intolerance of Uncertainty is a fundamental transdiagnostic personality construct hierarchically organized with a core general factor underlying diverse clinical manifestations. The current study evaluated the construct validity of the Intolerance of Uncertainty Inventory, a two-part scale separately assessing a unitary Intolerance of Uncertainty disposition to consider uncertainties to be unacceptable and threatening (Part A) and the consequences of such a disposition, regarding experiential avoidance, chronic doubt, overestimation of threat, worrying, control of uncertain situations, and seeking reassurance (Part B). Community members (N = 1046; mean age = 36.69 ± 12.31 years; 61% females) completed the Intolerance of Uncertainty Inventory with the Beck Depression Inventory-II and the State-Trait Anxiety Inventory. Part A demonstrated a robust unidimensional structure and excellent convergent validity with Part B. A bifactor model was the best fitting model for Part B. Based on these results, we compared hierarchical factor scores and summated rating scores in clinical proxy groups reporting anxiety and depression symptoms. Summated rating scores were associated with both depression and anxiety and increased proportionally with the co-occurrence of depressive and anxious symptoms. By contrast, hierarchical scores were useful for detecting which facets best separated the depression and anxiety groups. In sum, Part A was a reliable and valid transdiagnostic measure of Intolerance of Uncertainty. Part B was arguably more useful for assessing clinical manifestations of Intolerance of Uncertainty for specific disorders, provided that hierarchical scores are used. Overall, our study suggests that clinical assessments might need to shift toward hierarchical factor scores.

  17. Propagation of uncertainties through the oil spill model MEDSLIK-II: operational application to the Black Sea

    Science.gov (United States)

    Liubartseva, Svitlana; Coppini, Giovanni; Ciliberti, Stefania Angela; Lecci, Rita

    2017-04-01

... Uncertainty imprints in the oil mass balance components are also analyzed. This work is conducted in the framework of the REACT Project funded by Fondazione CON IL SUD/Brains2South. References Ciliberti, S.A., Peneva, E., Storto, A., Kandilarov, R., Lecci, R., Yang, C., Coppini, G., Masina, S., Pinardi, N., 2016. Implementation of Black Sea numerical model based on NEMO and 3DVAR data assimilation scheme for operational forecasting, Geophys. Res. Abs., 18, EGU2016-16222. De Dominicis, M., Pinardi, N., Zodiatis, G., Lardner, R., 2013. MEDSLIK-II, a Lagrangian marine surface oil spill model for short-term forecasting - Part 1: Theory, Geosci. Model Dev., 6, 1851-1869. Liubartseva, S., Coppini, G., Pinardi, N., De Dominicis, M., Lecci, R., Turrisi, G., Cretì, S., Martinelli, S., Agostini, P., Marra, P., Palermo, F., 2016. Decision support system for emergency management of oil spill accidents in the Mediterranean Sea, Nat. Hazards Earth Syst. Sci., 16, 2009-2020.

  18. Intelligent Information Retrieval: Diagnosing Information Need. Part II. Uncertainty Expansion in a Prototype of a Diagnostic IR Tool.

    Science.gov (United States)

    Cole, Charles; Cantero, Pablo; Sauve, Diane

    1998-01-01

    Outlines a prototype of an intelligent information-retrieval tool to facilitate information access for an undergraduate seeking information for a term paper. Topics include diagnosing the information need, Kuhlthau's information-search-process model, Shannon's mathematical theory of communication, and principles of uncertainty expansion and…

  19. Calculus of Elementary Functions, Part II. Teacher's Commentary. Revised Edition.

    Science.gov (United States)

    Herriot, Sarah T.; And Others

    This course is intended for students who have a thorough knowledge of college preparatory mathematics, including algebra, axiomatic geometry, trigonometry, and analytic geometry. This teacher's guide is for Part II of the course. It is designed to follow Part I of the text. The guide contains background information, suggested instructional…

  20. Calculus of Elementary Functions, Part II. Student Text. Revised Edition.

    Science.gov (United States)

    Herriot, Sarah T.; And Others

    This course is intended for students who have a thorough knowledge of college preparatory mathematics, including algebra, axiomatic geometry, trigonometry, and analytic geometry. This text, Part II, contains material designed to follow Part I. Chapters included in this text are: (6) Derivatives of Exponential and Related Functions; (7) Area and…

  1. Nursing Care of Patients Undergoing Chemotherapy Desensitization: Part II.

    Science.gov (United States)

    Jakel, Patricia; Carsten, Cynthia; Carino, Arvie; Braskett, Melinda

    2016-04-01

    Chemotherapy desensitization protocols are safe, but labor-intensive, processes that allow patients with cancer to receive medications even if they initially experienced severe hypersensitivity reactions. Part I of this column discussed the pathophysiology of hypersensitivity reactions and described the development of desensitization protocols in oncology settings. Part II incorporates the experiences of an academic medical center and provides a practical guide for the nursing care of patients undergoing chemotherapy desensitization.

  2. 40 CFR Appendix II to Part 1054 - Duty Cycles for Laboratory Testing

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Duty Cycles for Laboratory Testing II.... 1054, App. II Appendix II to Part 1054—Duty Cycles for Laboratory Testing (a) Test handheld engines with the following steady-state duty cycle: G3 mode No. Engine speed a Torque(percent) b Weighting...

  3. NUPEC BWR Full-size Fine-mesh Bundle Test (BFBT) Benchmark. Volume II: uncertainty and sensitivity analyses of void distribution and critical power - Specification

    International Nuclear Information System (INIS)

    Aydogan, F.; Hochreiter, L.; Ivanov, K.; Martin, M.; Utsuno, H.; Sartori, E.

    2010-01-01

    This report provides the specification for the uncertainty exercises of the international OECD/NEA, NRC and NUPEC BFBT benchmark problem including the elemental task. The specification was prepared jointly by Pennsylvania State University (PSU), USA and the Japan Nuclear Energy Safety (JNES) Organisation, in cooperation with the OECD/NEA and the Commissariat a l'energie atomique (CEA Saclay, France). The work is sponsored by the US NRC, METI-Japan, the OECD/NEA and the Nuclear Engineering Program (NEP) of Pennsylvania State University. This uncertainty specification covers the fourth exercise of Phase I (Exercise-I-4), and the third exercise of Phase II (Exercise II-3) as well as the elemental task. The OECD/NRC BFBT benchmark provides a very good opportunity to apply uncertainty analysis (UA) and sensitivity analysis (SA) techniques and to assess the accuracy of thermal-hydraulic models for two-phase flows in rod bundles. During the previous OECD benchmarks, participants usually carried out sensitivity analysis on their models for the specification (initial conditions, boundary conditions, etc.) to identify the most sensitive models or/and to improve the computed results. The comprehensive BFBT experimental database (NEA, 2006) leads us one step further in investigating modelling capabilities by taking into account the uncertainty analysis in the benchmark. The uncertainties in input data (boundary conditions) and geometry (provided in the benchmark specification) as well as the uncertainties in code models can be accounted for to produce results with calculational uncertainties and compare them with the measurement uncertainties. Therefore, uncertainty analysis exercises were defined for the void distribution and critical power phases of the BFBT benchmark. 
This specification is intended to provide definitions related to UA/SA methods, sensitivity/uncertainty parameters, suggested probability distribution functions (PDF) of sensitivity parameters, and selected…
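The propagation exercise described above (assumed PDFs on boundary conditions, calculational uncertainty bands on the computed results) can be illustrated with a toy Monte Carlo loop. The surrogate formula below is a stand-in assumption for illustration only, not a BFBT thermal-hydraulic model, and the numerical values are invented:

```python
import random
import statistics

def void_fraction_model(pressure, flow_rate, power):
    """Toy surrogate standing in for a thermal-hydraulic code (assumption:
    the real benchmark exercises use full subchannel/system codes)."""
    return min(1.0, 0.4 * power / (flow_rate * pressure))

random.seed(0)
samples = []
for _ in range(10_000):
    # Sample each uncertain boundary condition from its assumed PDF.
    p = random.gauss(7.2, 0.07)    # pressure [MPa], ~1% standard deviation
    g = random.gauss(55.0, 0.55)   # flow rate [t/h]
    q = random.gauss(4.5, 0.045)   # bundle power [MW]
    samples.append(void_fraction_model(p, g, q))

mean = statistics.fmean(samples)
std = statistics.pstdev(samples)
print(f"void fraction: {mean:.4f} +/- {std:.4f} (1 sigma)")
```

The resulting output distribution can then be compared against the measurement uncertainty, which is the basic idea of the benchmark's UA exercises.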

  4. 40 CFR Appendix II to Part 1039 - Steady-State Duty Cycles

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Steady-State Duty Cycles II Appendix... Appendix II to Part 1039—Steady-State Duty Cycles (a) The following duty cycles apply for constant-speed engines: (1) The following duty cycle applies for discrete-mode testing: D2 mode number Engine speed...

  5. First international 26Al interlaboratory comparison - Part II

    International Nuclear Information System (INIS)

    Merchel, Silke; Bremser, Wolfram

    2005-01-01

After finishing Part I of the first international 26Al interlaboratory comparison with accelerator mass spectrometry (AMS) laboratories [S. Merchel, W. Bremser, Nucl. Instr. and Meth. B 223-224 (2004) 393], the evaluation of Part II with radionuclide counting laboratories took place. The evaluation of the results of the seven participating laboratories on four meteorite samples shows good overall agreement between laboratories, i.e. it does not reveal any statistically significant differences if results are compared sample-by-sample. However, a certain interlaboratory bias is observed with a more detailed statistical analysis including some multivariate approaches

  6. Programming Models for Three-Dimensional Hydrodynamics on the CM-5 (Part II)

    International Nuclear Information System (INIS)

    Amala, P.A.K.; Rodrigue, G.H.

    1994-01-01

This is a two-part presentation of a timing study on the Thinking Machines Corp. CM-5 computer. Part II is given in this study and represents domain-decomposition and message-passing models. Part I described computational problems using a SIMD model and Connection Machine Fortran (CMF)

  7. Continuum Thermodynamics - Part II: Applications and Examples

    Science.gov (United States)

    Albers, Bettina; Wilmanski, Krzysztof

The intention in writing Part II of the book on continuum thermodynamics was the deepening of some issues covered in Part I as well as the development of certain skills in dealing with practical problems of macroscopic processes. However, the main motivation for this part is the presentation of the main facets of thermodynamics which appear when interdisciplinary problems are considered. There are many monographs on the subjects of solid mechanics and thermomechanics, on fluid mechanics and on coupled fields, but most of them cover only special problems in great detail which are characteristic for the chosen field. It is rather seldom that relations between these fields are discussed. This concerns, for instance, large deformations of the skeleton of porous materials with diffusion (e.g. lungs), couplings of deformable particles with the fluid motion in suspensions, couplings of adsorption processes and chemical reactions in immiscible mixtures with diffusion, various multi-component aspects of the motion, e.g. of avalanches, such as segregation processes, etc.

  8. Benchmark matrix and guide: Part II.

    Science.gov (United States)

    1991-01-01

    In the last issue of the Journal of Quality Assurance (September/October 1991, Volume 13, Number 5, pp. 14-19), the benchmark matrix developed by Headquarters Air Force Logistics Command was published. Five horizontal levels on the matrix delineate progress in TQM: business as usual, initiation, implementation, expansion, and integration. The six vertical categories that are critical to the success of TQM are leadership, structure, training, recognition, process improvement, and customer focus. In this issue, "Benchmark Matrix and Guide: Part II" will show specifically how to apply the categories of leadership, structure, and training to the benchmark matrix progress levels. At the intersection of each category and level, specific behavior objectives are listed with supporting behaviors and guidelines. Some categories will have objectives that are relatively easy to accomplish, allowing quick progress from one level to the next. Other categories will take considerable time and effort to complete. In the next issue, Part III of this series will focus on recognition, process improvement, and customer focus.

  9. 12 CFR Appendix II to Part 27 - Information for Government Monitoring Purposes

    Science.gov (United States)

    2010-01-01

    ... II Appendix II to Part 27 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY... Monitoring Purposes The following language is approved by the Comptroller of the Currency and will satisfy... used separately. This information may also be provided orally by the applicant. The following...

  10. 31 CFR Appendix II to Part 13 - Form of Bill for Reimbursement

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Form of Bill for Reimbursement II Appendix II to Part 13 Money and Finance: Treasury Office of the Secretary of the Treasury PROCEDURES FOR... title) of ______ (Country) to participate in the work of ______ (International Organization) or...

  11. 10 CFR Appendix II to Part 1050 - DOE Form 3735.3-Foreign Travel Statement

    Science.gov (United States)

    2010-01-01

    ... is official agency business. Spouses and dependents may accept such travel and expenses only when... 10 Energy 4 2010-01-01 2010-01-01 false DOE Form 3735.3-Foreign Travel Statement II Appendix II to.... II Appendix II to Part 1050—DOE Form 3735.3—Foreign Travel Statement EC01OC91.041 Statement...

  12. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    NARCIS (Netherlands)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if

  13. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    Science.gov (United States)

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

The study is focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models where a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high-tier exposure modelling tool allowing propagation of uncertainty to the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions describing the data input, and its effect on model results has been examined by applying sensitivity analysis techniques (screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated the feasibility of employing MERLIN-Expo in integrated, high-tier exposure assessment. Copyright © 2016 Elsevier B.V. All rights reserved.
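As a rough illustration of the screening step, a crude Morris-style elementary-effects calculation on a toy exposure surrogate is sketched below. The model, parameter names, and ranges are invented for illustration and are not taken from MERLIN-Expo or the paper:

```python
import random
import statistics

def blood_concentration(half_life, body_weight, intake_rate):
    """Toy steady-state exposure surrogate (assumption: stands in for the
    far more detailed PBPK model)."""
    k = 0.693 / half_life          # first-order elimination rate
    return intake_rate / (k * body_weight)

PARAMS = {  # (low, high) ranges; illustrative values only
    "half_life": (5.0, 10.0),
    "body_weight": (50.0, 90.0),
    "intake_rate": (0.5, 2.0),
}

def elementary_effects(model, params, n_trajectories=200, delta=0.1):
    """Crude Morris-style screening: mean absolute elementary effect per input.
    (Real Morris designs use structured trajectories; this is a simplification.)"""
    random.seed(1)
    effects = {name: [] for name in params}
    for _ in range(n_trajectories):
        base = {n: random.uniform(lo, hi) for n, (lo, hi) in params.items()}
        y0 = model(**base)
        for name, (lo, hi) in params.items():
            bumped = dict(base)
            step = delta * (hi - lo)
            bumped[name] = min(base[name] + step, hi)
            if bumped[name] == base[name]:
                continue  # already at the upper bound
            effects[name].append(abs((model(**bumped) - y0) / step))
    return {n: statistics.fmean(v) for n, v in effects.items()}

mu_star = elementary_effects(blood_concentration, PARAMS)
for name, mu in sorted(mu_star.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} mu* = {mu:.3f}")
```

Inputs with large mean absolute effects would then be carried forward into the more expensive variance-based (EFAST-type) analysis.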

  14. Quantifying reactor safety margins: Application of CSAU [Code Scalability, Applicability and Uncertainty] methodology to LBLOCA: Part 3, Assessment and ranging of parameters for the uncertainty analysis of LBLOCA codes

    International Nuclear Information System (INIS)

    Wulff, W.; Boyack, B.E.; Duffey, R.B.

    1988-01-01

    Comparisons of results from TRAC-PF1/MOD1 code calculations with measurements from Separate Effects Tests, and published experimental data for modeling parameters have been used to determine the uncertainty ranges of code input and modeling parameters which dominate the uncertainty in predicting the Peak Clad Temperature for a postulated Large Break Loss of Coolant Accident (LBLOCA) in a four-loop Westinghouse Pressurized Water Reactor. The uncertainty ranges are used for a detailed statistical analysis to calculate the probability distribution function for the TRAC code-predicted Peak Clad Temperature, as is described in an attendant paper. Measurements from Separate Effects Tests and Integral Effects Tests have been compared with results from corresponding TRAC-PF1/MOD1 code calculations to determine globally the total uncertainty in predicting the Peak Clad Temperature for LBLOCAs. This determination is in support of the detailed statistical analysis mentioned above. The analyses presented here account for uncertainties in input parameters, in modeling and scaling, in computing and in measurements. The analyses are an important part of the work needed to implement the Code Scalability, Applicability and Uncertainty (CSAU) methodology. CSAU is needed to determine the suitability of a computer code for reactor safety analyses and the uncertainty in computer predictions. The results presented here are used to estimate the safety margin of a particular nuclear reactor power plant for a postulated accident. 25 refs., 10 figs., 11 tabs

  15. A Survey of Optometry Graduates to Determine Practice Patterns: Part II: Licensure and Practice Establishment Experiences.

    Science.gov (United States)

    Bleimann, Robert L.; Smith, Lee W.

    1985-01-01

    A summary of Part II of a two-volume study of optometry graduates conducted by the Association of Schools and Colleges of Optometry is presented. Part II includes the analysis of the graduates' licensure and practice establishment experiences. (MLW)

  16. Interactions between perceived uncertainty types in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2018-01-01

to avoid business failure. A conceptual framework of four uncertainty types is investigated: environmental, technological, organisational, and relational uncertainty. We present insights from four empirical cases of service dyads collected via multiple sources of evidence, including 54 semi-structured interviews, observations, and secondary data. The cases show seven interaction paths with direct knock-on effects between two uncertainty types and indirect knock-on effects between three or four uncertainty types. The findings suggest a causal chain from environmental, technological, and organisational, to relational uncertainty. This research contributes to the servitization literature by (i) confirming the existence of uncertainty types, (ii) providing an in-depth characterisation of technological uncertainty, and (iii) showing the interaction paths between four uncertainty types in the form of a causal…

  17. Healing and relaxation in flows of helium II. Part II. First, second, and fourth sound

    International Nuclear Information System (INIS)

    Hills, R.N.; Roberts, P.H.

    1978-01-01

    In Part I of this series, a theory of helium II incorporating the effects of quantum healing and relaxation was developed. In this paper, the propagation of first, second, and fourth sound is discussed. Particular attention is paid to sound propagation in the vicinity of the lambda point where the effects of relaxation and quantum healing become important

  18. Blade System Design Study. Part II, final project report (GEC).

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, Dayton A. (DNV Global Energy Concepts Inc., Seattle, WA)

    2009-05-01

As part of the U.S. Department of Energy's Low Wind Speed Turbine program, Global Energy Concepts LLC (GEC) has studied alternative composite materials for wind turbine blades in the multi-megawatt size range. This work is one of the Blade System Design Studies (BSDS) funded through Sandia National Laboratories. The BSDS program was conducted in two phases. In the Part I BSDS, GEC assessed candidate innovations in composite materials, manufacturing processes, and structural configurations. GEC also made recommendations for testing composite coupons, details, assemblies, and blade substructures to be carried out in the Part II study (BSDS-II). The BSDS-II contract period began in May 2003, and testing was initiated in June 2004. The current report summarizes the results from the BSDS-II test program. Composite materials evaluated include carbon fiber in both pre-impregnated and vacuum-assisted resin transfer molding (VARTM) forms. Initial thin-coupon static testing included a wide range of parameters, including variation in manufacturer, fiber tow size, fabric architecture, and resin type. A smaller set of these materials and process types was also evaluated in thin-coupon fatigue testing, and in ply-drop and ply-transition panels. The majority of materials used epoxy resin, with vinyl ester (VE) resin also used for selected cases. Late in the project, testing of unidirectional fiberglass was added to provide an updated baseline against which to evaluate the carbon material performance. Numerous unidirectional carbon fabrics were considered for evaluation with VARTM infusion. All but one fabric style considered suffered either from poor infusibility or waviness of fibers combined with poor compaction. The exception was a triaxial carbon-fiberglass fabric produced by SAERTEX. This fabric became the primary choice for infused articles throughout the test program. The generally positive results obtained in this program for the SAERTEX material have led to its…

  19. The Mid America Heart Institute: part II.

    Science.gov (United States)

    McCallister, Ben D; Steinhaus, David M

    2003-01-01

    The Mid America Heart Institute (MAHI) is one of the first and largest hospitals developed and designed specifically for cardiovascular care. The MAHI hybrid model, which is a partnership between the not-for-profit Saint Luke's Health System, an independent academic medical center, and a private practice physician group, has been extremely successful in providing high-quality patient care as well as developing strong educational and research programs. The Heart Institute has been the leader in providing cardiovascular care in the Kansas City region since its inception in 1975. Although challenges in the future are substantial, it is felt that the MAHI is in an excellent position to deal with the serious issues in health care because of the Heart Institute, its facility, organization, administration, dedicated medical and support staff, and its unique business model of physician management. In part I, the authors described the background and infrastructure of the Heart Institute. In part II, cardiovascular research and benefits of physician management are addressed.

  20. Decision-Making under Criteria Uncertainty

    Science.gov (United States)

    Kureychik, V. M.; Safronenkova, I. B.

    2018-05-01

    Uncertainty is an essential part of a decision-making procedure. The paper deals with the problem of decision-making under criteria uncertainty. In this context, decision-making under uncertainty and the types and conditions of uncertainty are examined, and the decision-making problem under uncertainty is formalized. A modification of a mathematical decision support method under uncertainty via ontologies is proposed; a critical distinction of the developed method is its use of ontologies as base elements. The goal of this work is the development of a decision-making method under criteria uncertainty that uses ontologies in the area of multilayer board design. The method is intended to improve the technical and economic characteristics of designs in this domain.

  1. Marketing in the E-Business World, Parts I & II | Smith | LBS ...

    African Journals Online (AJOL)

    Marketing in the E-Business World, Parts I & II. ... of many of America's largest companies gather at the Waldorf Astoria Hotel in New York City for the Conference Board's Annual Marketing Conference.

  2. 40 CFR Appendix II to Part 1045 - Duty Cycles for Propulsion Marine Engines

    Science.gov (United States)

    2010-07-01

    40 CFR, Protection of Environment, Pt. 1045, App. II (2010-07-01). Appendix II to Part 1045—Duty Cycles for Propulsion Marine Engines. (a) The following duty cycle applies for discrete-mode testing (E4 cycle; table columns: Mode No., Engine speed, Torque (percent))...

  3. Methods of humidity determination Part II: Determination of material humidity

    OpenAIRE

    Rübner, Katrin; Balköse, Devrim; Robens, E.

    2008-01-01

    Part II covers the most common methods of measuring the humidity of solid material. State of water near solid surfaces, gravimetric measurement of material humidity, measurement of water sorption isotherms, chemical methods for determination of water content, measurement of material humidity via the gas phase, standardisation, cosmonautical observations are reviewed.

  4. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Science.gov (United States)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
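Constraint (a) above rests on the convexity of damages in warming: for a convex damage function, a mean-preserving increase in uncertainty raises expected damages by Jensen's inequality. The quadratic damage function and the warming distributions in the sketch below are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Convex damage function: damages accelerate with warming (assumption for clarity).
def damages(warming_c):
    return warming_c**2

mean_warming = 3.0
# Two warming distributions with the same mean but different uncertainty.
low_spread = rng.normal(mean_warming, 0.5, 100_000)
high_spread = rng.normal(mean_warming, 1.5, 100_000)

# Jensen's inequality: E[damages] grows with the spread even at fixed mean,
# since E[X^2] = mu^2 + sigma^2 for each distribution.
e_low = damages(low_spread).mean()    # ~ 9.25
e_high = damages(high_spread).mean()  # ~ 11.25
```

Greater uncertainty (the wider distribution) yields strictly larger expected damages, which is the ordinal constraint the abstract describes.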

  5. THE SPECTRUM AND TERM ANALYSIS OF V II

    International Nuclear Information System (INIS)

    Thorne, A. P.; Pickering, J. C.; Semeniuk, J. I.

    2013-01-01

    The spectrum and extended term analysis of V II are presented. Fourier transform spectrometry was used to record high resolution spectra of singly ionized vanadium in the region 1492-5800 Å (67020-17260 cm⁻¹) with vanadium-neon and vanadium-argon hollow cathode lamps as sources. The wavenumber uncertainty for the center of gravity of the strongest lines is typically 0.002 cm⁻¹, an improvement of an order of magnitude over previous measurements. Most of the lines exhibit partly resolved hyperfine structure. The V II energy levels in the 1985 compilation of Sugar and Corliss have been confirmed and revised, with the exception of the high-lying 4f levels and eight of the lower levels. Thirty-nine of the additional eighty-five high levels published by Iglesias et al. have also been confirmed and revised, and three of their missing levels have been found. The energy uncertainty of the revised levels has been reduced by about an order of magnitude. In total, 176 even levels and 233 odd levels are presented. Wavenumbers and classifications are given for 1242 V II lines.

  6. The year 2012 in the European Heart Journal-Cardiovascular Imaging. Part II.

    Science.gov (United States)

    Plein, Sven; Knuuti, Juhani; Edvardsen, Thor; Saraste, Antti; Piérard, Luc A; Maurer, Gerald; Lancellotti, Patrizio

    2013-07-01

    Part II of the best of the European Heart Journal - Cardiovascular Imaging in 2012 focuses specifically on studies of valvular heart diseases, heart failure, cardiomyopathies, and congenital heart diseases.

  7. On the Efficiency of Connection Charges---Part II: Integration of Distributed Energy Resources

    OpenAIRE

    Munoz-Alvarez, Daniel; Garcia-Franco, Juan F.; Tong, Lang

    2017-01-01

    This two-part paper addresses the design of retail electricity tariffs for distribution systems with distributed energy resources (DERs). Part I presents a framework to optimize an ex-ante two-part tariff for a regulated monopolistic retailer who faces stochastic wholesale prices on the one hand and stochastic demand on the other. In Part II, the integration of DERs is addressed by analyzing their endogenous effect on the optimal two-part tariff and the induced welfare gains. Two DER integrat...

  8. Proposed standardized definitions for vertical resolution and uncertainty in the NDACC lidar ozone and temperature algorithms - Part 3: Temperature uncertainty budget

    Science.gov (United States)

    Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Haefele, Alexander; Payen, Guillaume; Liberti, Gianluigi

    2016-08-01

    A standardized approach for the definition, propagation, and reporting of uncertainty in the temperature lidar data products contributing to the Network for the Detection of Atmospheric Composition Change (NDACC) database is proposed. One important aspect of the proposed approach is the ability to propagate all independent uncertainty components in parallel through the data processing chain. The individual uncertainty components are then combined together at the very last stage of processing to form the temperature combined standard uncertainty. The identified uncertainty sources comprise major components such as signal detection, saturation correction, background noise extraction, temperature tie-on at the top of the profile, and absorption by ozone if working in the visible spectrum, as well as other components such as molecular extinction, the acceleration of gravity, and the molecular mass of air, whose magnitudes depend on the instrument, data processing algorithm, and altitude range of interest. The expression of the individual uncertainty components and their step-by-step propagation through the temperature data processing chain are thoroughly estimated, taking into account the effect of vertical filtering and the merging of multiple channels. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which means that covariance terms must be taken into account when vertical filtering is applied and when temperature is integrated from the top of the profile. Quantitatively, the uncertainty budget is presented in a generic form (i.e., as a function of instrument performance and wavelength), so that any NDACC temperature lidar investigator can easily estimate the expected impact of individual uncertainty components in the case of their own instrument. 
Using this standardized approach, an example of uncertainty budget is provided for the Jet Propulsion Laboratory (JPL) lidar at Mauna Loa Observatory, Hawai'i, which is
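The parallel-propagation scheme described above can be sketched numerically. The component names, magnitudes, and smoothing kernel below are illustrative assumptions, not the actual NDACC or JPL budget; the sketch only conveys that independent components are carried separately and combined in quadrature at the last step, and that vertically correlated components behave differently under filtering than uncorrelated ones:

```python
import numpy as np

# Hypothetical per-altitude uncertainty components (kelvin), one array per source.
n_alt = 5
u_detection = np.array([0.1, 0.2, 0.4, 0.8, 1.6])   # random noise (uncorrelated in altitude)
u_saturation = np.full(n_alt, 0.3)                   # correction term (correlated in altitude)
u_tie_on = np.array([0.05, 0.1, 0.2, 0.5, 1.0])      # tie-on at the top of the profile

# Each component is propagated separately and combined only at the very end.
u_combined = np.sqrt(u_detection**2 + u_saturation**2 + u_tie_on**2)

# Vertical filtering with kernel c: an uncorrelated component's variance contracts
# (sum of c_i^2 < 1 for a smoothing filter), while a fully correlated component
# passes through unchanged because the covariance terms restore the full variance.
c = np.array([0.25, 0.5, 0.25])                      # simple smoothing kernel, sums to 1
u_uncorr_filtered = np.sqrt(np.sum((c * 0.4)**2))    # uncorrelated component, u = 0.4 K
u_corr_filtered = np.sum(c * 0.3)                    # correlated component, u = 0.3 K
```

The filtered uncorrelated component shrinks below its unfiltered value, while the correlated one does not, which is why covariance terms cannot be ignored when vertical filtering is applied.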

  9. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  10. ARI3SG: Aerosol retention in the secondary side of a steam generator. Part II: Model validation and uncertainty analysis

    International Nuclear Information System (INIS)

    Lopez, Claudia; Herranz, Luis E.

    2012-01-01

    Highlights: ► Validation of a model (ARI3SG) for the aerosol retention in the break stage of a steam generator under SGTR conditions. ► Interpretation of the experimental SGTR and CAAT data by using the ARI3SG model. ► Assessment of the effect of epistemic and stochastic uncertainties on the ARI3SG results. - Abstract: A large body of data has been gathered in the last decade through the EU-SGTR, ARTIST and ARTIST 2 projects for aerosol retention in the steam generator during SGTR severe accident sequences. At the same time, the attempt to extend the analytical capability has resulted in models that need to be validated. ARI3SG is one such development, built to estimate the aerosol retention in the break stage of a “dry” steam generator. This paper assesses the ARI3SG predictability by comparing its estimates to open data and by analyzing the effect of associated uncertainties. Data-model comparison has been shown to be satisfactory and highlights the potential use of an ARI3SG-like formulation in system codes.

  11. Subseabed disposal program annual report, January-December 1979. Volume II. Appendices (principal investigator progress reports). Part 2 of 2

    International Nuclear Information System (INIS)

    Talbert, D.M.

    1981-04-01

    Volume II of the sixth annual report describing the progress and evaluating the status of the Subseabed Disposal Program contains the appendices referred to in Volume II, Summary and Status. Because of the length of Volume II, it has been split into two parts for publication purposes. Part 1 contains Appendices A-O; Part 2 contains Appendices P-FF. Separate abstracts have been prepared for each appendix for inclusion in the Energy Data Base

  12. Decay heat uncertainty quantification of MYRRHA

    Directory of Open Access Journals (Sweden)

    Fiorito Luca

    2017-01-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the propagation of nuclear data uncertainties and covariances to the MYRRHA decay heat. Radioactive decay data, independent fission yield, and cross section uncertainties/covariances were propagated using two nuclear data sampling codes, namely NUDUNA and SANDY. According to the results, ²³⁸U cross sections and fission yield data are the largest contributors to the MYRRHA decay heat uncertainty. The calculated uncertainty values are deemed acceptable from the safety point of view as they are well within the available regulatory limits.
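The sampling-based propagation performed by codes such as NUDUNA and SANDY can be illustrated with a deliberately simplified sketch. The three-nuclide "decay heat" model, the nominal values, and the uncertainty levels below are all invented; only the procedure (perturb the nuclear data, re-evaluate the observable, take sample statistics) reflects the method described in the record:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy observable: decay heat as a weighted sum over a few nuclides of
# (fission yield) * (mean energy per decay). Values are stand-ins.
yields = np.array([0.060, 0.030, 0.015])      # independent fission yields
energies = np.array([0.50, 1.20, 0.80])       # MeV per decay (illustrative)
rel_u = {"yields": 0.05, "energies": 0.02}    # assumed 1-sigma relative uncertainties

def decay_heat(y, e):
    return float(np.sum(y * e))               # arbitrary-units toy observable

# Monte Carlo nuclear-data sampling: draw perturbed data sets, re-evaluate,
# and estimate the decay heat uncertainty from the sample spread.
n_samples = 5000
samples = np.empty(n_samples)
for i in range(n_samples):
    y = yields * rng.normal(1.0, rel_u["yields"], size=3)
    e = energies * rng.normal(1.0, rel_u["energies"], size=3)
    samples[i] = decay_heat(y, e)

mean, std = samples.mean(), samples.std(ddof=1)
```

With real covariance matrices the perturbations would be drawn jointly rather than independently, which is precisely what the sampling codes automate.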

  13. Subseabed disposal program annual report, January-December 1980. Volume II. Appendices (principal investigator progress reports). Part 1

    International Nuclear Information System (INIS)

    Hinga, K.R.

    1981-07-01

    Volume II of the sixth annual report describing the progress and evaluating the status of the Subseabed Disposal Program contains the appendices referred to in Volume I, Summary and Status. Because of the length of Volume II, it has been split into two parts for publication purposes. Part 1 contains Appendices A-Q; Part 2 contains Appendices R-MM. Separate abstracts have been prepared for each appendix for inclusion in the Energy Data Base

  14. Subseabed disposal program annual report, January-December 1980. Volume II. Appendices (principal investigator progress reports). Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Hinga, K.R. (ed.)

    1981-07-01

    Volume II of the sixth annual report describing the progress and evaluating the status of the Subseabed Disposal Program contains the appendices referred to in Volume I, Summary and Status. Because of the length of Volume II, it has been split into two parts for publication purposes. Part 1 contains Appendices A-Q; Part 2 contains Appendices R-MM. Separate abstracts have been prepared for each appendix for inclusion in the Energy Data Base.

  15. Macro Expectations, Aggregate Uncertainty, and Expected Term Premia

    DEFF Research Database (Denmark)

    Dick, Christian D.; Schmeling, Maik; Schrimpf, Andreas

    2013-01-01

    as well as aggregate macroeconomic uncertainty at the level of individual forecasters. We find that expected term premia are (i) time-varying and reasonably persistent, (ii) strongly related to expectations about future output growth, and (iii) positively affected by uncertainty about future output growth and inflation rates. Expectations about real macroeconomic variables seem to matter more than expectations about nominal factors. Additional findings on term structure factors suggest that the level and slope factor capture information related to uncertainty about real and nominal macroeconomic prospects...

  16. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  17. Starting a hospital-based home health agency: Part II--Key success factors.

    Science.gov (United States)

    Montgomery, P

    1993-09-01

    In Part II of a three-part series, the financial, technological and legislative issues of a hospital-based home health agency are discussed. Beginning a home healthcare service requires intensive research to answer key environmental and operational questions--need, competition, financial projections, initial start-up costs and the impact of delayed depreciation. Assessments involving technology, staffing, and legislative and regulatory issues can help project service volume, productivity and cost control.

  18. Association Between National Board Dental Examination Part II Scores and Comprehensive Examinations at Harvard School of Dental Medicine.

    Science.gov (United States)

    Lee, Min Kyeong; Allareddy, Veerasathpurush; Howell, T Howard; Karimbux, Nadeem Y

    2011-01-01

    Harvard School of Dental Medicine (HSDM) uses a hybrid problem-based approach to teaching in the predoctoral program. The objective structured clinical examination (OSCE) is a formative examination designed to assess the performance of students in the problem-based learning (PBL) curriculum. At HSDM three comprehensive examinations with OSCE components are administered during the third and fourth years of clinical training. The National Board Dental Examination (NBDE) Part II is taken in the final year of the predoctoral program. This study examines the association between the NBDE Part II and the comprehensive exams held at HSDM. Predoctoral students from the HSDM classes of 2005 and 2006 were included in this study. The outcome variable of interest was the scores obtained by students in the NBDE Part II, and the main independent variable of interest was the performance of students in the comprehensive exams (honors, pass, make-up exam to pass). The Mann-Whitney U-test was used to examine the association between the grades obtained in each of the three comprehensive exams and the NBDE Part II scores. Multivariable linear regression analysis was also used to examine the association between the NBDE Part II scores and the comprehensive exam grades. The effect of potential confounding factors including age, sex, and race/ethnicity was adjusted for. The results suggest that students who performed well in the comprehensive exams performed better on the NBDE Part II, even after adjusting for confounding factors. Future studies will examine the long-term impact of PBL on postdoctoral plans and career choices.

  19. Treatment of uncertainties in atmospheric chemical systems: A combined modeling and experimental approach

    Science.gov (United States)

    Pun, Betty Kong-Ling

    1998-12-01

    Uncertainty is endemic in modeling. This thesis is a two-phase program to understand the uncertainties in urban air pollution model predictions and in field data used to validate them. Part I demonstrates how to improve atmospheric models by analyzing the uncertainties in these models and using the results to guide new experimentation endeavors. Part II presents an experiment designed to characterize atmospheric fluctuations, which have significant implications towards the model validation process. A systematic study was undertaken to investigate the effects of uncertainties in the SAPRC mechanism for gas-phase chemistry in polluted atmospheres. The uncertainties of more than 500 parameters were compiled, including reaction rate constants, product coefficients, organic composition, and initial conditions. Uncertainty propagation using the Deterministic Equivalent Modeling Method (DEMM) revealed that the uncertainties in ozone predictions can be up to 45% based on these parametric uncertainties. The key parameters found to dominate the uncertainties of the predictions include photolysis rates of NO2, O3, and formaldehyde; the rate constant for nitric acid formation; and initial amounts of NOx and VOC. Similar uncertainty analysis procedures applied to two other mechanisms used in regional air quality models led to the conclusion that in the presence of parametric uncertainties, the mechanisms cannot be discriminated. Research efforts should focus on reducing parametric uncertainties in photolysis rates, reaction rate constants, and source terms. A new tunable diode laser (TDL) infrared spectrometer was designed and constructed to measure multiple pollutants simultaneously in the same ambient air parcels. The sensitivities of the one hertz measurements were 2 ppb for ozone, 1 ppb for NO, and 0.5 ppb for NO2. Meteorological data were also collected for wind, temperature, and UV intensity. 
The field data showed clear correlations between ozone, NO, and NO2 in the one
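The thesis propagates parametric uncertainties with DEMM, a polynomial-expansion method; a simpler first-order (linearized) propagation conveys the same core idea, that output uncertainty is dominated by the most sensitive and most uncertain inputs. The algebraic "mechanism", nominal values, and uncertainty levels below are hypothetical stand-ins, not the SAPRC parameters:

```python
import numpy as np

# Toy response: peak ozone as an assumed algebraic function of a photolysis
# rate j, a rate constant k, and initial NOx. Purely illustrative.
def peak_o3(p):
    j, k, nox = p
    return 100.0 * j**0.8 * nox**0.4 / k**0.2

nominal = np.array([1.0, 1.0, 1.0])
rel_unc = np.array([0.20, 0.10, 0.30])   # assumed 1-sigma relative uncertainties

# First-order propagation: u_y^2 = sum_i (dy/dp_i * u_i)^2, with gradients
# from forward finite differences. Also returns each input's variance share.
def propagate(f, p0, rel_u, h=1e-6):
    y0 = f(p0)
    var = 0.0
    contributions = []
    for i in range(len(p0)):
        dp = p0.copy()
        dp[i] += h
        grad = (f(dp) - y0) / h
        term = (grad * rel_u[i] * p0[i])**2
        var += term
        contributions.append(term)
    return y0, np.sqrt(var), np.array(contributions) / var

y0, u_y, frac = propagate(peak_o3, nominal, rel_unc)
```

Here the photolysis rate dominates the output variance (large exponent and large assumed uncertainty), mirroring the thesis finding that photolysis rates dominate ozone prediction uncertainty.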

  20. Proposed standardized definitions for vertical resolution and uncertainty in the NDACC lidar ozone and temperature algorithms - Part 2: Ozone DIAL uncertainty budget

    Science.gov (United States)

    Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Godin-Beekmann, Sophie; Haefele, Alexander; Trickl, Thomas; Payen, Guillaume; Liberti, Gianluigi

    2016-08-01

    A standardized approach for the definition, propagation, and reporting of uncertainty in the ozone differential absorption lidar data products contributing to the Network for the Detection of Atmospheric Composition Change (NDACC) database is proposed. One essential aspect of the proposed approach is the propagation in parallel of all independent uncertainty components through the data processing chain before they are combined together to form the ozone combined standard uncertainty. The independent uncertainty components contributing to the overall budget include random noise associated with signal detection, uncertainty due to saturation correction, background noise extraction, the absorption cross sections of O3, NO2, SO2, and O2, the molecular extinction cross sections, and the number densities of the air, NO2, and SO2. The expression of the individual uncertainty components and their step-by-step propagation through the ozone differential absorption lidar (DIAL) processing chain are thoroughly estimated. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which requires knowledge of the covariance matrix when the lidar signal is vertically filtered. In addition, the covariance terms must be taken into account if the same detection hardware is shared by the lidar receiver channels at the absorbed and non-absorbed wavelengths. The ozone uncertainty budget is presented as much as possible in a generic form (i.e., as a function of instrument performance and wavelength) so that all NDACC ozone DIAL investigators across the network can estimate, for their own instrument and in a straightforward manner, the expected impact of each reviewed uncertainty component. 
In addition, two actual examples of full uncertainty budget are provided, using nighttime measurements from the tropospheric ozone DIAL located at the Jet Propulsion Laboratory (JPL) Table Mountain Facility, California, and nighttime measurements from the JPL

  1. Post-BEMUSE Reflood Model Input Uncertainty Methods (PREMIUM) Benchmark Phase II: Identification of Influential Parameters

    International Nuclear Information System (INIS)

    Kovtonyuk, A.; Petruzzi, A.; D'Auria, F.

    2015-01-01

    The objective of the Post-BEMUSE Reflood Model Input Uncertainty Methods (PREMIUM) benchmark is to progress on the issue of the quantification of the uncertainty of the physical models in system thermal-hydraulic codes by considering a concrete case: the physical models involved in the prediction of core reflooding. The PREMIUM benchmark consists of five phases. This report presents the results of Phase II, dedicated to the identification of the uncertain code parameters associated with physical models used in the simulation of reflooding conditions. This identification is made on the basis of Test 216 of the FEBA/SEFLEX programme according to the following steps: - identification of influential phenomena; - identification of the associated physical models and parameters, depending on the code used; - quantification of the variation range of identified input parameters through a series of sensitivity calculations. A procedure for the identification of potentially influential code input parameters has been set up in the Specifications of Phase II of the PREMIUM benchmark. A set of quantitative criteria has also been proposed for the identification of influential input parameters and their respective variation ranges. Thirteen participating organisations, using 8 different codes (7 system thermal-hydraulic codes and 1 sub-channel module of a system thermal-hydraulic code), submitted Phase II results. The base case calculations show spread in predicted cladding temperatures and quench front propagation that has been characterized. All the participants, except one, predict a too fast quench front progression. Besides, the cladding temperature time trends obtained by almost all the participants show oscillatory behaviour which may have numerical origins. The criteria adopted for identification of influential input parameters differ between the participants: some organisations used the set of criteria proposed in the Specifications 'as is', some modified the quantitative thresholds
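The Phase II procedure (vary each candidate input parameter over a trial range and retain those whose effect on a figure of merit exceeds a quantitative threshold) can be sketched as a one-at-a-time screening. The response model, parameter names, ranges, and threshold below are invented for illustration, not taken from any participant's code:

```python
# Stand-in figure of merit: a linear "peak cladding temperature" response to
# three hypothetical reflood-model input parameters (kelvin).
def peak_clad_temp(htc_factor, quench_velocity, droplet_diameter):
    return 1100.0 - 150.0 * htc_factor - 80.0 * quench_velocity + 5.0 * droplet_diameter

nominal = {"htc_factor": 1.0, "quench_velocity": 1.0, "droplet_diameter": 1.0}
ranges = {
    "htc_factor": (0.5, 2.0),
    "quench_velocity": (0.8, 1.2),
    "droplet_diameter": (0.5, 1.5),
}
threshold = 50.0  # kelvin; parameters moving the result less than this are screened out

base = peak_clad_temp(**nominal)
influential = []
for name, (lo, hi) in ranges.items():
    deltas = []
    for value in (lo, hi):
        trial = dict(nominal, **{name: value})          # perturb one parameter at a time
        deltas.append(abs(peak_clad_temp(**trial) - base))
    if max(deltas) >= threshold:
        influential.append(name)
```

Only the heat-transfer factor survives this screening in the toy setup; in the benchmark the same filtering is applied with code-specific criteria, which is exactly where the participants diverged.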

  2. Intelligent control of HVAC systems. Part II: perceptron performance analysis

    Directory of Open Access Journals (Sweden)

    Ioan URSU

    2013-09-01

    This is the second part of a paper on intelligent control of Heating, Ventilating, and Air-Conditioning (HVAC) systems. The study as a whole proposes a unified approach to the design of intelligent control for such systems, to ensure high energy efficiency and improved air quality. In the first part of the study, a single thermal space HVAC system is considered as the benchmark system, for which a mathematical model of the controlled system and a mathematical model (algorithm) of intelligent control synthesis are assigned. The intelligent control is of switching type, between a simple neural network, a perceptron, which aims to decrease (optimize) a cost index, and a fuzzy logic component having a supervisory antisaturating role for the neuro-control. Based on numerical simulations, this Part II focuses on the analysis of system operation in the presence of the neural control component only. The working of the entire neuro-fuzzy system will be reported in a third part of the study.

  3. Uncertainty and sensitivity analysis in reactivity-initiated accident fuel modeling: synthesis of Organisation for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) benchmark on reactivity-initiated accident codes phase-II

    Directory of Open Access Journals (Sweden)

    Olivier Marchand

    2018-03-01

    In the framework of the OECD/NEA Working Group on Fuel Safety, a RIA fuel-rod-code Benchmark Phase I was organized in 2010-2013. It consisted of four experiments on highly irradiated fuel rodlets tested under different experimental conditions. This benchmark revealed the need to better understand the basic models incorporated in each code for realistic simulation of the complicated integral RIA tests with high-burnup fuel rods. A second phase of the benchmark (Phase II) was thus launched early in 2014, organized in two complementary activities: (1) comparison of the results of different simulations on simplified cases in order to provide additional bases for understanding the differences in modelling of the concerned phenomena; (2) assessment of the uncertainty of the results. The present paper provides a summary and conclusions of the second activity of the Benchmark Phase II, which is based on the input uncertainty propagation methodology. The main conclusion is that uncertainties cannot fully explain the difference between the code predictions. Finally, based on the RIA benchmark Phase-I and Phase-II conclusions, some recommendations are made. Keywords: RIA, Codes Benchmarking, Fuel Modelling, OECD

  4. International Working Group on Fast Reactors Thirteenth Annual Meeting. Summary Report. Part II

    International Nuclear Information System (INIS)

    1980-10-01

    The Thirteenth Annual Meeting of the IAEA International Working Group on Fast Reactors was held at the IAEA Headquarters, Vienna, Austria from 9 to 11 April 1980. The Summary Report (Part I) contains the Minutes of the Meeting. The Summary Report (Part II) contains the papers which review the national programme in the field of LMFBRs and other presentations at the Meeting. The Summary Report (Part III) contains the discussions on the review of the national programmes

  5. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
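As a concrete point of reference (not the measure-independent formalism the paper develops), a standard entropic uncertainty relation, the Maassen-Uffink bound H(X) + H(Z) >= -log2 c with c the maximum squared overlap between the two measurement bases, can be checked numerically for a qubit:

```python
import numpy as np

def shannon(probs):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    probs = np.array([p for p in probs if p > 1e-12])
    return float(-np.sum(probs * np.log2(probs)))

# Qubit state |0> and two incompatible measurements: the Z basis {|0>, |1>}
# and the X basis {|+>, |->}. These are mutually unbiased, so c = 1/2 and
# the Maassen-Uffink bound is H(Z) + H(X) >= -log2(1/2) = 1 bit.
psi = np.array([1.0, 0.0])
z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
x_basis = [np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)]

p_z = [abs(np.vdot(b, psi))**2 for b in z_basis]   # [1, 0], so H(Z) = 0
p_x = [abs(np.vdot(b, psi))**2 for b in x_basis]   # [1/2, 1/2], so H(X) = 1
bound = -np.log2(0.5)
total = shannon(p_z) + shannon(p_x)
```

For |0> the relation is saturated: certainty in Z forces maximal uncertainty in X, which is the trade-off that every joint uncertainty measure, including the new ones the paper proposes, must capture.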

  6. Measuring uncertainty within the theory of evidence

    CERN Document Server

    Salicone, Simona

    2018-01-01

    This monograph considers the evaluation and expression of measurement uncertainty within the mathematical framework of the Theory of Evidence. With a new perspective on metrology, the text paves the way for innovative applications in a wide range of areas. Building on Simona Salicone's Measurement Uncertainty: An Approach via the Mathematical Theory of Evidence, the material covers further developments of the Random Fuzzy Variable (RFV) approach to uncertainty and provides a more robust mathematical and metrological background for the combination of measurement results, leading to a more effective RFV combination method. While the first part of the book introduces measurement uncertainty, the Theory of Evidence, and fuzzy sets, the following parts bring together these concepts and derive an effective methodology for the evaluation and expression of measurement uncertainty. A supplementary downloadable program allows readers to interact with the proposed approach by generating and combining ...

  7. Mama Software Features: Uncertainty Testing

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  8. Delivery systems for biopharmaceuticals. Part II: Liposomes, Micelles, Microemulsions and Dendrimers.

    Science.gov (United States)

    Silva, Ana C; Lopes, Carla M; Lobo, José M S; Amaral, Maria H

    2015-01-01

    Biopharmaceuticals are a generation of drugs that include peptides, proteins, nucleic acids and cell products. Owing to their particular molecular characteristics (e.g. high molecular size, susceptibility to enzymatic degradation), these products present some limitations for administration, and usually parenteral routes are the only option. To overcome these limitations, different colloidal carriers (e.g. liposomes, micelles, microemulsions and dendrimers) have been proposed to improve biopharmaceuticals delivery. Liposomes are promising drug delivery systems, although some limitations have been reported (e.g. in vivo failure, poor long-term stability and low transfection efficiency), and only a limited number of formulations have reached the market. Micelles and microemulsions require more studies to exclude some of the observed drawbacks and guarantee their potential for use in the clinic. Given their peculiar structures, dendrimers have been showing good results for nucleic acids delivery, and substantial development of these systems is expected in the coming years. This is Part II of two review articles, which provide the state of the art of biopharmaceuticals delivery systems. Part II deals with liposomes, micelles, microemulsions and dendrimers.

  9. Signs of revision in Don Quixote, Part II

    Directory of Open Access Journals (Sweden)

    Gonzalo Pontón

    2016-11-01

    This article provides new evidence in favour of the hypothesis that Cervantes, after finishing Don Quixote, Part II, partially revised the original, introducing some significant changes and additions, mainly in the last chapters. The analysis of some narrative inconsistencies, which cannot be interpreted as mere mistakes but rather as significant textual traces, reveals a process of re-elaboration that affects at least four sections of the novel. Most of the evidence gathered here suggests that this revision is closely linked to Avellaneda's continuation, in the sense that Cervantes tried to challenge the apocryphal Quixote by making last-minute interventions in his own text.

  10. Uncertainties in projecting climate-change impacts in marine ecosystems

    DEFF Research Database (Denmark)

    Payne, Mark; Barange, Manuel; Cheung, William W. L.

    2016-01-01

    Projections of the impacts of climate change on marine ecosystems are a key prerequisite for the planning of adaptation strategies, yet they are inevitably associated with uncertainty. Identifying, quantifying, and communicating this uncertainty is key to both evaluating the risk associated with a projection and building confidence in its robustness. We review how uncertainties in such projections are handled in marine science. We employ an approach developed in climate modelling by breaking uncertainty down into (i) structural (model) uncertainty, (ii) initialization and internal variability ... and highlight the opportunities and challenges associated with doing a better job. We find that even within a relatively small field such as marine science, there are substantial differences between subdisciplines in the degree of attention given to each type of uncertainty. We find that initialization ...

  11. Nursing as concrete philosophy, Part II: Engaging with reality.

    Science.gov (United States)

    Theodoridis, Kyriakos

    2018-04-01

    This is the second paper of an essay in two parts. The first paper (Part I) is a critical discussion of Mark Risjord's conception of nursing knowledge, where I argued against the conception of nursing knowledge as a kind of nursing science. The aim of the present paper (Part II) is to explicate and substantiate the thesis of nursing as a kind of concrete philosophy. My strategy is to elaborate upon certain themes from Wittgenstein's Tractatus in order to canvass a general scheme of philosophy based on a distinction between reality and the world. This distinction will be employed in the appropriation of certain significant features of nursing and nursing knowledge. By elaborating on the contrast between the abstract and the concrete, I will suggest that nursing may be seen as a kind of concrete philosophy, being primarily concerned with reality (and secondarily with the world). This thesis, I will argue, implies that philosophy is the kind of theory that is essential to nursing (which is not so much a theory as a certain kind of activity). © 2017 John Wiley & Sons Ltd.

  12. Impedance-Source Networks for Electric Power Conversion Part II

    DEFF Research Database (Denmark)

    Siwakoti, Yam P.; Peng, Fang Zheng; Blaabjerg, Frede

    2015-01-01

    Impedance-source networks cover the entire spectrum of electric power conversion applications (dc-dc, dc-ac, ac-dc, ac-ac) controlled and modulated by different modulation strategies to generate the desired dc or ac voltage and current at the output. A comprehensive review of various impedance-source-network-based power converters has been covered in a previous paper, where the main topologies were discussed from an application point of view. Now Part II provides a comprehensive review of the most popular control and modulation strategies for impedance-source-network-based power converters/inverters. These methods ...

  13. Evaluation of uncertainty of adaptive radiation therapy

    International Nuclear Information System (INIS)

    Garcia Molla, R.; Gomez Martin, C.; Vidueira, L.; Juan-Senabre, X.; Garcia Gomez, R.

    2013-01-01

    This work is part of the tests to be performed for acceptance into clinical practice. The uncertainties of adaptive radiotherapy, into which the study is divided, fall into two large parts: dosimetry in the CBCT, and RDI. At each stage the uncertainties are quantified; from the total, an action level can be obtained above which it would be reasonable to adapt the plan. (Author)

  14. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty has been minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact of sampling the epistemic uncertainties. Obviously, this process may incur high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples, each with a much smaller number of Monte Carlo histories. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA, in combination with the Monte Carlo transport code KENO-Va, to various critical assemblies and a full scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a reduction of computing time by factors on the order of 100. (authors)
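    The two-series idea can be sketched in a toy setting: if the same epistemic samples are run twice with independent Monte Carlo seeds, the aleatoric noise is uncorrelated between the two series, so the covariance of the paired outputs isolates the epistemic part. This is only an illustrative reading of the approach, not the XSUSA/KENO-Va implementation; `transport_model` below is an invented stand-in with artificial statistical noise.

```python
import numpy as np

rng = np.random.default_rng(0)

def transport_model(sigma, n_histories, rng):
    """Stand-in for a Monte Carlo transport run: the 'true' response is
    sigma**2, observed with aleatoric noise that shrinks as histories grow."""
    return sigma**2 + rng.normal(0.0, 1.0 / np.sqrt(n_histories))

n_samples, n_histories = 200, 100
sigma_samples = rng.normal(1.0, 0.05, n_samples)  # epistemic input uncertainty

# Two series: identical epistemic samples, independent aleatoric noise.
y1 = np.array([transport_model(s, n_histories, rng) for s in sigma_samples])
y2 = np.array([transport_model(s, n_histories, rng) for s in sigma_samples])

# The covariance of the paired outputs estimates the epistemic variance,
# because the aleatoric noise is independent between the two series.
var_epistemic = np.cov(y1, y2)[0, 1]
var_total = 0.5 * (y1.var(ddof=1) + y2.var(ddof=1))
print(f"epistemic std ~ {np.sqrt(max(var_epistemic, 0.0)):.3f}")
print(f"total std     ~ {np.sqrt(var_total):.3f}")
```

    With few histories per run, the total spread overstates the epistemic spread; the paired-series covariance recovers it without running more histories.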

  15. Quantification of uncertainties of modeling and simulation

    International Nuclear Information System (INIS)

    Ma Zhibo; Yin Jianwei

    2012-01-01

    The principles of Modeling and Simulation (M and S) are interpreted through a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts considered to vary along with the conceptual models' parameters. Following the idea of verification and validation, the parameter space is partitioned into verified and applied domains; uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolation method. Examples are presented to demonstrate and qualify these ideas, which aim to build a framework for quantifying the uncertainties of M and S. (authors)

  16. Decay heat uncertainty quantification of MYRRHA

    OpenAIRE

    Fiorito Luca; Buss Oliver; Hoefer Axel; Stankovskiy Alexey; Eynde Gert Van den

    2017-01-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay hea...

  17. Pragmatic aspects of uncertainty propagation: A conceptual review

    KAUST Repository

    Thacker, W.Carlisle; Iskandarani, Mohamad; Gonç alves, Rafael C.; Srinivasan, Ashwanth; Knio, Omar

    2015-01-01

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.
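    The surrogate idea can be illustrated minimally (a sketch, not the paper's oceanographic setup): a handful of 'expensive' runs are interpolated by a polynomial, and the interpolant is then sampled cheaply to estimate the variability of the response. The model and input distribution below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(x):
    # Stand-in for a costly simulation's scalar response to input x.
    return np.sin(3 * x) + 0.5 * x

# A handful of 'expensive' simulations at chosen design points.
design = np.linspace(-1.0, 1.0, 7)
responses = expensive_model(design)

# Cheap surrogate: a polynomial interpolant through the few runs.
surrogate = np.poly1d(np.polyfit(design, responses, deg=6))

# Propagate input uncertainty through the surrogate, not the model:
# 100 000 surrogate evaluations cost far less than 7 more model runs.
x_uncertain = rng.normal(0.0, 0.3, 100_000)
y = surrogate(x_uncertain)
print(f"response mean ~ {y.mean():.3f}, std ~ {y.std():.3f}")
```

    A Gaussian process interpolant could be swapped in for `np.polyfit`; the design-point placement then interacts with the assumed input distribution, which is the paper's point (i)/(ii) distinction.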

  18. Pragmatic aspects of uncertainty propagation: A conceptual review

    KAUST Repository

    Thacker, W.Carlisle

    2015-09-11

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.

  19. Uncertainty and sensitivity analysis in performance assessment for the proposed high-level radioactive waste repository at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Helton, Jon C.; Hansen, Clifford W.; Sallaberry, Cédric J.

    2012-01-01

    Extensive work has been carried out by the U.S. Department of Energy (DOE) in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. As part of this development, a detailed performance assessment (PA) for the YM repository was completed in 2008 and supported a license application by the DOE to the U.S. Nuclear Regulatory Commission (NRC) for the construction of the YM repository. The following aspects of the 2008 YM PA are described in this presentation: (i) conceptual structure and computational organization, (ii) uncertainty and sensitivity analysis techniques in use, (iii) uncertainty and sensitivity analysis for physical processes, and (iv) uncertainty and sensitivity analysis for expected dose to the reasonably maximally exposed individual (RMEI) specified in the NRC's regulations for the YM repository. - Highlights: ► An overview of performance assessment for the proposed Yucca Mountain radioactive waste repository is presented. ► Conceptual structure and computational organization are described. ► Uncertainty and sensitivity analysis techniques are described. ► Uncertainty and sensitivity analysis results for physical processes are presented. ► Uncertainty and sensitivity analysis results for expected dose are presented.

  20. Effect of activation cross-section uncertainties in selecting steels for the HYLIFE-II chamber to successful waste management

    International Nuclear Information System (INIS)

    Sanz, J.; Cabellos, O.; Reyes, S.

    2005-01-01

    We perform the waste management assessment of the different types of steels proposed as structural material for the inertial fusion energy (IFE) HYLIFE-II concept. Both recycling options, hands-on (HoR) and remote (RR), are unacceptable. Regarding shallow land burial (SLB), 304SS has a very good performance, and both Cr-W ferritic steels (FS) and oxide-dispersion-strengthened (ODS) FS are very likely to be acceptable. The only two impurity elements that question the possibility of obtaining reduced-activation (RA) steels for SLB are niobium and molybdenum. The effect of activation cross-section uncertainties on SLB assessments is shown to be important. The necessary improvement of some tungsten and niobium cross-sections is justified.

  1. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  2. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  3. The "Pseudocommando" mass murderer: part II, the language of revenge.

    Science.gov (United States)

    Knoll, James L

    2010-01-01

    In Part I of this article, research on pseudocommandos was reviewed, and the important role that revenge fantasies play in motivating such persons to commit mass murder-suicide was discussed. Before carrying out their mass shootings, pseudocommandos may communicate some final message to the public or news media. These communications are rich sources of data about their motives and psychopathology. In Part II of this article, forensic psycholinguistic analysis is applied to clarify the primary motivations, detect the presence of mental illness, and discern important individual differences in the final communications of two recent pseudocommandos: Seung-Hui Cho (Virginia Tech) and Jiverly Wong (Binghamton, NY). Although both men committed offenses that qualify them as pseudocommandos, their final communications reveal striking differences in their psychopathology.

  4. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    Energy Technology Data Exchange (ETDEWEB)

    Díez, C.J., E-mail: cj.diez@upm.es [Dpto. de Ingeníera Nuclear, Universidad Politécnica de Madrid, 28006 Madrid (Spain); Cabellos, O. [Dpto. de Ingeníera Nuclear, Universidad Politécnica de Madrid, 28006 Madrid (Spain); Instituto de Fusión Nuclear, Universidad Politécnica de Madrid, 28006 Madrid (Spain); Martínez, J.S. [Dpto. de Ingeníera Nuclear, Universidad Politécnica de Madrid, 28006 Madrid (Spain)

    2015-01-15

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation problems in burn-up calculations. One proposed approach is the Hybrid Method, where uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems, such as EFIT (an ADS-like reactor) and ESFR (a sodium fast reactor), to assess uncertainties in the isotopic composition. However, a comparison with multi-group energy structures was not carried out, and it has to be performed in order to analyse the limitations of using one-group uncertainties.

  5. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    International Nuclear Information System (INIS)

    Díez, C.J.; Cabellos, O.; Martínez, J.S.

    2015-01-01

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation problems in burn-up calculations. One proposed approach is the Hybrid Method, where uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems, such as EFIT (an ADS-like reactor) and ESFR (a sodium fast reactor), to assess uncertainties in the isotopic composition. However, a comparison with multi-group energy structures was not carried out, and it has to be performed in order to analyse the limitations of using one-group uncertainties.

  6. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    Science.gov (United States)

    Díez, C. J.; Cabellos, O.; Martínez, J. S.

    2015-01-01

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation problems in burn-up calculations. One proposed approach is the Hybrid Method, where uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems, such as EFIT (an ADS-like reactor) and ESFR (a sodium fast reactor), to assess uncertainties in the isotopic composition. However, a comparison with multi-group energy structures was not carried out, and it has to be performed in order to analyse the limitations of using one-group uncertainties.
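    The one-group sampling pattern underlying such studies can be illustrated with a deliberately simple depletion model: sample a one-group cross section from its uncertainty and push each sample through the exponential depletion law. All values below are hypothetical and chosen for illustration; this is not the Hybrid Method implementation itself.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical one-group capture cross section (barns) with 5% relative
# uncertainty, and a fixed constant flux; values are illustrative only.
sigma_mean, sigma_rel_unc = 50.0, 0.05
flux = 1e14          # n/cm^2/s
t = 360 * 24 * 3600  # about one year of irradiation, in seconds
barn = 1e-24         # cm^2

n_samples = 10_000
sigma = rng.normal(sigma_mean, sigma_mean * sigma_rel_unc, n_samples)

# Depletion of the initial nuclide under constant flux:
# N(t)/N0 = exp(-sigma * flux * t)
n_ratio = np.exp(-sigma * barn * flux * t)

print(f"remaining fraction: {n_ratio.mean():.4f} +/- {n_ratio.std():.4f}")
```

    A multi-group treatment would instead sample the full group-wise cross sections with their covariance before collapsing, which is exactly the comparison the abstract says is still missing.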

  7. CE and nanomaterials - Part II: Nanomaterials in CE.

    Science.gov (United States)

    Adam, Vojtech; Vaculovicova, Marketa

    2017-10-01

    The scope of this two-part review is to summarize publications dealing with CE and nanomaterials together. This topic can be viewed from two broad perspectives, and this article tries to highlight both approaches: (i) CE of nanomaterials, and (ii) nanomaterials in CE. The second part aims at summarizing publications dealing with the application of nanomaterials for enhancement of CE performance, either by increasing the separation resolution or by improving the detection. To increase the resolution, nanomaterials are employed either as a surface modification of the capillary wall, forming an open tubular column, or as additives to the separation electrolyte, resulting in a pseudostationary phase. Moreover, nanomaterials have proven very beneficial also for increasing the sensitivity of detection employed in CE, or they even enable detection (e.g., as fluorescent tags for nonfluorescent molecules). © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Nuclear data sensitivity and uncertainty for the Canadian supercritical water-cooled reactor II: Full core analysis

    International Nuclear Information System (INIS)

    Langton, S.E.; Buijs, A.; Pencer, J.

    2015-01-01

    Highlights: • H-2, Pu-239, and Th-232 make large contributions to SCWR modelling sensitivity. • H-2, Pu-239, and Th-232 make large contributions to SCWR modelling uncertainty. • Isotopes of Zr make large contributions to SCWR modelling uncertainty. - Abstract: Uncertainties in nuclear data are a fundamental source of uncertainty in reactor physics calculations. To determine their contribution to uncertainties in calculated reactor physics parameters, a nuclear data sensitivity and uncertainty study is performed on the Canadian supercritical water reactor (SCWR) concept. The nuclear data uncertainty contributions to the neutron multiplication factor k_eff are 6.31 mk for the SCWR at the beginning of cycle (BOC) and 6.99 mk at the end of cycle (EOC); both of these uncertainties have a statistical uncertainty of 0.02 mk. The nuclear data uncertainty contributions to coolant void reactivity (CVR) are 1.0 mk and 0.9 mk for BOC and EOC, respectively, both with statistical uncertainties of 0.1 mk. The nuclear data uncertainty contributions to other reactivity parameters range from as low as 3% of the reactivity coefficient values to as high as ten times those values. The largest contributors to the uncertainties in the reactor physics parameters are Pu-239, Th-232, H-2, and isotopes of zirconium.

  9. 40 CFR Appendix III to Part 266 - Tier II Emission Rate Screening Limits for Free Chlorine and Hydrogen Chloride

    Science.gov (United States)

    2010-07-01

    40 CFR, Protection of Environment, Appendix III to Part 266—Tier II Emission Rate Screening Limits for Free Chlorine and Hydrogen Chloride. Terrain...

  10. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine whether a standard is adequate for its intended use, or calculate the total uncertainty of the measurement process. Purchased standards usually carry estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined, or is determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. The calculations are simplified by dividing them into subgroups of absolute and relative uncertainties. These methods also incorporate the International Organization for Standardization (ISO) concepts for combining systematic and random uncertainties as published in its Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper
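    The absolute/relative subgrouping can be shown on a hypothetical standard preparation (the quantities and values below are invented for illustration): absolute standard uncertainties combine in quadrature across additive steps, relative ones across multiplicative steps, with no partial derivatives needed.

```python
import numpy as np

# Shortcut variance propagation without partial derivatives:
# for sums/differences combine ABSOLUTE standard uncertainties in
# quadrature; for products/quotients combine RELATIVE ones.
# Hypothetical standard preparation: c = (m - m_blank) / V.

m, u_m = 100.00, 0.05        # mass of reference material, mg
m_blank, u_blank = 0.20, 0.02  # blank correction, mg
V, u_V = 1.0000, 0.0008      # volume, L

# Additive step: net mass (absolute uncertainties in quadrature).
m_net = m - m_blank
u_m_net = np.sqrt(u_m**2 + u_blank**2)

# Multiplicative step: concentration (relative uncertainties in quadrature).
c = m_net / V
u_c_rel = np.sqrt((u_m_net / m_net)**2 + (u_V / V)**2)

print(f"c = {c:.3f} mg/L, relative standard uncertainty {100 * u_c_rel:.3f}%")
```

    The same two rules can be chained through any sequence of additive and multiplicative steps, which is why a spreadsheet suffices.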

  11. LISA package user guide. Part II: LISA (Long Term Isolation Safety Assessment) program description and user guide

    International Nuclear Information System (INIS)

    Prado, P.; Saltelli, A.; Homma, T.

    1992-01-01

    This manual is subdivided into three parts. This second part describes the LISA (Long Term Isolation Safety Assessment) code and its submodels. LISA is a tool for analysing the safety of underground disposal of nuclear waste. It can handle nuclide chains of arbitrary length and evaluate the migration of nuclides through a geosphere medium composed of an arbitrary number of segments. LISA makes use of a Monte Carlo methodology to evaluate the uncertainty in the quantity being assessed (e.g. dose) arising from the uncertainty in the model input parameters. In the present version, LISA is equipped with a very simple source-term submodel, a relatively complex geosphere and a simplified biosphere. The code is closely associated with its statistical pre-processor code (PREP), which generates the input Monte Carlo sample from the assigned parameter probability density functions, and with its post-processor code (SPOP), which provides useful statistics on the output sample (uncertainty and sensitivity analysis). This report describes the general structure of LISA, its subroutines and submodels, and the code input and output files. It is intended to provide the user with enough information to understand and run the code, as well as the capacity to incorporate different submodels. 15 refs., 6 figs
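    The PREP/SPOP division of labour can be mimicked in a few lines: draw input samples from assigned PDFs, push them through a dose model, then compute output statistics and a rank-based sensitivity measure. The distributions and the dose model below are illustrative assumptions, not LISA's actual submodels.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5_000

# PREP-like step: sample the uncertain inputs from assigned PDFs
# (distributions are purely illustrative).
retardation = rng.lognormal(mean=2.0, sigma=0.5, size=n)
velocity = rng.uniform(0.1, 1.0, n)
inventory = rng.normal(1.0, 0.1, n)

# Invented 'geosphere + biosphere' response: dose falls with
# retardation, rises with water velocity and inventory.
dose = inventory * velocity / retardation

# SPOP-like step: output statistics for the uncertainty analysis ...
print(f"mean dose {dose.mean():.3f}, 95th pct {np.quantile(dose, 0.95):.3f}")

# ... and a rank correlation per input for the sensitivity analysis.
def spearman(x, y):
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

for name, x in [("retardation", retardation),
                ("velocity", velocity),
                ("inventory", inventory)]:
    print(f"{name:12s} rank correlation {spearman(x, dose):+.2f}")
```

    The rank correlations identify which input PDFs dominate the spread of the dose, which is the kind of output a SPOP-style post-processor reports.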

  12. Communicating uncertainty media coverage of new and controversial science

    CERN Document Server

    Dunwoody, Sharon; Rogers, Carol L

    1997-01-01

    This work, by the editors of "Scientists and Journalists: Reporting Science as News", explores scientific uncertainty and media coverage of it in such major public issues as AIDS, biotechnology, dioxin, global warming, and nature vs. nurture. It examines the interrelations of the major actors in constructing and explaining uncertainty: scientists, journalists, scholars, and the larger public. Part 1 examines participants in the scientific uncertainty arena and how the major actors react to, cope with and manage uncertain issues. It also describes how scientists and journalists vie for control over uncertain science. The panel discussion at the end of this section is a spirited discourse on how they handle scientific uncertainty. Part 2 explores instances of scientific uncertainty in the public arena, highlighting studies involving uncertainty and biotechnology, dioxin, human resources for science, and human behaviour. The panel discussion concluding this section reacts to several of these specific issues and ...

  13. The year 2013 in the European Heart Journal--Cardiovascular Imaging: Part II.

    Science.gov (United States)

    Plein, Sven; Edvardsen, Thor; Pierard, Luc A; Saraste, Antti; Knuuti, Juhani; Maurer, Gerald; Lancellotti, Patrizio

    2014-08-01

    The new multi-modality cardiovascular imaging journal, European Heart Journal - Cardiovascular Imaging, was created in 2012. Here we summarize the most important studies from the journal's second year in two articles. Part I of the review has summarized studies in myocardial function, myocardial ischaemia, and emerging techniques in cardiovascular imaging. Part II is focussed on valvular heart diseases, heart failure, cardiomyopathies, and congenital heart diseases. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2014. For permissions please email: journals.permissions@oup.com.

  14. Subseabed disposal program annual report, January-December 1979. Volume II. Appendices (principal investigator progress reports). Part 1 of 2

    International Nuclear Information System (INIS)

    Talbert, D.M.

    1981-04-01

    Volume II of the sixth annual report describing the progress and evaluating the status of the Subseabed Disposal Program contains the appendices referred to in Volume I, Summary and Status. Because of the length of Volume II, it has been split into two parts for publication purposes. Part 1 contains Appendices A-O; Part 2 contains Appendices P-FF. Separate abstracts have been prepared of each Appendix for inclusion in the Energy Data Base

  15. Influence of resonance parameters' correlations on the resonance integral uncertainty; 55Mn case

    International Nuclear Information System (INIS)

    Zerovnik, Gasper; Trkov, Andrej; Capote, Roberto; Rochman, Dimitri

    2011-01-01

    For nuclides with a large number of resonances, the covariance matrix of resonance parameters can become very large and expensive to process in terms of computation time. By converting the covariance matrix of resonance parameters into covariance matrices of background cross-sections in a more or less coarse group structure, a considerable amount of computer time and memory can be saved. The question is how important the information discarded in the process is. First, the uncertainty of the 55Mn resonance integral was estimated in the narrow-resonance approximation for different levels of self-shielding using the Bondarenko method, by random sampling of resonance parameters according to their covariance matrices from two different 55Mn evaluations: one from the Nuclear Research and Consultancy Group (NRG), with large uncertainties but no correlations between resonances; the other from Oak Ridge National Laboratory, with smaller uncertainties but a full covariance matrix. We found that if all (or at least a significant part of) the resonance parameters are correlated, the resonance integral uncertainty depends strongly on the level of self-shielding. Second, it was shown that the commonly used 640-group SAND-II representation cannot describe the increase of the resonance integral uncertainty. A much finer energy mesh for the background covariance matrix would have to be used to take the resonance structure into account explicitly, but then the objective of a more compact data representation is lost.
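    The qualitative effect of parameter correlations can be demonstrated with a toy resonance integral that is simply a sum of resonance contributions (all values invented): uncorrelated 10% uncertainties partially average out across resonances, while fully correlated ones add linearly and do not.

```python
import numpy as np

rng = np.random.default_rng(4)

# Ten hypothetical resonances, each contributing equally to the
# (infinite-dilution) resonance integral; 10% relative uncertainty each.
contrib = np.full(10, 1.0)
rel_unc = 0.10
n = 50_000

# Case 1: uncorrelated parameters -> uncertainties add in quadrature.
uncorr = rng.normal(1.0, rel_unc, (n, 10)) * contrib
# Case 2: fully correlated parameters -> one common factor for all.
corr = rng.normal(1.0, rel_unc, (n, 1)) * contrib

ri_uncorr = uncorr.sum(axis=1)
ri_corr = corr.sum(axis=1)
print(f"uncorrelated: {100 * ri_uncorr.std() / ri_uncorr.mean():.1f}%")
print(f"correlated:   {100 * ri_corr.std() / ri_corr.mean():.1f}%")
```

    The uncorrelated case shrinks roughly by 1/sqrt(10); with self-shielding the weighting of each resonance changes too, which is why the correlated case reacts to the shielding level while the uncorrelated one barely does.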

  16. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Elert, M. [Kemakta Konsult AB, Stockholm (Sweden)] [ed.

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study, using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions in many cases are very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come into focus in this study: There are large differences in the predicted soil hydrology and, as a consequence, also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration.
    The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root ...

  17. Do oil shocks predict economic policy uncertainty?

    Science.gov (United States)

    Rehman, Mobeen Ur

    2018-05-01

    Oil price fluctuations have an influential role in global economic policies for developed as well as emerging countries. I investigate the role of international oil prices disintegrated into structural (i) oil supply shocks, (ii) aggregate demand shocks and (iii) oil market specific demand shocks, based on the work of Kilian (2009), using a structural VAR framework on the economic policy uncertainty of the sampled markets. Economic policy uncertainty, due to its non-linear behavior, is modeled in a regime-switching framework with the disintegrated structural oil shocks. The results highlight that Indian, Spanish and Japanese economic policy uncertainty responds to global oil price shocks, whereas aggregate demand shocks fail to induce any change. Oil market specific demand shocks are significant only for China and India in the high-volatility state.

  18. Suprascapular nerve block: important procedure in clinical practice. Part II

    Directory of Open Access Journals (Sweden)

    Marcos Rassi Fernandes

    2012-08-01

    Full Text Available The suprascapular nerve block is a reproducible, reliable, and extremely effective treatment method in shoulder pain control. This method has been widely used by professionals in clinical practice, such as rheumatologists, orthopedists, neurologists, and pain specialists, in the treatment of chronic diseases such as irreparable rotator cuff injury, rheumatoid arthritis, stroke sequelae, and adhesive capsulitis, which justifies the present review (Part II). The objective of this study was to describe the techniques and complications of the procedure described in the literature, as the first part reported the clinical indications, drugs, and volumes used in single or multiple procedures. We present in detail the accesses used in the procedure: direct and indirect, anterior and posterior, lateral and medial, upper and lower. There are several options to perform the suprascapular nerve block. Although rare, complications can occur. When properly indicated, this method should be considered.

  19. Epistemic uncertainties and natural hazard risk assessment - Part 1: A review of the issues

    Science.gov (United States)

    Beven, K. J.; Aspinall, W. P.; Bates, P. D.; Borgomeo, E.; Goda, K.; Hall, J. W.; Page, T.; Phillips, J. C.; Rougier, J. T.; Simpson, M.; Stephenson, D. B.; Smith, P. J.; Wagener, T.; Watson, M.

    2015-12-01

    Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.

  20. Reliability of a new biokinetic model of zirconium in internal dosimetry: part II, parameter sensitivity analysis.

    Science.gov (United States)

    Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph

    2011-12-01

    The reliability of biokinetic models is essential for the assessment of internal doses and a radiation risk analysis for the public and occupational workers exposed to radionuclides. In the present study, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. In the first part of the paper, the parameter uncertainty was analyzed for two biokinetic models of zirconium (Zr): one reported by the International Commission on Radiological Protection (ICRP), and one developed at the Helmholtz Zentrum München - German Research Center for Environmental Health (HMGU). In the second part of the paper, the parameter uncertainties and distributions of the Zr biokinetic models evaluated in Part I are used as the model inputs for identifying the most influential parameters in the models. Furthermore, the model parameter with the most influence on the integral of the radioactivity of Zr over 50 y in source organs after ingestion was identified. The results of the systemic HMGU Zr model showed that over the first 10 d, the transfer rates between blood and other soft tissues have the largest influence on the content of Zr in the blood and the daily urinary excretion; however, after day 1,000, the transfer rate from bone to blood becomes dominant. For the retention in bone, the transfer rate from blood to bone surfaces has the most influence out to the endpoint of the simulation; the transfer rate from blood to the upper large intestine contributes substantially at later times, i.e., after day 300. The alimentary tract absorption factor (fA) mostly influences the integral of radioactivity of Zr in most source organs after ingestion.
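The one-at-a-time parameter sensitivity described above can be illustrated on a toy compartment model. The sketch below is a hypothetical two-compartment (blood/bone) system, not the actual ICRP or HMGU zirconium model; the transfer rates k12 and k21 and the urinary loss rate ku are invented for illustration.

```python
from scipy.integrate import solve_ivp

def simulate(k12, k21, ku, t_end=100.0):
    """Bone content at t_end for a toy blood/bone compartment model.

    k12: blood -> bone transfer rate, k21: bone -> blood,
    ku: urinary loss from blood (all per day, illustrative values).
    """
    def rhs(t, y):
        blood, bone = y
        return [-(k12 + ku) * blood + k21 * bone,
                k12 * blood - k21 * bone]
    sol = solve_ivp(rhs, (0.0, t_end), [1.0, 0.0])
    return sol.y[1, -1]

# One-at-a-time sensitivity: perturb each rate by +10% and report the
# relative change in the model output.
base = {"k12": 0.05, "k21": 0.01, "ku": 0.02}
ref = simulate(**base)
for name in base:
    pert = dict(base)
    pert[name] *= 1.1
    print(f"{name}: {(simulate(**pert) - ref) / ref:+.3f}")
```

The same loop generalizes to any scalar model output (e.g. an activity integral over 50 y) by changing what `simulate` returns.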

  1. Alignment of the NPL Mark II watt balance

    International Nuclear Information System (INIS)

    Robinson, I A

    2012-01-01

    To reach uncertainties in the region of 1 part in 10^8, a moving-coil watt balance not only requires the accurate measurement of voltage, resistance, velocity, mass and the acceleration due to gravity but, in addition, requires the apparatus to be adjusted correctly to minimize the second-order effects which can reduce the accuracy of the measurement. This paper collects together the alignment and correction techniques that have been developed at NPL over many years and are required to minimize the uncertainty of the measurement. Some of these techniques are applicable to all watt balances, whilst a few are specific to watt balances that employ a conventional beam balance to support a circular coil in a radial magnetic field, such as the NPL Mark II watt balance, now known as the NRC watt balance. (paper)

  2. Three Mile Island: a report to the commissioners and to the public. Volume II, Part 3

    International Nuclear Information System (INIS)

    1979-01-01

    This is the third and final part of the second volume of a study of the Three Mile Island accident. Part 3 of Volume II contains descriptions and assessments of responses to the accident by the utility and by the NRC and other government agencies

  3. Uncertainty estimation of Intensity-Duration-Frequency relationships: A regional analysis

    Science.gov (United States)

    Mélèse, Victor; Blanchet, Juliette; Molinié, Gilles

    2018-03-01

    We propose in this article a regional study of uncertainties in IDF curves derived from point-rainfall maxima. We develop two generalized extreme value models based on the simple scaling assumption, first in the frequentist framework and second in the Bayesian framework. Within the frequentist framework, uncertainties are obtained i) from the Gaussian density stemming from the asymptotic normality theorem of the maximum likelihood and ii) with a bootstrap procedure. Within the Bayesian framework, uncertainties are obtained from the posterior densities. We confront these two frameworks on the same database covering a large region of 100,000 km² in southern France with contrasting rainfall regimes, in order to be able to draw conclusions that are not specific to the data. The two frameworks are applied to 405 hourly stations with data back to the 1980s, accumulated in the range 3 h to 120 h. We show that i) the Bayesian framework is more robust than the frequentist one to the starting point of the estimation procedure, ii) the posterior and the bootstrap densities are able to better adjust uncertainty estimation to the data than the Gaussian density, and iii) the bootstrap density gives unreasonable confidence intervals, in particular for return levels associated with large return periods. Therefore our recommendation goes towards the use of the Bayesian framework to compute uncertainty.
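The bootstrap branch of point ii) can be sketched in a few lines. Everything below is illustrative: synthetic annual maxima drawn from scipy's genextreme stand in for station rainfall maxima, and the simple-scaling structure linking durations in the paper's models is omitted.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Synthetic "annual maxima" (GEV-distributed), standing in for one station.
data = genextreme.rvs(c=-0.1, loc=30.0, scale=10.0, size=60, random_state=rng)

def return_level(sample, T=100):
    """T-year return level from a frequentist (MLE) GEV fit."""
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

# Nonparametric bootstrap: refit on resampled data to get an uncertainty
# distribution for the return level.
levels = [return_level(rng.choice(data, size=data.size, replace=True))
          for _ in range(300)]
lo, hi = np.percentile(levels, [2.5, 97.5])
print(f"100-year level {return_level(data):.1f}, 95% CI [{lo:.1f}, {hi:.1f}]")
```

The paper's observation that bootstrap intervals become unreasonable for large return periods shows up here as a strong sensitivity of the upper percentile to the resampled shape parameter.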

  4. Scientific uncertainty in media content: Introduction to this special issue.

    Science.gov (United States)

    Peters, Hans Peter; Dunwoody, Sharon

    2016-11-01

    This introduction sets the stage for the special issue on the public communication of scientific uncertainty that follows by sketching the wider landscape of issues related to the communication of uncertainty and showing how the individual contributions fit into that landscape. The first part of the introduction discusses the creation of media content as a process involving journalists, scientific sources, stakeholders, and the responsive audience. The second part then provides an overview of the perception of scientific uncertainty presented by the media and the consequences for the recipients' own assessments of uncertainty. Finally, we briefly describe the six research articles included in this special issue. © The Author(s) 2016.

  5. Numerical Simulation of Projectile Impact on Mild Steel Armour Plates using LS-DYNA, Part II: Parametric Studies

    OpenAIRE

    M. Raguraman; A. Deb; N. K. Gupta; D. K. Kharat

    2008-01-01

    In Part I of the current two-part series, a comprehensive simulation-based study of impact of jacketed projectiles on mild steel armour plates has been presented. Using the modelling procedures developed in Part I, a number of parametric studies have been carried out for the same mild steel plates considered in Part I and reported here in Part II. The current investigation includes determination of ballistic limits of a given target plate for different projectile diameters and impact velociti...

  6. PIC Simulations in Low Energy Part of PIP-II Proton Linac

    Energy Technology Data Exchange (ETDEWEB)

    Romanov, Gennady

    2014-07-01

    The front end of the PIP-II linac is composed of a 30 keV ion source, a low energy beam transport line (LEBT), a 2.1 MeV radio frequency quadrupole (RFQ), and a medium energy beam transport line (MEBT). This configuration is currently being assembled at Fermilab to support a complete systems test. The front end represents the primary technical risk for PIP-II, and so this step will validate the concept and demonstrate that the hardware can meet the specified requirements. The superconducting accelerating cavities right after the MEBT require a high-quality, well-defined beam after the RFQ to avoid excessive particle losses. In this paper we present recent progress in beam dynamics studies, using the CST PIC simulation code, to investigate the partial neutralization effect in the LEBT, halo and tail formation in the RFQ, and total emittance growth and beam losses along the low energy part of the linac.

  7. Compilation of information on uncertainties involved in deposition modeling

    International Nuclear Information System (INIS)

    Lewellen, W.S.; Varma, A.K.; Sheng, Y.P.

    1985-04-01

    The current generation of dispersion models contains very simple parameterizations of deposition processes. The analysis here looks at the physical mechanisms governing these processes in an attempt to see whether more valid parameterizations are available and what level of uncertainty is involved in either the simple parameterizations or any more advanced parameterization. The report is composed of three parts. The first, on dry deposition model sensitivity, provides an estimate of the uncertainty existing in current estimates of the deposition velocity due to uncertainties in independent variables such as meteorological stability, particle size, surface chemical reactivity and canopy structure. The range of uncertainty estimated for an appropriate dry deposition velocity for a plume generated by a nuclear power plant accident is three orders of magnitude. The second part discusses the uncertainties involved in precipitation scavenging rates for effluents resulting from a nuclear reactor accident. The conclusion is that major uncertainties are involved, both as a result of the natural variability of the atmospheric precipitation process and due to our incomplete understanding of the underlying processes. The third part involves a review of the important problems associated with modeling the interaction between the atmosphere and a forest. It gives an indication of the magnitude of the problem involved in modeling dry deposition in such environments. Separate analyses have been done for each section and are contained in the EDB.
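A three-order-of-magnitude spread in deposition velocity is often propagated by Monte Carlo sampling on a log scale. The bounds and air concentration below are illustrative values, not numbers from the report.

```python
import numpy as np

rng = np.random.default_rng(1)

# Log-uniform sampling across a ~3 orders of magnitude range of dry
# deposition velocity (m/s); bounds are illustrative.
v_low, v_high = 1e-4, 1e-1
v_d = 10.0 ** rng.uniform(np.log10(v_low), np.log10(v_high), size=10_000)

air_conc = 1.0            # ground-level air concentration (arbitrary units)
flux = v_d * air_conc     # dry deposition flux sample
print(np.percentile(flux, [5, 50, 95]))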

  8. Uncertainty in prediction and in inference

    International Nuclear Information System (INIS)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which essentially is the absolute value of the matrix element between the states. The importance of this result to the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support
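The statistical distance between two discrete probability distributions has a standard closed form, the arccosine of the Bhattacharyya overlap; a minimal sketch, noting that the authors' exact normalization may differ:

```python
import numpy as np

def statistical_distance(p, q):
    """Angle form of the statistical distance between two discrete
    probability distributions: arccos of the Bhattacharyya coefficient."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    overlap = np.sum(np.sqrt(p * q))   # 1 if identical, 0 if disjoint
    return np.arccos(np.clip(overlap, 0.0, 1.0))

print(statistical_distance([0.5, 0.5], [0.5, 0.5]))  # 0.0: indistinguishable
print(statistical_distance([0.5, 0.5], [0.9, 0.1]))
print(statistical_distance([1.0, 0.0], [0.0, 1.0]))  # pi/2: fully distinguishable
```

For pure quantum states the overlap term is replaced by the absolute value of the matrix element between the states, which is the connection to distinguishability noted in the abstract.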

  9. A thermoelectric power generating heat exchanger: Part II – Numerical modeling and optimization

    DEFF Research Database (Denmark)

    Sarhadi, Ali; Bjørk, Rasmus; Lindeburg, N.

    2016-01-01

    In Part I of this study, the performance of an experimental integrated thermoelectric generator (TEG)-heat exchanger was presented. In the current study, Part II, the obtained experimental results are compared with those predicted by a finite element (FE) model. In the simulation of the integrated TEG-heat exchanger, the thermal contact resistance between the TEG and the heat exchanger is modeled assuming either an ideal thermal contact or using a combined Cooper–Mikic–Yovanovich (CMY) and parallel plate gap formulation, which takes into account the contact pressure, roughness and hardness...
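The CMY contact conductance correlation mentioned above is commonly written as h_c = 1.25 k_s (m/sigma) (P/H_c)^0.95. The sketch below uses this common form with purely illustrative inputs; the paper's exact variant and parameter values may differ.

```python
def cmy_conductance(k_s, m, sigma, p, hc):
    """Cooper-Mikic-Yovanovich contact conductance, one common form.

    k_s: harmonic-mean conductivity of the two surfaces [W/(m K)]
    m: mean absolute asperity slope [-]
    sigma: rms surface roughness [m]
    p: contact pressure [Pa], hc: microhardness of the softer surface [Pa]
    """
    return 1.25 * k_s * (m / sigma) * (p / hc) ** 0.95

# Illustrative values: metallic surfaces at moderate contact pressure.
h = cmy_conductance(k_s=100.0, m=0.1, sigma=1e-6, p=1e6, hc=1e9)
print(f"h_c = {h:.3e} W/(m^2 K)")
```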

  10. The uncertainty in physical measurements an introduction to data analysis in the physics laboratory

    CERN Document Server

    Fornasini, Paolo

    2008-01-01

    All measurements of physical quantities are affected by uncertainty. Understanding the origin of uncertainty, evaluating its extent and suitably taking it into account in data analysis is essential for assessing the degree of accuracy of phenomenological relationships and physical laws in both scientific research and technological applications. The Uncertainty in Physical Measurements: An Introduction to Data Analysis in the Physics Laboratory presents an introduction to uncertainty and to some of the most common procedures of data analysis. This book will serve the reader well by filling the gap between tutorial textbooks and highly specialized monographs. The book is divided into three parts. The first part is a phenomenological introduction to measurement and uncertainty: properties of instruments, different causes and corresponding expressions of uncertainty, histograms and distributions, and unified expression of uncertainty. The second part contains an introduction to probability theory, random variable...

  11. Understanding Medicines: Conceptual Analysis of Nurses' Needs for Knowledge and Understanding of Pharmacology (Part I). Understanding Medicines: Extending Pharmacology Education for Dependent and Independent Prescribing (Part II).

    Science.gov (United States)

    Leathard, Helen L.

    2001-01-01

    Part I reviews what nurses need to know about the administration and prescription of medicines. Part II addresses drug classifications, actions and effects, and interactions. Also discussed are the challenges pharmacological issues pose for nursing education. (SK)

  12. Uncertainty in biodiversity science, policy and management: a conceptual overview

    Directory of Open Access Journals (Sweden)

    Yrjö Haila

    2014-10-01

    Full Text Available The protection of biodiversity is a complex societal, political and ultimately practical imperative of current global society. The imperative builds upon scientific knowledge on human dependence on the life-support systems of the Earth. This paper aims at introducing main types of uncertainty inherent in biodiversity science, policy and management, as an introduction to a companion paper summarizing practical experiences of scientists and scholars (Haila et al. 2014). Uncertainty is a cluster concept: the actual nature of uncertainty is inherently context-bound. We use semantic space as a conceptual device to identify key dimensions of uncertainty in the context of biodiversity protection; these relate to [i] data; [ii] proxies; [iii] concepts; [iv] policy and management; and [v] normative goals. Semantic space offers an analytic perspective for drawing critical distinctions between types of uncertainty, identifying fruitful resonances that help to cope with the uncertainties, and building up collaboration between different specialists to support mutual social learning.

  13. Bayesian inference for psychology. Part II: Example applications with JASP.

    Science.gov (United States)

    Wagenmakers, Eric-Jan; Love, Jonathon; Marsman, Maarten; Jamil, Tahira; Ly, Alexander; Verhagen, Josine; Selker, Ravi; Gronau, Quentin F; Dropmann, Damian; Boutin, Bruno; Meerhoff, Frans; Knight, Patrick; Raj, Akash; van Kesteren, Erik-Jan; van Doorn, Johnny; Šmíra, Martin; Epskamp, Sacha; Etz, Alexander; Matzke, Dora; de Jong, Tim; van den Bergh, Don; Sarafoglou, Alexandra; Steingroever, Helen; Derks, Koen; Rouder, Jeffrey N; Morey, Richard D

    2018-02-01

    Bayesian hypothesis testing presents an attractive alternative to p value hypothesis testing. Part I of this series outlined several advantages of Bayesian hypothesis testing, including the ability to quantify evidence and the ability to monitor and update this evidence as data come in, without the need to know the intention with which the data were collected. Despite these and other practical advantages, Bayesian hypothesis tests are still reported relatively rarely. An important impediment to the widespread adoption of Bayesian tests is arguably the lack of user-friendly software for the run-of-the-mill statistical problems that confront psychologists for the analysis of almost every experiment: the t-test, ANOVA, correlation, regression, and contingency tables. In Part II of this series we introduce JASP ( http://www.jasp-stats.org ), an open-source, cross-platform, user-friendly graphical software package that allows users to carry out Bayesian hypothesis tests for standard statistical problems. JASP is based in part on the Bayesian analyses implemented in Morey and Rouder's BayesFactor package for R. Armed with JASP, the practical advantages of Bayesian hypothesis testing are only a mouse click away.
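The default (JZS) Bayes factor for a one-sample t test underlying these tools can be computed directly from the t statistic, following Rouder et al. (2009). The sketch below is an independent re-implementation for illustration, not JASP or BayesFactor code, and assumes a Cauchy prior on effect size with scale r = 1.

```python
import numpy as np
from scipy.integrate import quad

def jzs_bf10(t, n):
    """Default JZS Bayes factor BF10 for a one-sample t test, following
    Rouder et al. (2009): Cauchy prior on effect size (scale r = 1)."""
    v = n - 1                                        # degrees of freedom
    null = (1.0 + t**2 / v) ** (-(v + 1) / 2.0)      # likelihood under H0
    def integrand(g):
        if 1.0 / (2.0 * g) > 700.0:                  # exp underflow guard
            return 0.0
        return ((1.0 + n * g) ** -0.5
                * (1.0 + t**2 / ((1.0 + n * g) * v)) ** (-(v + 1) / 2.0)
                * (2.0 * np.pi) ** -0.5 * g ** -1.5 * np.exp(-1.0 / (2.0 * g)))
    alt, _ = quad(integrand, 0.0, np.inf)            # marginal under H1
    return alt / null

print(jzs_bf10(t=2.5, n=30))   # > 1: evidence favours the alternative
print(jzs_bf10(t=0.2, n=30))   # < 1: evidence favours the null
```

Note that later BayesFactor/JASP releases default to a narrower Cauchy scale (sqrt(2)/2), so values from this sketch will differ slightly from JASP output.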

  14. Some target assay uncertainties for passive neutron coincidence counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Langner, D.G.; Menlove, H.O.; Miller, M.C.; Russo, P.A.

    1990-01-01

    This paper provides some target assay uncertainties for passive neutron coincidence counting of plutonium metal, oxide, mixed oxide, and scrap and waste. The target values are based in part on past user experience and in part on the estimated results from new coincidence counting techniques that are under development. The paper summarizes assay error sources and the new coincidence techniques, and recommends the technique that is likely to yield the lowest assay uncertainty for a given material type. These target assay uncertainties are intended to be useful for NDA instrument selection and assay variance propagation studies for both new and existing facilities. 14 refs., 3 tabs

  15. Solution weighting for the SAND-II Monte Carlo code

    International Nuclear Information System (INIS)

    Oster, C.A.; McElroy, W.N.; Simons, R.L.; Lippincott, E.P.; Odette, G.R.

    1976-01-01

    Modifications to the SAND-II Error Analysis Monte Carlo code to include solution weighting based on input data uncertainties have been made and are discussed together with background information on the SAND-II algorithm. The new procedure permits input data having smaller uncertainties to have a greater influence on the solution spectrum than do the data having larger uncertainties. The results of an in-depth study to find a practical procedure and the first results of its application to three important Interlaboratory LMFBR Reaction Rate (ILRR) program benchmark spectra (CFRMF, ΣΣ, and ²³⁵U fission) are discussed
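The weighting principle described here, that inputs with smaller uncertainties pull the solution harder, is the familiar inverse-variance weighting. The sketch below shows only that principle with invented numbers; SAND-II's actual iterative spectrum adjustment is considerably more involved.

```python
import numpy as np

def weighted_solution(values, sigmas):
    """Inverse-variance weighted combination: inputs with smaller
    uncertainties get a proportionally larger influence."""
    values = np.asarray(values, dtype=float)
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mean = np.sum(w * values) / np.sum(w)
    sigma = 1.0 / np.sqrt(np.sum(w))   # uncertainty of the combination
    return mean, sigma

m, s = weighted_solution([10.0, 12.0, 11.0], [0.5, 2.0, 1.0])
print(f"combined: {m:.3f} +/- {s:.3f}")
```

The combined value lands closest to the input with the smallest uncertainty, and the combined uncertainty is smaller than that of any single input.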

  16. IAEA CRP on HTGR Uncertainties in Modeling: Assessment of Phase I Lattice to Core Model Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I-2c and the use of the cross section data in Exercise II-1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best-estimate results obtained for Exercise I-2a (fresh single-fuel block), Exercise I-2b (depleted single-fuel block), and Exercise I-2c (super-cell), in addition to the first results of an investigation into the cross section generation effects for the super-cell problem. The two-dimensional deterministic code NEWT (New ESC-based Weighting Transport), included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package, was used for the cross section evaluation, and the results obtained were compared to those of the three-dimensional stochastic SCALE module KENO VI.
    The NEWT cross section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise

  17. Uncertainty estimates for theoretical atomic and molecular data

    International Nuclear Information System (INIS)

    Chung, H-K; Braams, B J; Bartschat, K; Császár, A G; Drake, G W F; Kirchner, T; Kokoouline, V; Tennyson, J

    2016-01-01

    Sources of uncertainty are reviewed for calculated atomic and molecular data that are important for plasma modeling: atomic and molecular structures and cross sections for electron-atom, electron-molecule, and heavy particle collisions. We concentrate on model uncertainties due to approximations to the fundamental many-body quantum mechanical equations and we aim to provide guidelines to estimate uncertainties as a routine part of computations of data for structure and scattering. (topical review)

  18. Past changes in the vertical distribution of ozone – Part 1: Measurement techniques, uncertainties and availability

    Directory of Open Access Journals (Sweden)

    B. Hassler

    2014-05-01

    Full Text Available Peak stratospheric chlorofluorocarbon (CFC) and other ozone depleting substance (ODS) concentrations were reached in the mid- to late 1990s. Detection and attribution of the expected recovery of the stratospheric ozone layer in an atmosphere with reduced ODSs, as well as efforts to understand the evolution of stratospheric ozone in the presence of increasing greenhouse gases, are key current research topics. These require a critical examination of the ozone changes with an accurate knowledge of the spatial (geographical and vertical) and temporal ozone response. For such an examination, it is vital that the quality of the measurements used be as high as possible and measurement uncertainties well quantified. In preparation for the 2014 United Nations Environment Programme (UNEP)/World Meteorological Organization (WMO) Scientific Assessment of Ozone Depletion, the SPARC/IO3C/IGACO-O3/NDACC (SI2N) Initiative was designed to study and document changes in the global ozone profile distribution. This requires assessing long-term ozone profile data sets with regard to measurement stability and uncertainty characteristics. The ultimate goal is to establish suitability for estimating long-term ozone trends to contribute to ozone recovery studies. Some of the data sets have been improved as part of this initiative, with updated versions now available. This summary presents an overview of stratospheric ozone profile measurement data sets (ground- and satellite-based) available for ozone recovery studies. Here we document measurement techniques, spatial and temporal coverage, vertical resolution, native units and measurement uncertainties. In addition, the latest data versions are briefly described (including data version updates), as well as detailing multiple retrievals when available for a given satellite instrument. Archive location information for each data set is also given.

  19. Scope Oriented Thermoeconomic analysis of energy systems. Part II: Formation Structure of Optimality for robust design

    International Nuclear Information System (INIS)

    Piacentino, Antonio; Cardona, Ennio

    2010-01-01

    This paper is Part II of a two-part paper. In Part I the fundamentals of Scope Oriented Thermoeconomics were introduced, showing a scarce potential for the cost accounting of existing plants; in this Part II the same concepts are applied to the optimization of a small set of design variables for a vapour compression chiller. The method overcomes the limit of most conventional optimization techniques, which are usually based on hermetic algorithms that do not enable the energy analyst to recognize all the margins for improvement. The Scope Oriented Thermoeconomic optimization allows us to disassemble the optimization process, thus recognizing the Formation Structure of Optimality, i.e. the specific influence of any thermodynamic and economic parameter in the path toward the optimal design. Finally, the potential applications of such an in-depth understanding of the inner driving forces of the optimization are discussed in the paper, with a particular focus on the sensitivity analysis to the variation of energy and capital costs and on the actual operation-oriented design.

  20. Structure Learning and Statistical Estimation in Distribution Networks - Part II

    Energy Technology Data Exchange (ETDEWEB)

    Deka, Deepjyoti [Univ. of Texas, Austin, TX (United States); Backhaus, Scott N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Chertkov, Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-13

    Limited placement of real-time monitoring devices in the distribution grid, recent trends notwithstanding, has prevented the easy implementation of demand-response and other smart grid applications. Part I of this paper discusses the problem of learning the operational structure of the grid from nodal voltage measurements. In this work (Part II), the learning of the operational radial structure is coupled with the problem of estimating nodal consumption statistics and inferring the line parameters in the grid. Based on a Linear-Coupled (LC) approximation of the AC power flow equations, polynomial-time algorithms are designed to identify the structure and estimate nodal load characteristics and/or line parameters in the grid using the available nodal voltage measurements. The structure learning algorithm is then extended to cases with missing data, where available observations are limited to a fraction of the grid nodes. The efficacy of the presented algorithms is demonstrated through simulations on several distribution test cases.

  1. Uncertainty: the Curate's egg in financial economics.

    Science.gov (United States)

    Pixley, Jocelyn

    2014-06-01

    Economic theories of uncertainty are unpopular with financial experts. As sociologists, we rightly refuse predictions, but the uncertainties of money are constantly sifted and turned into semi-denial by a financial economics set on somehow beating the future. Picking out 'bits' of the future as 'risk' and 'parts' as 'information' is attractive but socially dangerous, I argue, because money's promises are always uncertain. New studies of uncertainty are reversing sociology's neglect of the unavoidable inability to know the forces that will shape the financial future. © London School of Economics and Political Science 2014.

  2. Treatment of measurement uncertainties at the power burst facility

    International Nuclear Information System (INIS)

    Meyer, L.C.

    1980-01-01

    The treatment of measurement uncertainty at the Power Burst Facility provides a means of improving data integrity as well as meeting standard practice reporting requirements. This is accomplished by performing the uncertainty analysis in two parts: a test-independent uncertainty analysis and a test-dependent uncertainty analysis. The test-independent uncertainty analysis is performed on instrumentation used repeatedly from one test to the next, and does not have to be repeated for each test except for improved or new types of instruments. A test-dependent uncertainty analysis is performed on each test based on the test-independent uncertainties, modified as required by test specifications, experiment fixture design, and historical performance of instruments on similar tests. The methodology for performing uncertainty analysis based on the National Bureau of Standards method is reviewed, with examples applied to nuclear instrumentation
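Combining test-independent and test-dependent components under the usual independence assumption reduces to a root-sum-square combination. The component names and magnitudes below are illustrative, not actual PBF instrumentation values.

```python
import math

# Test-independent components (instrument class) and test-dependent
# components (this test's fixture and conditions), in percent, assumed
# independent; names and values are illustrative.
test_independent = {"calibration": 0.8, "linearity": 0.5}
test_dependent = {"fixture": 0.6, "environment": 0.3}

components = {**test_independent, **test_dependent}
total = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined uncertainty: {total:.2f} %")   # root-sum-square of components
```

The root-sum-square total is always below the arithmetic sum of the components, which is why the stacked (linear-sum) treatment is conservative.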

  3. Emerging Forms of the Part II of Jonathan Swift's Novel “Gulliver’s Travels”

    Directory of Open Access Journals (Sweden)

    Svitlana Tikhonenko

    2017-11-01

    Full Text Available The article is devoted to the study of grotesque forms in Jonathan Swift's novel "Gulliver's Travels", based on the text of Part II, "A Voyage to Brobdingnag". Drawing on the selected material, manifestations of the grotesque are traced in the semantic field of the text. The grotesque world of the novel is the author's model of mankind, in which J. Swift presents his view not only of the state of the contemporary English system but also of human nature in general, revealing the peculiarities of the psychology of human nature and, especially, of human socialization. In Part II the author continues to unfold a complex and contradictory picture of human existence before the reader: the world of the giants appears as an ambivalent system in which the features of an ideal society and an ideal ruler are, in the author's opinion, strangely combined with the ugly face of man and society.

  4. Numerical simulation of projectile impact on mild steel armour plates using LS-DYNA, Part II: Parametric studies

    OpenAIRE

    Raguraman, M; Deb, A; Gupta, NK; Kharat, DK

    2008-01-01

    In Part I of the current two-part series, a comprehensive simulation-based study of the impact of jacketed projectiles on mild steel armour plates was presented. Using the modelling procedures developed in Part I, a number of parametric studies have been carried out for the same mild steel plates considered in Part I and are reported here in Part II. The current investigation includes determination of the ballistic limits of a given target plate for different projectile diameters and impact velocities.

  5. When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems

    Science.gov (United States)

    Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz

    2015-03-01

    Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise limitations and uncertainties associated with components and how these uncertainties might propagate throughout modelling frameworks can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole of ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge in the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts and (iii) an inability to predict with confidence natural ecosystem variability and longer term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use. 
We suggest research directions required to address these uncertainties, and caution against overconfident predictions.

  6. Technical Information on the Carbonation of the EBR-II Reactor, Summary Report Part 1: Laboratory Experiments and Application to EBR-II Secondary Sodium System

    Energy Technology Data Exchange (ETDEWEB)

    Steven R. Sherman

    2005-04-01

    Residual sodium is defined as sodium metal that remains behind in pipes, vessels, and tanks after the bulk sodium has been melted and drained from such components. Residual sodium has the same chemical properties as bulk sodium and differs from it only in the thickness of the deposit; typically, sodium is considered residual when the deposit is less than 5-6 cm thick. This residual sodium must be removed or deactivated when a pipe, vessel, system, or entire reactor is permanently taken out of service, in order to make the component or system safer and/or to comply with decommissioning regulations. As an alternative to the established residual sodium deactivation techniques (steam-and-nitrogen, wet vapor nitrogen, etc.), a technique involving the use of moisture and carbon dioxide has been developed. With this technique, sodium metal is converted into sodium bicarbonate by reacting it with humid carbon dioxide; hydrogen is emitted as a by-product. The technique was first developed in the laboratory by exposing sodium samples to humidified carbon dioxide under controlled conditions, and then demonstrated on a larger scale by treating residual sodium within the Experimental Breeder Reactor II (EBR-II) secondary cooling system, followed by the primary cooling system. The EBR-II facility is located at the Idaho National Laboratory (INL) in southeastern Idaho, U.S.A. This report is Part 1 of a two-part report and is divided into three sections. The first section describes the chemistry of carbon dioxide-water-sodium reactions. The second section covers the laboratory experiments that were conducted to develop the residual sodium deactivation process. The third section discusses the application of the deactivation process to the treatment of residual sodium within the EBR-II secondary sodium cooling system. Part 2 of the report, under separate cover, describes the application of the technique to residual sodium within the EBR-II primary cooling system.

  7. Judgment under Uncertainty: Heuristics and Biases.

    Science.gov (United States)

    Tversky, A; Kahneman, D

    1974-09-27

    This article described three heuristics that are employed in making judgements under uncertainty: (i) representativeness, which is usually employed when people are asked to judge the probability that an object or event A belongs to class or process B; (ii) availability of instances or scenarios, which is often employed when people are asked to assess the frequency of a class or the plausibility of a particular development; and (iii) adjustment from an anchor, which is usually employed in numerical prediction when a relevant value is available. These heuristics are highly economical and usually effective, but they lead to systematic and predictable errors. A better understanding of these heuristics and of the biases to which they lead could improve judgements and decisions in situations of uncertainty.

  8. Uncertainty in wave energy resource assessment. Part 2: Variability and predictability

    International Nuclear Information System (INIS)

    Mackay, Edward B.L.; Bahaj, AbuBakr S.; Challenor, Peter G.

    2010-01-01

    The uncertainty in estimates of the energy yield from a wave energy converter (WEC) is considered. The study is presented in two articles. The first article considered the accuracy of the historic data; the second article, presented here, considers the uncertainty which arises from variability in the wave climate. Mean wave conditions exhibit high levels of interannual variability. Moreover, many previous studies have demonstrated longer-term decadal changes in wave climate. The effect of interannual and climatic changes in wave climate on the predictability of long-term mean WEC power is examined for an area off the north coast of Scotland. In this location, anomalies in mean WEC power are strongly correlated with the North Atlantic Oscillation (NAO) index. This link enables the results of many previous studies on the variability of the NAO and its sensitivity to climate change to be applied to WEC power levels. It is shown that the variability in 5, 10 and 20 year mean power levels is greater than if annual power anomalies were uncorrelated noise. It is also shown that the change in wave climate from anthropogenic climate change over the lifetime of a wave farm is likely to be small in comparison to the natural level of variability. Finally, it is shown that despite the uncertainty related to variability in the wave climate, improvements in the accuracy of historic data will improve the accuracy of predictions of future WEC yield. (author)

  9. Sources of uncertainty in flood inundation maps

    Science.gov (United States)

    Bales, J.D.; Wagner, C.R.

    2009-01-01

    Flood inundation maps typically have been used to depict inundated areas for floods having specific exceedance levels. The uncertainty associated with the inundation boundaries is seldom quantified, in part because not all of the sources of uncertainty are recognized and because data with which to quantify uncertainty seldom are available. Sources of uncertainty discussed in this paper include the hydrologic data used for hydraulic model development and validation, topographic data, and the hydraulic model itself. The assumption of steady flow, which typically is made to produce inundation maps, has less of an effect on predicted inundation at lower flows than at higher flows because more time typically is required to inundate areas at high flows than at low flows. Difficulties with establishing reasonable cross sections that do not intersect and that represent water-surface slopes in tributaries contribute additional uncertainties in the hydraulic modelling. As a result, uncertainty in the flood inundation polygons simulated with a one-dimensional model increases with distance from the main channel.

  10. HERBICIDAS INIBIDORES DO FOTOSSISTEMA II - PARTE II / PHOTOSYSTEM II INHIBITOR HERBICIDES - PART II

    Directory of Open Access Journals (Sweden)

    ILCA P. DE F. E SILVA

    2013-11-01

    Full Text Available Photosystem II (PSII) inhibitor herbicides bind to the QB site on the D1 protein, located in the thylakoid membrane of the chloroplasts, blocking electron transport from QA to QB and consequently causing lipid peroxidation. The main factors affecting the evolution of weed resistance to herbicides have been grouped into genetic, bioecological and agronomic factors. Weed resistance to herbicides is defined as the ability of a plant to survive and reproduce after exposure to a herbicide dose that would normally be lethal to a normal biotype of that plant. The selectivity of a herbicide is related to its capacity to eliminate weeds without affecting the quality of the crop of economic interest.

  11. Balancing certainty and uncertainty in clinical medicine.

    Science.gov (United States)

    Hayward, Richard

    2006-01-01

    Nothing in clinical medicine is one hundred per cent certain. Part of a doctor's education involves learning how to cope with the anxiety that uncertainty in decisions affecting life and death inevitably produces. This paper examines: (1) the role of anxiety -- both rational and irrational -- in the provision of health care; (2) the effects of uncertainty upon the doctor-patient relationship; (3) the threat uncertainty poses to medical authority (and the assumption of infallibility that props it up); (4) the contribution of clinical uncertainty to the rising popularity of alternative therapies; and (5) the clash between the medical and the legal understanding of how certainty should be defined, particularly as it affects the paediatric community. It concludes by suggesting some strategies that might facilitate successful navigation between the opposing and ever-present forces of certainty and uncertainty.

  12. Polycystic ovary syndrome: a review for dermatologists: Part II. Treatment.

    Science.gov (United States)

    Buzney, Elizabeth; Sheu, Johanna; Buzney, Catherine; Reynolds, Rachel V

    2014-11-01

    Dermatologists are in a key position to treat the manifestations of polycystic ovary syndrome (PCOS). The management of PCOS should be tailored to each woman's specific goals, reproductive interests, and particular constellation of symptoms. Therefore, a multidisciplinary approach is recommended. In part II of this continuing medical education article, we present the available safety and efficacy data regarding treatments for women with acne, hirsutism, and androgenetic alopecia. Therapies discussed include lifestyle modification, topical therapies, combined oral contraceptives, antiandrogen agents, and insulin-sensitizing drugs. Treatment recommendations are made based on the current available evidence. Copyright © 2014 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  13. Recovery in soccer : part ii-recovery strategies.

    Science.gov (United States)

    Nédélec, Mathieu; McCall, Alan; Carling, Chris; Legall, Franck; Berthoin, Serge; Dupont, Gregory

    2013-01-01

    In the previously published Part I of this two-part review, we examined fatigue after soccer match play and the recovery kinetics of physical performance and of cognitive, subjective and biological markers. To reduce the magnitude of fatigue and to accelerate recovery after competition, several recovery strategies are now used in professional soccer teams. During congested fixture schedules, recovery strategies are highly required to alleviate post-match fatigue, regain performance faster and reduce the risk of injury. Fatigue following competition is multifactorial and mainly related to dehydration, glycogen depletion, muscle damage and mental fatigue. Recovery strategies should consequently be targeted against the major causes of fatigue. The strategies reviewed in Part II of this article are nutritional intake, cold water immersion, sleeping, active recovery, stretching, compression garments, massage and electrical stimulation. Some strategies, such as hydration, diet and sleep, are effective in their ability to counteract the fatigue mechanisms. Providing milk drinks to players at the end of competition and a meal containing high-glycaemic-index carbohydrate and protein within the hour following the match is effective in replenishing substrate stores and optimizing muscle-damage repair. Sleep is an essential part of recovery management. Sleep disturbance after a match is common and can negatively impact the recovery process. Cold water immersion is effective during acute periods of match congestion in order to regain performance levels faster and repress the acute inflammatory process. Scientific evidence for the other strategies reviewed, in terms of their ability to accelerate the return to the initial level of performance, is still lacking. These include active recovery, stretching, compression garments, massage and electrical stimulation. While this does not mean that these strategies do not aid the recovery process, the protocols implemented up until

  14. Small break LOCA RELAP5/MOD3 uncertainty quantification: Bias and uncertainty evaluation for important phenomena

    International Nuclear Information System (INIS)

    Ortiz, M.G.; Ghan, L.S.; Vogl, J.

    1991-01-01

    The Nuclear Regulatory Commission (NRC) revised the Emergency Core Cooling System (ECCS) licensing rule to allow the use of Best Estimate (BE) computer codes, provided the uncertainty of the calculations are quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability and Uncertainty (CSAU) to evaluate BE code uncertainties. The CSAU methodology was demonstrated with a specific application to a pressurized water reactor (PWR), experiencing a postulated large break loss-of-coolant accident (LBLOCA). The current work is part of an effort to adapt and demonstrate the CSAU methodology to a small break (SB) LOCA in a PWR of B and W design using RELAP5/MOD3 as the simulation tool. The subject of this paper is the Assessment and Ranging of Parameters (Element 2 of the CSAU methodology), which determines the contribution to uncertainty of specific models in the code

  15. On treatment of uncertainty in system planning

    International Nuclear Information System (INIS)

    Flage, R.; Aven, T.

    2009-01-01

    In system planning and operation, considerable efforts and resources are spent to reduce uncertainties as a part of project management, uncertainty management and safety management. The basic idea seems to be that uncertainties are purely negative and should be reduced. In this paper we challenge this way of thinking, using a common industry practice as an example. In accordance with this practice, three uncertainty interval categories are used: ±40% intervals for the feasibility phase, ±30% intervals for the concept development phase and ±20% intervals for the engineering phase. The problem is that such a regime could easily lead to conservative management that encourages the use of existing methods and tools, as new activities and novel solutions and arrangements necessarily mean increased uncertainties. In the paper we suggest an alternative approach based on uncertainty and risk descriptions, but with no predefined uncertainty reduction structures. The approach makes use of risk assessments and economic optimisation tools such as the expected net present value, but acknowledges the need for broad risk management processes that extend beyond the analyses. Different concerns need to be balanced, including economic aspects, uncertainties and risk, and practicability.

  16. Uncertainties in the Norwegian greenhouse gas emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Flugsrud, Ketil; Hoem, Britta

    2011-11-15

    The national greenhouse gas (GHG) emission inventory is compiled from estimates based on emission factors and activity data and from direct measurements by plants. All these data and parameters contribute to the overall inventory uncertainty. The uncertainties and probability distributions of the inventory input parameters have been assessed based on available data and expert judgements. Finally, the level and trend uncertainties of the national GHG emission inventory have been estimated using Monte Carlo simulation. The methods used in the analysis correspond to an IPCC tier 2 method, as described in the IPCC Good Practice Guidance (IPCC 2000). Analyses have been made both excluding and including the sector LULUCF (land use, land-use change and forestry). The uncertainty analysis performed in 2011 is an update of the uncertainty analyses performed for the greenhouse gas inventory in 2006 and 2000. During the project we have been in contact with experts and have collected information about uncertainty from them. The main focus has been on the source categories where changes have occurred since the last uncertainty analysis was performed in 2006. This includes new methodology for several source categories (for example, for solvents and road traffic) as well as revised uncertainty estimates. For the installations included in the emission trading system, new information from the annual ETS reports about uncertainty in activity data and the CO2 emission factor (and the N2O emission factor for nitric acid production) has been used. This has improved the quality of the uncertainty estimates for the energy and manufacturing sectors. The results show that the uncertainty level in the total calculated greenhouse gas emissions for 2009 is around 4 per cent. When the LULUCF sector is included, the total uncertainty is around 17 per cent in 2009. The uncertainty estimate is lower now than previous analyses have shown, partly due to considerable work to improve
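The tier 2 approach described above propagates input-parameter distributions through the inventory model by Monte Carlo sampling. The following is a minimal sketch of that idea, assuming a toy inventory of three source categories with made-up activity data, emission factors and relative uncertainties; none of the numbers are from the Norwegian inventory.

```python
import random

random.seed(0)

# Hypothetical source categories: activity data and emission factor (EF),
# each with a relative standard uncertainty. All values are illustrative.
sources = {
    "energy":      {"activity": 1200.0, "act_u": 0.02, "ef": 2.5, "ef_u": 0.05},
    "agriculture": {"activity":  300.0, "act_u": 0.10, "ef": 1.8, "ef_u": 0.30},
    "waste":       {"activity":   90.0, "act_u": 0.15, "ef": 4.0, "ef_u": 0.40},
}

def sample_total():
    """One Monte Carlo draw of total emissions: sum over activity x EF."""
    total = 0.0
    for s in sources.values():
        act = random.gauss(s["activity"], s["act_u"] * s["activity"])
        ef = random.gauss(s["ef"], s["ef_u"] * s["ef"])
        total += act * ef
    return total

draws = sorted(sample_total() for _ in range(20000))
mean = sum(draws) / len(draws)
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
level_uncertainty = 0.5 * (hi - lo) / mean  # half-width of 95% CI, relative

print(f"mean = {mean:.0f}, 95% CI = [{lo:.0f}, {hi:.0f}]")
print(f"relative level uncertainty ~ {level_uncertainty:.1%}")
```

The level uncertainty is reported here as the half-width of the 95 per cent interval relative to the mean, one common inventory convention; a real tier 2 analysis would use documented distributions (often lognormal) and correlations between source categories.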

  17. Ocean Thermal Energy Converstion (OTEC) test facilities study program. Final report. Volume II. Part B

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-01-17

    Results are presented of an 8-month study to develop alternative non-site-specific OTEC facilities/platform requirements for an integrated OTEC test program which may include land and floating test facilities. Volume II--Appendixes is bound in three parts (A, B, and C) which together comprise a compendium of the most significant detailed data developed during the study. Part B provides an annotated test list and describes component tests and system tests.

  18. A Carbon Monitoring System Approach to US Coastal Wetland Carbon Fluxes: Progress Towards a Tier II Accounting Method with Uncertainty Quantification

    Science.gov (United States)

    Windham-Myers, L.; Holmquist, J. R.; Bergamaschi, B. A.; Byrd, K. B.; Callaway, J.; Crooks, S.; Drexler, J. Z.; Feagin, R. A.; Ferner, M. C.; Gonneea, M. E.; Kroeger, K. D.; Megonigal, P.; Morris, J. T.; Schile, L. M.; Simard, M.; Sutton-Grier, A.; Takekawa, J.; Troxler, T.; Weller, D.; Woo, I.

    2015-12-01

    Despite their high rates of long-term carbon (C) sequestration when compared to upland ecosystems, coastal C accounting is only recently receiving the attention of policy makers and carbon markets. Assessing accuracy and uncertainty in net C flux estimates requires both direct and derived measurements based on both short- and long-term dynamics in key drivers, particularly soil accretion rates and soil organic content. We are testing the ability of remote sensing products and national-scale datasets to estimate biomass and soil stocks and fluxes over a wide range of spatial and temporal scales. For example, the 2013 Wetlands Supplement to the 2006 IPCC GHG national inventory reporting guidelines requests information on development of Tier I-III reporting, which express increasing levels of detail. We report progress toward development of a Carbon Monitoring System for "blue carbon" that may be useful for IPCC reporting guidelines at Tier II levels. Our project uses a current dataset of publicly available and contributed field-based measurements to validate models of changing soil C stocks across a broad range of U.S. tidal wetland types and land-use conversions. Additionally, development of biomass algorithms for both radar and spectral datasets will be tested and used to determine the "price of precision" of different satellite products. We discuss progress in calculating Tier II estimates, focusing on the variation introduced by the different input datasets. These include the USFWS National Wetlands Inventory, the NOAA Coastal Change Analysis Program, and combinations thereof to calculate tidal wetland area. We also assess the use of different attributes and depths from the USDA-SSURGO database to map soil C density. Finally, we examine the relative benefit of radar, spectral and hybrid approaches to biomass mapping in tidal marshes and mangroves. While the US currently plans to report GHG emissions at a Tier I level, we argue that a Tier II analysis is possible due to national

  19. Not-for-profit versus for-profit health care providers--Part II: Comparing and contrasting their records.

    Science.gov (United States)

    Rotarius, Timothy; Trujillo, Antonio J; Liberman, Aaron; Ramirez, Bernardo

    2006-01-01

    The debate over which health care providers are most capably meeting their responsibilities in serving the public's interest continues unabated, and comparisons of not-for-profit (NFP) versus for-profit (FP) hospitals remain at the epicenter of the discussion. From the perspective of available factual information, which of the two sides of this debate is correct? This article is Part II of a two-part series comparing and contrasting the performance records of NFP health care providers with their FP counterparts. Although it is demonstrated that both NFP and FP providers perform virtuous and selfless feats on behalf of America's public, it is also shown that both camps have been accused of involvement in potentially willful clinical and administrative missteps. Part I provided the background information (e.g., legal differences, perspectives on social responsibility, and types of questionable and fraudulent behavior) required to adequately understand the scope of the comparison. Part II offers actual comparisons of the two organizational structures using several disparate factors, such as specific organizational behaviors, approaches to the health care priorities of cost and quality, and the business-focused goals of profits, efficiency, and community benefit.

  20. Optimization Under Uncertainty of Site-Specific Turbine Configurations

    Science.gov (United States)

    Quick, J.; Dykes, K.; Graf, P.; Zahle, F.

    2016-09-01

    Uncertainty affects many aspects of wind energy plant performance and cost. In this study, we explore opportunities for site-specific turbine configuration optimization that accounts for uncertainty in the wind resource. As a demonstration, a simple empirical model for wind plant cost of energy is used in an optimization under uncertainty to examine how different risk appetites affect the optimal selection of a turbine configuration for sites of different wind resource profiles. If there is unusually high uncertainty in the site wind resource, the optimal turbine configuration diverges from the deterministic case and a generally more conservative design is obtained with increasing risk aversion on the part of the designer.
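One common way to encode the designer's risk appetite described above is to optimise a weighted sum of the expected cost of energy and its spread across samples of the uncertain wind resource. The sketch below uses two hypothetical response curves (an "aggressive" and a "conservative" configuration) and a toy mean-plus-lambda-times-standard-deviation objective; all functions and numbers are assumptions for illustration, not the empirical model used in the study.

```python
import random
import statistics

random.seed(0)

# Toy response surfaces for cost of energy (USD/kWh) as a function of
# site mean wind speed. Both curves and all numbers are illustrative.
def coe_aggressive(w):
    # Configuration tuned to the expected resource: cheap energy if the
    # wind materialises, but performance is sensitive to the resource.
    return 0.10 + 0.02 * (7.0 - w)

def coe_conservative(w):
    # More robust configuration: slightly higher expected cost,
    # much flatter response to the wind resource.
    return 0.11 + 0.005 * (7.0 - w)

# Uncertain site assessment: mean wind speed 7 +/- 1 m/s (assumed).
winds = [random.gauss(7.0, 1.0) for _ in range(5000)]

def objective(coe_fn, risk_aversion):
    """Mean COE plus a risk penalty on its spread (mean + lambda * std)."""
    coes = [coe_fn(w) for w in winds]
    return statistics.mean(coes) + risk_aversion * statistics.stdev(coes)

for lam in (0.0, 2.0):  # risk-neutral vs risk-averse designer
    scores = {name: objective(fn, lam)
              for name, fn in [("aggressive", coe_aggressive),
                               ("conservative", coe_conservative)]}
    best = min(scores, key=scores.get)
    print(f"risk aversion {lam}: choose {best} design")
```

With these toy numbers, the risk-neutral designer picks the aggressive configuration (lower expected cost), while increasing risk aversion flips the choice to the conservative one, mirroring the paper's observation that higher resource uncertainty favours more conservative designs.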

  1. Emission factors of air pollutants from CNG-gasoline bi-fuel vehicles: Part II. CO, HC and NOx.

    Science.gov (United States)

    Huang, Xiaoyan; Wang, Yang; Xing, Zhenyu; Du, Ke

    2016-09-15

    The estimation of emission factors (EFs) is the basis of an accurate emission inventory. However, the EFs of air pollutants for motor vehicles vary under different operating conditions, which causes uncertainty in developing emission inventories. Natural gas (NG), considered a "cleaner" fuel than gasoline, is increasingly being used to reduce combustion emissions. However, information is scarce about how much emission reduction can be achieved by motor vehicles burning NG (NGVs) under real road driving conditions, which is necessary for evaluating the environmental benefits of NGVs. Here, online, in situ measurements of the emissions from nine bi-fuel vehicles were conducted under different operating conditions on the real road. A comparative study was performed for the EFs of black carbon (BC), carbon monoxide (CO), hydrocarbons (HCs) and nitrogen oxides (NOx) for each operating condition when the vehicles used gasoline and compressed NG (CNG) as fuel. BC EFs were reported in Part I; Part II of this paper series reports the influence of operating conditions and fuel types on the EFs of CO, HC and NOx. Fuel-based EFs of CO showed good correlations with speed when burning either CNG or gasoline. The correlation between fuel-based HC EFs and speed was relatively weak for both fuels. The fuel-based NOx EFs correlated moderately with speed when burning CNG, but weakly when burning gasoline. For HC, the mileage-based EFs of gasoline vehicles are 2.39-12.59 times higher than those of CNG vehicles. The mileage-based NOx EFs of CNG vehicles are slightly higher than those of gasoline vehicles. These results facilitate a detailed analysis of the environmental benefits of replacing gasoline with CNG in light-duty vehicles. Copyright © 2016 Elsevier B.V. All rights reserved.
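The abstract distinguishes fuel-based EFs (grams of pollutant per kilogram of fuel burned) from mileage-based EFs (grams per kilometre driven); the two are linked by the vehicle's fuel consumption rate. A minimal sketch of the conversion follows, with illustrative numbers that are assumptions rather than values from the study.

```python
def mileage_based_ef(fuel_based_ef_g_per_kg, fuel_rate_kg_per_km):
    """Convert a fuel-based EF (g pollutant / kg fuel burned) into a
    mileage-based EF (g pollutant / km driven) via fuel consumption."""
    return fuel_based_ef_g_per_kg * fuel_rate_kg_per_km

# Illustrative numbers (assumed, not from the study): a gasoline car
# burning 0.06 kg fuel/km with an HC EF of 2.0 g/kg, versus a CNG car
# burning 0.05 kg fuel/km with an HC EF of 0.5 g/kg.
hc_gasoline = mileage_based_ef(2.0, 0.06)  # g/km
hc_cng = mileage_based_ef(0.5, 0.05)       # g/km

print(f"gasoline HC: {hc_gasoline:.3f} g/km, CNG HC: {hc_cng:.3f} g/km")
print(f"gasoline/CNG ratio: {hc_gasoline / hc_cng:.1f}")
```

With these assumed inputs the gasoline/CNG ratio comes out at 4.8, which happens to fall inside the 2.39-12.59 range the study reports for HC; the point of the sketch is only the unit conversion, not the specific values.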

  2. Ideas underlying the Quantification of Margins and Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Pilch, Martin, E-mail: mpilch@sandia.gov [Department 1514, Sandia National Laboratories, Albuquerque, NM 87185-0828 (United States); Trucano, Timothy G. [Department 1411, Sandia National Laboratories, Albuquerque, NM 87185-0370 (United States); Helton, Jon C. [Department of Mathematics and Statistics, Arizona State University, Tempe, AZ 85287-1804 (United States)

    2011-09-15

    Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.
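A central quantity in QMU reasoning is the ratio of the design margin M (the distance between the best-estimate performance and the requirement it must meet) to the quantified uncertainty U; an M/U ratio comfortably above 1 suggests the margin survives the uncertainty. A minimal sketch of that bookkeeping follows; the threshold, estimate and uncertainty are illustrative assumptions, not values from the paper.

```python
def margin(best_estimate, requirement):
    # Best Estimate Plus Uncertainty: the margin is the distance from
    # the best-estimate performance to the requirement it must exceed.
    return best_estimate - requirement

def confidence_ratio(best_estimate, requirement, uncertainty):
    """M/U ratio: margin divided by the quantified uncertainty."""
    return margin(best_estimate, requirement) / uncertainty

# Illustrative numbers: a performance metric must exceed 100 units;
# the best estimate is 130 with a combined uncertainty of 20.
ratio = confidence_ratio(130.0, 100.0, 20.0)
print(f"M/U = {ratio:.2f}")  # M/U = 1.50
```

In practice the uncertainty U mixes aleatory and epistemic components, which the paper argues should be separated rather than lumped into a single number as done in this toy calculation.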

  3. Eleventh annual meeting, Bologna, Italy, 17-20 April 1978. Summary report. Part II

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1978-07-01

    The Summary Report - Part II of the Eleventh Annual Meeting of the IAEA International Working Group on Fast Reactors - includes reports on development of fast reactors in France from 1977 to 1978; review of the activities related to fast reactors in Germany; status of fast breeder reactors development in Belgium and Netherlands; status of activities related to fast reactors in USSR, Japan USA, UK and Italy.

  4. Eleventh annual meeting, Bologna, Italy, 17-20 April 1978. Summary report. Part II

    International Nuclear Information System (INIS)

    1978-07-01

    The Summary Report - Part II of the Eleventh Annual Meeting of the IAEA International Working Group on Fast Reactors - includes reports on development of fast reactors in France from 1977 to 1978; review of the activities related to fast reactors in Germany; status of fast breeder reactors development in Belgium and Netherlands; status of activities related to fast reactors in USSR, Japan USA, UK and Italy

  5. A legacy of struggle: the OSHA ergonomics standard and beyond, Part II.

    Science.gov (United States)

    Delp, Linda; Mojtahedi, Zahra; Sheikh, Hina; Lemus, Jackie

    2014-11-01

    The OSHA ergonomics standard issued in 2000 was repealed within four months through a Congressional resolution that limits future ergonomics rulemaking. This section continues the conversation initiated in Part I, documenting a legacy of struggle for an ergonomics standard through the voices of eight labor, academic, and government key informants. Part I summarized important components of the standard; described the convergence of labor activism, research, and government action that laid the foundation for a standard; and highlighted the debates that characterized the rulemaking process. Part II explores the anti-regulatory political landscape of the 1990s, as well as the key opponents, power dynamics, and legal maneuvers that led to repeal of the standard. This section also describes the impact of the ergonomics struggle beyond the standard itself and ends with a discussion of creative state-level policy initiatives and coalition approaches to prevent work-related musculoskeletal disorders (WMSDs) in today's sociopolitical context.

  6. Accounting for Epistemic and Aleatory Uncertainty in Early System Design, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — This project extends Probability Bounds Analysis to model epistemic and aleatory uncertainty during early design of engineered systems in an Integrated Concurrent...

  7. Extensive neutronic sensitivity-uncertainty analysis of a fusion reactor shielding blanket

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-01-01

    In this paper the results are presented of an extensive neutronic sensitivity-uncertainty study performed for the design of a shielding blanket for a next-step fusion reactor, such as ITER. A code system developed at ECN Petten was used. The uncertainty in an important response parameter, the neutron heating in the inboard superconducting coils, was evaluated. Neutron transport calculations in the 100 neutron group GAM-II structure were performed using the code ANISN. For the sensitivity and uncertainty calculations the code SUSD was used. Uncertainties due to cross-section uncertainties were taken into account as well as uncertainties due to uncertainties in energy and angular distributions of scattered neutrons (SED and SAD uncertainties, respectively). The subject of direct-term uncertainties (i.e. uncertainties due to uncertainties in the kerma factors of the superconducting coils) is briefly touched upon. It is shown that SAD uncertainties, which have been largely neglected until now, contribute significantly to the total uncertainty. Moreover, the contribution of direct-term uncertainties may be large. The total uncertainty in the neutron heating, due to Fe cross-sections alone, amounts to approximately 25%, which is rather large. However, uncertainty data are scarce and the data may very well be conservative. It is shown in this paper that with the code system used, sensitivity and uncertainty calculations can be performed in a straightforward way. Therefore, it is suggested that emphasis now be put on the generation of realistic, reliable covariance data for cross-sections as well as for angular and energy distributions. ((orig.))
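
    The first-order "sandwich" rule that underlies this kind of sensitivity-uncertainty propagation can be sketched as follows. The sensitivity vector and covariance matrix below are illustrative placeholders, not data from the ECN study:

```python
import numpy as np

# First-order ("sandwich") propagation of parameter uncertainties:
# if S holds the relative sensitivities of a response R (e.g. coil heating)
# to a set of cross-section parameters, and C is the relative covariance
# matrix of those parameters, the relative variance of R is S^T C S.
# All numbers are illustrative only.
S = np.array([0.8, -0.3, 0.5])           # relative sensitivities (dR/R)/(dp/p)
C = np.array([[0.04, 0.01, 0.00],        # relative covariance of parameters
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.16]])
rel_unc = np.sqrt(S @ C @ S)             # relative standard uncertainty of R
print(f"relative uncertainty in R: {rel_unc:.1%}")
```

    With these made-up inputs the propagated relative uncertainty comes out near 25%, comparable in magnitude to the Fe cross-section contribution quoted in the abstract.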

  8. Title II, Part A: Don't Scrap It, Don't Dilute It, Fix It

    Science.gov (United States)

    Coggshall, Jane G.

    2015-01-01

    The Issue: Washington is taking a close look at Title II, Part A (Title IIA) of the Elementary and Secondary Education Act (ESEA) as Congress debates reauthorization. The program sends roughly $2.5 billion a year to all states and nearly all districts to "(1) increase student academic achievement through strategies such as improving teacher…

  9. Instructional Climates in Preschool Children Who Are At-Risk. Part II: Perceived Physical Competence

    Science.gov (United States)

    Robinson, Leah E.; Rudisill, Mary E.; Goodway, Jacqueline D.

    2009-01-01

    In Part II of this study, we examined the effect of two 9-week instructional climates (low-autonomy [LA] and mastery motivational climate [MMC]) on perceived physical competence (PPC) in preschoolers (N = 117). Participants were randomly assigned to an LA, MMC, or comparison group. PPC was assessed by a pretest, posttest, and retention test with…

  10. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, i.e. estimates of some true value plus an uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from the calibration standards' matrix). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements will be reviewed in this report and recommendations made for improving measurement uncertainties.
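
    The standard way such component uncertainties are combined is GUM-style root-sum-of-squares. A minimal sketch, with illustrative component values rather than figures from the report:

```python
import math

# GUM-style combination of independent relative uncertainty components
# for a single measurement: components add in quadrature.
# The component values below are illustrative.
u_random = 0.012       # relative standard deviation from repeated measurements
u_calibration = 0.008  # relative standard uncertainty of the reference standard
u_matrix = 0.015       # allowance for sample/standard matrix mismatch
u_combined = math.sqrt(u_random**2 + u_calibration**2 + u_matrix**2)
U_expanded = 2 * u_combined  # expanded uncertainty, coverage factor k = 2
print(f"combined: {u_combined:.4f}, expanded (k=2): {U_expanded:.4f}")
```

    Note how the largest component (here the matrix-mismatch allowance) dominates the combined value, which is why indirect comparisons of dissimilar materials carry larger uncertainties.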

  11. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  12. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    International Nuclear Information System (INIS)

    Elert, M.

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions in many cases are very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come into focus in this study: There are large differences in the predicted soil hydrology and as a consequence also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration.
The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root
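
    The simple end of the model spectrum compared here, a 2-box model with first-order leaching and radioactive decay, can be sketched in a few lines. The leaching rate below is an illustrative value, not a parameter from the BIOMOVS II scenario:

```python
import math

# Minimal 2-box model of downward radionuclide transport in soil: the root
# zone (box 1) leaches to deeper soil (box 2) at rate k while both boxes
# undergo radioactive decay. The leaching rate is illustrative only.
half_life_d = 30.2 * 365.25          # Cs-137 half-life in days
lam = math.log(2) / half_life_d      # radioactive decay constant, 1/day
k = 2.0e-4                           # leaching rate out of root zone, 1/day
A1, A2 = 1.0, 0.0                    # normalized initial inventories
dt = 1.0                             # daily time step
for _ in range(10 * 365):            # simulate ten years
    leak = k * A1 * dt
    A1 += -leak - lam * A1 * dt
    A2 += leak - lam * A2 * dt
```

    After ten simulated years a substantial fraction of the Cs-137 inventory has moved below the root zone in this sketch; with an uncertain k, it is exactly this kind of prediction that diverges between simple and complex models.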

  13. Optimal recombination in genetic algorithms for combinatorial optimization problems: Part II

    Directory of Open Access Journals (Sweden)

    Eremeev Anton V.

    2014-01-01

    This paper surveys results on the complexity of the optimal recombination problem (ORP), which consists in finding the best possible offspring as a result of a recombination operator in a genetic algorithm, given two parent solutions. In Part II, we consider the computational complexity of ORPs arising in genetic algorithms for problems on permutations: the Travelling Salesman Problem, the Shortest Hamilton Path Problem and the Makespan Minimization on Single Machine and some other related problems. The analysis indicates that the corresponding ORPs are NP-hard, but solvable by faster algorithms, compared to the problems they are derived from.

  14. The basic science of dermal fillers: past and present Part II: adverse effects.

    Science.gov (United States)

    Gilbert, Erin; Hui, Andrea; Meehan, Shane; Waldorf, Heidi A

    2012-09-01

    The ideal dermal filler should offer long-lasting aesthetic improvement with a minimal side-effect profile. It should be biocompatible and stable within the injection site, with the risk of only transient undesirable effects from injection alone. However, all dermal fillers can induce serious and potentially long-lasting adverse effects. In Part II of this paper, we review the most common adverse effects related to dermal filler use.

  15. Biology and Mechanics of Blood Flows Part II: Mechanics and Medical Aspects

    CERN Document Server

    Thiriet, Marc

    2008-01-01

    Biology and Mechanics of Blood Flows presents the basic knowledge and state-of-the-art techniques necessary to carry out investigations of the cardiovascular system using modeling and simulation. Part II of this two-volume sequence, Mechanics and Medical Aspects, refers to the extraction of input data at the macroscopic scale for modeling the cardiovascular system, and complements Part I, which focuses on nanoscopic and microscopic components and processes. This volume contains chapters on anatomy, physiology, continuum mechanics, as well as pathological changes in the vasculature walls including the heart and their treatments. Methods of numerical simulations are given and illustrated in particular by application to wall diseases. This authoritative book will appeal to any biologist, chemist, physicist, or applied mathematician interested in the functioning of the cardiovascular system.

  16. Formulation, computation and improvement of steady state security margins in power systems. Part II: Results

    International Nuclear Information System (INIS)

    Echavarren, F.M.; Lobato, E.; Rouco, L.; Gomez, T.

    2011-01-01

    A steady state security margin for a particular operating point can be defined as the distance from this initial point to the secure operating limits of the system. Four of the most used steady state security margins are the power flow feasibility margin, the contingency feasibility margin, the load margin to voltage collapse, and the total transfer capability between system areas. This is the second part of a two-part paper. Part I has proposed a novel framework of a general model able to formulate, compute and improve any steady state security margin. In Part II the performance of the general model is validated by solving a variety of practical situations in modern real power systems. Actual examples of the Spanish power system will be used for this purpose. The same computation and improvement algorithms outlined in Part I have been applied for the four security margins considered in the study, outlining the convenience of defining a general framework valid for all four of them. The general model is used here in Part II to compute and improve: (a) the power flow feasibility margin (assessing the influence of the reactive power generation limits in the Spanish power system), (b) the contingency feasibility margin (assessing the influence of transmission and generation capacity in maintaining a correct voltage profile), (c) the load margin to voltage collapse (assessing the location and quantity of loads that must be shed in order to be far away from voltage collapse) and (d) the total transfer capability (assessing the export-import pattern of electric power between different areas of the Spanish system). (author)

  17. Formulation, computation and improvement of steady state security margins in power systems. Part II: Results

    Energy Technology Data Exchange (ETDEWEB)

    Echavarren, F.M.; Lobato, E.; Rouco, L.; Gomez, T. [School of Engineering of Universidad Pontificia Comillas, C/Alberto Aguilera, 23, 28015 Madrid (Spain)

    2011-02-15

    A steady state security margin for a particular operating point can be defined as the distance from this initial point to the secure operating limits of the system. Four of the most used steady state security margins are the power flow feasibility margin, the contingency feasibility margin, the load margin to voltage collapse, and the total transfer capability between system areas. This is the second part of a two-part paper. Part I has proposed a novel framework of a general model able to formulate, compute and improve any steady state security margin. In Part II the performance of the general model is validated by solving a variety of practical situations in modern real power systems. Actual examples of the Spanish power system will be used for this purpose. The same computation and improvement algorithms outlined in Part I have been applied for the four security margins considered in the study, outlining the convenience of defining a general framework valid for all four of them. The general model is used here in Part II to compute and improve: (a) the power flow feasibility margin (assessing the influence of the reactive power generation limits in the Spanish power system), (b) the contingency feasibility margin (assessing the influence of transmission and generation capacity in maintaining a correct voltage profile), (c) the load margin to voltage collapse (assessing the location and quantity of loads that must be shed in order to be far away from voltage collapse) and (d) the total transfer capability (assessing the export-import pattern of electric power between different areas of the Spanish system). (author)

  18. [Method for optimal sensor placement in water distribution systems with nodal demand uncertainties].

    Science.gov (United States)

    Liu, Shu-Ming; Wu, Xue; Ouyang, Le-Yan

    2013-08-01

    The notion of identification fitness was proposed for optimizing sensor placement in water distribution systems. The Nondominated Sorting Genetic Algorithm II was used to find the Pareto front between the minimum overlap of the possible detection times of two events and the best probability of detection, taking nodal demand uncertainties into account. This methodology was applied to an example network. The solutions show that the probability of detection and the number of possible locations are not remarkably affected by nodal demand uncertainties, but the source identification accuracy declines with nodal demand uncertainties.
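
    The core of the nondominated sorting used by NSGA-II is Pareto-dominance filtering, which can be sketched as follows. The objective pairs are made up for illustration, e.g. (overlap of possible detection times, 1 - detection probability), both minimized:

```python
# Pareto-dominance filtering, the first step of NSGA-II's nondominated
# sorting. Each candidate sensor layout is scored by two objectives to be
# minimized; the sample points below are invented for illustration.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the nondominated points (NSGA-II's first front)."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

layouts = [(0.20, 0.10), (0.10, 0.30), (0.30, 0.05), (0.25, 0.20), (0.15, 0.15)]
front = pareto_front(layouts)  # (0.25, 0.20) is dominated by (0.20, 0.10)
```

    The full algorithm repeats this sorting over successive fronts and adds crowding-distance selection; the dominance test above is the building block.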

  19. Challenges for sustainable resource use : Uncertainty, trade and climate policies

    NARCIS (Netherlands)

    Bretschger, L.; Smulders, Sjak A.

    2012-01-01

    We integrate new challenges to thinking about resource markets and sustainable resource use policies in a general framework. The challenges, emerging from six papers that JEEM publishes in a special issue, are (i) demand uncertainty and stockpiling, (ii) international trade and resource dependence,

  20. Exploring Water Pollution. Part II

    Science.gov (United States)

    Rillo, Thomas J.

    1975-01-01

    This is part two of a three part article related to the science activity of exploring environmental problems. Part one dealt with background information for the classroom teacher. Presented here is a suggested lesson plan on water pollution. Objectives, important concepts and instructional procedures are suggested. (EB)

  1. Part I: $\\beta$-delayed fission, laser spectroscopy and shape-coexistence studies with astatine beams; Part II: Delineating the island of deformation in the light gold isotopes by means of laser spectroscopy

    CERN Document Server

    Andreyev, Andrei

    2013-01-01

    Part I: $\\beta$-delayed fission, laser spectroscopy and shape-coexistence studies with astatine beams; Part II: Delineating the island of deformation in the light gold isotopes by means of laser spectroscopy

  2. Final environmental statement. Final addendum to Part II: Manufacture of floating nuclear power plants by Offshore Power Systems. DOCKET-STN--50-437

    International Nuclear Information System (INIS)

    1978-06-01

    This Addendum to Part II of the Final Environmental Statement related to manufacture of floating nuclear power plants by Offshore Power Systems (OPS), NUREG-0056, issued September 1976, was prepared by the U.S. Nuclear Regulatory Commission (NRC), Office of Nuclear Reactor Regulation. The staff's basic evaluation is presented in NUREG-0056. The current Addendum provides further consideration of a number of topics discussed in NUREG-0056, particularly additional consideration of shore zone siting at estuarine and ocean regions. This Summary and Conclusions recapitulates and is cumulative for Part II of the FES and the current Addendum. Augmentations to the Summary and Conclusions presented in Part II of the FES and arising from the evaluations contained in this Addendum are italicized

  3. Perseveration induces dissociative uncertainty in obsessive-compulsive disorder.

    Science.gov (United States)

    Giele, Catharina L; van den Hout, Marcel A; Engelhard, Iris M; Dek, Eliane C P; Toffolo, Marieke B J; Cath, Danielle C

    2016-09-01

    Obsessive compulsive (OC)-like perseveration paradoxically increases feelings of uncertainty. We studied whether the underlying mechanism linking perseveration and uncertainty is a reduced accessibility of meaning ('semantic satiation'). OCD patients (n = 24) and matched non-clinical controls (n = 24) repeated words 2 times (non-perseveration) or 20 times (perseveration). They decided whether this word was related to another target word. Speed of relatedness judgments and feelings of dissociative uncertainty were measured. The effects of real-life perseveration on dissociative uncertainty were tested in a smaller subsample of the OCD group (n = 9). Speed of relatedness judgments was not affected by perseveration. However, both groups reported more dissociative uncertainty after perseveration compared to non-perseveration, and this effect was stronger in OCD patients. Patients reported more dissociative uncertainty after 'clinical' perseveration compared to non-perseveration. Both parts of this study are limited by some methodological issues and a small sample size. Although the mechanism behind 'perseveration → uncertainty' is still unclear, results suggest that the effects of perseveration are counterproductive. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Modelling of atmospheric dispersion in a complex medium and associated uncertainties

    International Nuclear Information System (INIS)

    Demael, Emmanuel

    2007-01-01

    This research thesis addresses the numerical modelling of atmospheric dispersion. It aimed at validating the Mercure-Saturne tool used with a RANS (Reynolds Averaged Navier-Stokes) approach within the frame of an impact study or of an accidental scenario on a nuclear site while taking buildings and ground relief into account, at comparing the Mercure-Saturne model with a simpler and less costly (in terms of computation time) Gaussian tool (the ADMS software, Atmospheric Dispersion Modelling System), and at quantifying uncertainties related to the use of the Mercure-Saturne model. The first part introduces theoretical elements of atmospheric physics and of atmospheric dispersion in a boundary layer, and presents the Gaussian model and the Mercure-Saturne tool with its associated RANS approach. The second part reports the comparison of the Mercure-Saturne model with conventional Gaussian plume models. The third part reports the study of the atmospheric flow and dispersion about the Bugey nuclear site, based on a study performed in a wind tunnel. The fourth part reports the same kind of study for the Flamanville site. The fifth part reports the use of different approaches for the study of uncertainties in the case of the Bugey site: application of the Morris method (a screening method) and of the Monte Carlo method (quantification of the uncertainty and of the sensitivity of each uncertainty source) [fr]
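
    The Gaussian plume family of models used as the simpler comparison point (the class ADMS belongs to) reduces to a short closed-form expression. A minimal sketch with ground reflection and illustrative parameter values:

```python
import math

# Gaussian plume concentration with ground reflection -- the classic
# closed-form dispersion model. Parameter values are illustrative.
def gaussian_plume(Q, u, sigma_y, sigma_z, y, z, H):
    """Concentration for emission rate Q (g/s), wind speed u (m/s),
    dispersion coefficients sigma_y, sigma_z (m), crosswind offset y (m),
    receptor height z (m) and effective release height H (m)."""
    return (Q / (2.0 * math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2.0 * sigma_y**2))
            * (math.exp(-(z - H)**2 / (2.0 * sigma_z**2))
               + math.exp(-(z + H)**2 / (2.0 * sigma_z**2))))

# Ground-level concentration on the plume axis at some downwind distance,
# where sigma_y and sigma_z have grown to 20 m and 10 m respectively:
c_axis = gaussian_plume(1.0, 5.0, 20.0, 10.0, y=0.0, z=0.0, H=50.0)
```

    Unlike the CFD approach, this model cannot represent buildings or relief, which is precisely the trade-off the thesis quantifies.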

  5. Nuclear fuel technology - Determination of uranium in solutions, uranium hexafluoride and solids - Part 2: Iron(II) reduction/cerium(IV) oxidation titrimetric method

    International Nuclear Information System (INIS)

    2004-01-01

    This first edition of ISO 7097-1 together with ISO 7097-2:2004 cancels and replaces ISO 7097:1983, which has been technically revised, and ISO 9989:1996. ISO 7097 consists of the following parts, under the general title Nuclear fuel technology - Determination of uranium in solutions, uranium hexafluoride and solids: Part 1: Iron(II) reduction/potassium dichromate oxidation titrimetric method; Part 2: Iron(II) reduction/cerium(IV) oxidation titrimetric method. This part 2 of ISO 7097 describes procedures for the determination of uranium in solutions, uranium hexafluoride and solids. The procedures described in the two independent parts of this International Standard are similar: this part uses a titration with cerium(IV) and ISO 7097-1 uses a titration with potassium dichromate.
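
    The stoichiometry behind this class of redox titration is simple to sketch: the uranium is first reduced to U(IV) by iron(II), and the titration then oxidizes U(IV) to U(VI), a 2-electron change, while each Ce(IV) ion accepts one electron. The volumes and concentrations below are illustrative, not values from the standard:

```python
# Back-calculating uranium mass from a cerium(IV) titration (sketch).
# U(IV) -> U(VI) releases 2 electrons; each Ce(IV) accepts 1 electron,
# so moles of U = moles of Ce(IV) / 2. Numbers are illustrative only.
v_ce = 10.50e-3                      # titrant volume at the end point, L
c_ce = 0.1000                        # Ce(IV) concentration, mol/L
n_u = v_ce * c_ce / 2.0              # moles of uranium titrated
m_u = n_u * 238.03                   # grams of uranium (molar mass 238.03 g/mol)
```

    The potassium dichromate variant of ISO 7097-1 differs only in the electron bookkeeping of the titrant, not in this overall structure.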

  6. Nuclear fuel technology - Determination of uranium in solutions, uranium hexafluoride and solids - Part 1: Iron(II) reduction/potassium dichromate oxidation titrimetric method

    International Nuclear Information System (INIS)

    2004-01-01

    This first edition of ISO 7097-1 together with ISO 7097-2:2004 cancels and replaces ISO 7097:1983, which has been technically revised, and ISO 9989:1996. ISO 7097 consists of the following parts, under the general title Nuclear fuel technology - Determination of uranium in solutions, uranium hexafluoride and solids: Part 1: Iron(II) reduction/potassium dichromate oxidation titrimetric method; Part 2: Iron(II) reduction/cerium(IV) oxidation titrimetric method. This part 1 of ISO 7097 describes procedures for the determination of uranium in solutions, uranium hexafluoride and solids. The procedures described in the two independent parts of this International Standard are similar: this part uses a titration with potassium dichromate and ISO 7097-2 uses a titration with cerium(IV).

  7. Current antiviral drugs and their analysis in biological materials - Part II: Antivirals against hepatitis and HIV viruses.

    Science.gov (United States)

    Nováková, Lucie; Pavlík, Jakub; Chrenková, Lucia; Martinec, Ondřej; Červený, Lukáš

    2018-01-05

    This review is a Part II of the series aiming to provide comprehensive overview of currently used antiviral drugs and to show modern approaches to their analysis. While in the Part I antivirals against herpes viruses and antivirals against respiratory viruses were addressed, this part concerns antivirals against hepatitis viruses (B and C) and human immunodeficiency virus (HIV). Many novel antivirals against hepatitis C virus (HCV) and HIV have been introduced into the clinical practice over the last decade. The recent broadening portfolio of these groups of antivirals is reflected in increasing number of developed analytical methods required to meet the needs of clinical terrain. Part II summarizes the mechanisms of action of antivirals against hepatitis B virus (HBV), HCV, and HIV, their use in clinical practice, and analytical methods for individual classes. It also provides expert opinion on state of art in the field of bioanalysis of these drugs. Analytical methods reflect novelty of these chemical structures and use by far the most current approaches, such as simple and high-throughput sample preparation and fast separation, often by means of UHPLC-MS/MS. Proper method validation based on requirements of bioanalytical guidelines is an inherent part of the developed methods. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab

  9. Socializing Identity Through Practice: A Mixed Methods Approach to Family Medicine Resident Perspectives on Uncertainty.

    Science.gov (United States)

    Ledford, Christy J W; Cafferty, Lauren A; Seehusen, Dean A

    2015-01-01

    Uncertainty is a central theme in the practice of medicine and particularly primary care. This study explored how family medicine resident physicians react to uncertainty in their practice. This study incorporated a two-phase mixed methods approach, including semi-structured personal interviews (n=21) and longitudinal self-report surveys (n=21) with family medicine residents. Qualitative analysis showed that though residents described uncertainty as an implicit part of their identity, they still developed tactics to minimize or manage uncertainty in their practice. Residents described increasing comfort with uncertainty the longer they practiced and anticipated that growth continuing throughout their careers. Quantitative surveys showed that reactions to uncertainty were more positive over time; however, the difference was not statistically significant. Qualitative and quantitative results show that as family medicine residents practice medicine their perception of uncertainty changes. To reduce uncertainty, residents use relational information-seeking strategies. From a broader view of practice, residents describe uncertainty neutrally, asserting that uncertainty is simply part of the practice of family medicine.

  10. Critical mid-term uncertainties in long-term decarbonisation pathways

    International Nuclear Information System (INIS)

    Usher, Will; Strachan, Neil

    2012-01-01

    Over the next decade, large energy investments are required in the UK to meet growing energy service demands and legally binding emission targets under a pioneering policy agenda. These are necessary despite deep mid-term (2025–2030) uncertainties over which national policy makers have little control. We investigate the effect of two critical mid-term uncertainties on optimal near-term investment decisions using a two-stage stochastic energy system model. The results show that where future fossil fuel prices are uncertain: (i) the near term hedging strategy to 2030 differs from any one deterministic fuel price scenario and is structurally dissimilar to a simple ‘average’ of the deterministic scenarios, and (ii) multiple recourse strategies from 2030 are perturbed by path dependencies caused by hedging investments. Evaluating the uncertainty under a decarbonisation agenda shows that fossil fuel price uncertainty is very expensive at around £20 billion. The addition of novel mitigation options reduces the value of fossil fuel price uncertainty to £11 billion. Uncertain biomass import availability shows a much lower value of uncertainty at £300 million. This paper reveals the complex relationship between the flexibility of the energy system and mitigating the costs of uncertainty due to the path-dependencies caused by the long-life times of both infrastructures and generation technologies. - Highlights: ► Critical mid-term uncertainties affect near-term investments in UK energy system. ► Deterministic scenarios give conflicting near-term actions. ► Stochastic scenarios give one near-term hedging strategy. ► Technologies exhibit path dependency or flexibility. ► Fossil fuel price uncertainty is very expensive, biomass availability uncertainty is not.
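
    The structure of such a two-stage stochastic model can be illustrated with a toy capacity problem: choose a near-term investment before the fuel-price scenario is known, then pay recourse costs for unmet demand. All prices, demands and probabilities below are invented for illustration and bear no relation to the UK model:

```python
# Toy two-stage stochastic program illustrating a hedging strategy.
# Stage 1: choose capacity x now; stage 2: the scenario is revealed and
# any shortfall is bought at the scenario's fuel price. Numbers invented.
prob = {"low": 0.5, "high": 0.5}     # scenario probabilities
price = {"low": 1.0, "high": 4.0}    # recourse (fuel) price per unit
demand = {"low": 5, "high": 10}      # demand realized in each scenario
invest_cost = 2.2                    # cost per unit of near-term capacity

def total_cost(x, s):
    return invest_cost * x + price[s] * max(0, demand[s] - x)

def expected_cost(x):
    return sum(p * total_cost(x, s) for s, p in prob.items())

plan_low = min(range(11), key=lambda x: total_cost(x, "low"))    # -> 0
plan_high = min(range(11), key=lambda x: total_cost(x, "high"))  # -> 10
hedge = min(range(11), key=expected_cost)                        # -> 5
```

    The hedging strategy (x = 5) coincides with neither deterministic plan (x = 0 under low prices, x = 10 under high prices), mirroring the paper's finding that the near-term hedge differs from any single deterministic scenario.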

  11. Uncertainty evaluation methods for waste package performance assessment

    International Nuclear Information System (INIS)

    Wu, Y.T.; Nair, P.K.; Journel, A.G.; Abramson, L.R.

    1991-01-01

    This report identifies and investigates methodologies to deal with uncertainties in assessing high-level nuclear waste package performance. Four uncertainty evaluation methods (probability-distribution approach, bounding approach, expert judgment, and sensitivity analysis) are suggested as the elements of a methodology that, without either diminishing or enhancing the input uncertainties, can evaluate performance uncertainty. Such a methodology can also help identify critical inputs as a guide to reducing uncertainty so as to provide reasonable assurance that the risk objectives are met. This report examines the current qualitative waste containment regulation and shows how, in conjunction with the identified uncertainty evaluation methodology, a framework for a quantitative probability-based rule can be developed that takes account of the uncertainties. Current US Nuclear Regulatory Commission (NRC) regulation requires that the waste packages provide "substantially complete containment" (SCC) during the containment period. The term "SCC" is ambiguous and subject to interpretation. This report, together with an accompanying report that describes the technical considerations that must be addressed to satisfy high-level waste containment requirements, provides a basis for a third report to develop recommendations for regulatory uncertainty reduction in the "containment" requirement of 10 CFR Part 60. 25 refs., 3 figs., 2 tabs

  12. Working and Learning in Times of Uncertainty

    DEFF Research Database (Denmark)

    This book analyses the challenges of globalisation and uncertainty impacting on working and learning at individual, organisational and societal levels. Each of the contributions addresses two overall questions: How is working and learning affected by uncertainty and globalisation? And, in what ways...... do individuals, organisations, political actors and education systems respond to these challenges? Part 1 focuses on the micro level of working and learning for understanding the learning processes from an individual point of view by reflecting on learners’ needs and situations at work and in school......). Finally, Part 3 addresses the macro level of working and learning by analysing how to govern, structure and organise vocational, professional and adult education at the boundaries of work, education and policy making....

  13. Calculation of uncertainties associated to environmental radioactivity measurements and their functions. Practical Procedure II

    International Nuclear Information System (INIS)

    Gascon, C.; Anton, M.P.

    1997-01-01

    Environmental radioactivity measurements are mainly affected by counting uncertainties. In this report the uncertainties associated with certain functions related to activity concentration calculations are determined. Some practical exercises are presented to calculate the uncertainties associated with: a) the chemical recovery of a radiochemical separation when employing tracers (i.e. Pu and Am purification from a sediment sample); b) the indirect determination of a mother radionuclide through one of its daughters (i.e. 210Pb quantification following the build-up of activity of its daughter 210Po); c) the time span from the last separation date of one of the components of a disintegration chain (i.e. the date of the last Am purification in nuclear weapons material, following 241Am and 241Pu measurements). Calculations concerning examples b) and c) are based on the Bateman equations, which govern radioactive equilibria. Although the exercises presented here are performed with specific radionuclides, they can be applied as generic procedures for other alpha-emitting radioelements.
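
    The build-up calculation of exercise b) follows from the two-member Bateman solution with the daughter initially absent. A minimal sketch using standard literature half-lives (activities normalized; not the report's worked numbers):

```python
import math

# Bateman solution for 210Po growing in from 210Pb after a complete Po
# separation (daughter initially absent). Half-lives are standard
# literature values; activities are normalized.
lam_pb = math.log(2) / (22.3 * 365.25)   # 210Pb decay constant, 1/day
lam_po = math.log(2) / 138.4             # 210Po decay constant, 1/day

def po_activity(a_pb0, t):
    """210Po activity after t days, for an initial 210Pb activity a_pb0."""
    return (a_pb0 * lam_po / (lam_po - lam_pb)
            * (math.exp(-lam_pb * t) - math.exp(-lam_po * t)))
```

    After about two years of ingrowth the 210Po activity reaches roughly 93% of the initial 210Pb activity, which is why such build-up measurements permit the indirect determination of the parent.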

  14. Comparison of two uncertainty dressing methods: SAD VS DAD

    Science.gov (United States)

    Chardon, Jérémy; Mathevet, Thibault; Le-Lay, Matthieu; Gailhard, Joël

    2014-05-01

    Hydrological Ensemble Prediction Systems (HEPSs) allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise behind hydrological forecasts. An operational HEPS has been developed at EDF (French producer of electricity) since 2008 and has been used since 2010 on about a hundred watersheds in France. Depending on the hydro-meteorological situation, streamflow forecasts can be issued on a daily basis and are used to help dam management operations during floods or dam works within the river. A part of this HEPS is characterized by streamflow ensemble post-processing, where considerable human expertise is solicited. The aim of post-processing methods is to achieve better overall performance by dressing hydrological ensemble forecasts with hydrological model uncertainties. The present study compares two post-processing methods, both based on a logarithmic representation of the residual distribution of the Rainfall-Runoff (RR) model under "perfect" forcing forecasts - i.e. forecasts with observed meteorological variables as inputs. The only difference between the two post-processing methods lies in the sampling of the perfect forcing forecasts for the estimation of the residual statistics: (i) the first method, referred to here as the Statistical Analogy Dressing (SAD) model and used in the operational HEPS, estimates the statistics of the residuals beforehand, by streamflow sub-samples of quantile class and lead time, since RR model residuals are not homoscedastic; (ii) an alternative method, referred to as the Dynamical Analogy Dressing (DAD) model, estimates the statistics of the residuals using the N most similar perfect forcing forecasts, where the selection of these N forecasts is based on streamflow range and variation. On a set of 20 watersheds used for operational forecasts, both models were evaluated with perfect forcing forecasts and with ensemble forecasts. Results show that both approaches ensure a good post-processing of
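The quantile-class idea behind the SAD variant can be sketched in a few lines. Everything below (the synthetic hindcasts, the log-normal residual model, the three flow classes) is an invented illustration, not EDF's operational implementation:

```python
import math
import random
import statistics

random.seed(5)

# Synthetic "perfect forcing" hindcasts: (simulated flow, observed flow) pairs.
# Residuals are handled in log space because RR-model errors are heteroscedastic.
sims = [random.lognormvariate(3.0, 0.8) for _ in range(3000)]
pairs = [(q, q * math.exp(random.gauss(0.0, 0.2))) for q in sims]

# SAD-style dressing: residual spread is estimated per streamflow-quantile class
flows = sorted(sims)
edges = [flows[len(flows) // 3], flows[2 * len(flows) // 3]]

def flow_class(q):
    return sum(q > e for e in edges)  # class index 0, 1 or 2

class_sigma = {
    c: statistics.stdev(math.log(obs / sim)
                        for sim, obs in pairs if flow_class(sim) == c)
    for c in range(3)
}

def dress(sim_flow, n_members=100):
    """Dress a deterministic forecast with class-conditional log residuals."""
    sigma = class_sigma[flow_class(sim_flow)]
    return [sim_flow * math.exp(random.gauss(0.0, sigma))
            for _ in range(n_members)]

members = dress(25.0)
```

The DAD variant would replace the fixed quantile classes with the N hindcasts most similar to the current forecast in streamflow range and variation.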

  15. Optimization under Uncertainty of Site-Specific Turbine Configurations: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian; Dykes, Katherine; Graf, Peter; Zahle, Frederik

    2016-11-01

    Uncertainty affects many aspects of wind energy plant performance and cost. In this study, we explore opportunities for site-specific turbine configuration optimization that accounts for uncertainty in the wind resource. As a demonstration, a simple empirical model for wind plant cost of energy is used in an optimization under uncertainty to examine how different risk appetites affect the optimal selection of a turbine configuration for sites of different wind resource profiles. If there is unusually high uncertainty in the site wind resource, the optimal turbine configuration diverges from the deterministic case and a generally more conservative design is obtained with increasing risk aversion on the part of the designer.
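The role of risk appetite can be illustrated with a mean-plus-λ·σ robust objective over sampled wind resources. The cost model and the distributions below are toy assumptions, not the empirical plant-cost model used in the study:

```python
import random
import statistics

random.seed(1)

# sampled site wind resource: uncertain long-term mean wind speed [m/s]
wind_samples = [random.gauss(7.5, 1.2) for _ in range(2000)]

def cost_of_energy(diameter, v):
    # toy model: turbine cost grows super-quadratically with rotor diameter,
    # captured energy scales with rotor area and wind speed cubed
    cost = 1.0 + 1.0e-4 * diameter ** 2.4
    energy = 1.0e-4 * diameter ** 2 * max(v, 0.1) ** 3
    return cost / energy

def robust_objective(diameter, risk_aversion):
    coe = [cost_of_energy(diameter, v) for v in wind_samples]
    return statistics.mean(coe) + risk_aversion * statistics.stdev(coe)

# select a rotor diameter [m] from a candidate grid for a given risk appetite
best = min(range(60, 141, 5),
           key=lambda d: robust_objective(d, risk_aversion=1.0))
```

Increasing `risk_aversion` penalises configurations whose cost of energy varies strongly across the sampled resource, which is the mechanism by which a more conservative design emerges.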

  16. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

    Treatment of uncertainty, or in other words reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency-based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have looked at particularly ingenious ways of reasoning in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. Those methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information required by quantitative methods. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management (that is, the recognition and evaluation of uncertainties associated with PA, followed by the planning and implementation of measures to reduce them) is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem

  17. Classification and moral evaluation of uncertainties in engineering modeling.

    Science.gov (United States)

    Murphy, Colleen; Gardoni, Paolo; Harris, Charles E

    2011-09-01

    Engineers must deal with risks and uncertainties as a part of their professional work and, in particular, uncertainties are inherent to engineering models. Models play a central role in engineering. Models often represent an abstract and idealized version of the mathematical properties of a target. Using models, engineers can investigate and acquire understanding of how an object or phenomenon will perform under specified conditions. This paper defines the different stages of the modeling process in engineering, classifies the various sources of uncertainty that arise in each stage, and discusses the categories into which these uncertainties fall. The paper then considers the way uncertainty and modeling are approached in science and the criteria for evaluating scientific hypotheses, in order to highlight the very different criteria appropriate for the development of models and the treatment of the inherent uncertainties in engineering. Finally, the paper puts forward nine guidelines for the treatment of uncertainty in engineering modeling.

  18. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts

  19. Uncertainty in predictions of oil spill trajectories in a coastal zone

    Science.gov (United States)

    Sebastião, P.; Guedes Soares, C.

    2006-12-01

    A method is introduced to determine the uncertainties in predictions of oil spill trajectories using a classic oil spill model. The method considers the output of the oil spill model as a function of random variables, namely the input parameters, and calculates the standard deviation of the output results, which provides a measure of the uncertainty of the model resulting from the uncertainties of the input parameters. In addition to the single trajectory calculated by the oil spill model using the mean values of the parameters, a band of trajectories can be defined when several simulations are carried out taking the uncertainties of the input parameters into account. This band defines envelopes of the trajectories that are likely to be followed by the spill given the uncertainties of the input. The method was applied to an oil spill that occurred in 1989 near Sines, on the southwestern coast of Portugal. The model represented well the distinction between a wind-driven part that remained offshore and a tide-driven part that went ashore. For both parts, the method defined two trajectory envelopes, one calculated exclusively with the wind fields and the other using wind and tidal currents. In both cases a reasonable approximation to the observed results was obtained. The envelope of likely trajectories obtained with the uncertainty modelling proved to give a better interpretation of the trajectories simulated by the oil spill model.
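The trajectory-band construction amounts to re-running the advection model with sampled inputs and taking the envelope of the results. The 1-D drift model and the parameter distributions below are stand-in assumptions, not the classic spill model used in the paper:

```python
import random

random.seed(0)

def trajectory(wind_speed, drift_factor, current, hours=24):
    """Toy 1-D advection: spill displacement [m] from wind drift plus current."""
    x = 0.0
    for _ in range(hours):
        x += (drift_factor * wind_speed + current) * 3600.0
    return x

# central trajectory from the mean parameter values ...
mean_x = trajectory(10.0, 0.03, 0.1)

# ... and an envelope from sampling the uncertain inputs
samples = [trajectory(random.gauss(10.0, 2.0),    # wind speed [m/s]
                      random.gauss(0.03, 0.005),  # wind drift factor
                      random.gauss(0.1, 0.05))    # current speed [m/s]
           for _ in range(1000)]
envelope = (min(samples), max(samples))
```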

  20. Uncertainty analysis for Ulysses safety evaluation report

    International Nuclear Information System (INIS)

    Frank, M.V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermal Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties in the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low
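Accounting for "both variability and uncertainty" is classically done with a double-loop (nested) Monte Carlo: an outer loop over epistemically uncertain distribution parameters and an inner loop over aleatory outcomes. The distributions and threshold below are invented for illustration and bear no relation to the actual Ulysses numbers:

```python
import random
import statistics

random.seed(11)

outer_means = []
for _ in range(200):                  # epistemic loop: uncertain parameters
    mu = random.uniform(-8.0, -6.0)   # log-mean of the release fraction
    sigma = random.uniform(0.3, 0.8)  # log-spread of the release fraction
    inner = [random.lognormvariate(mu, sigma)  # aleatory loop: outcomes
             for _ in range(500)]
    outer_means.append(statistics.mean(inner))

# confidence (over the epistemic loop) that the mean release stays low
threshold = 1.0e-2
confidence = sum(m < threshold for m in outer_means) / len(outer_means)
```

The fraction of outer-loop results below the threshold is the kind of "high confidence that the risk is low" statement quoted in the abstract.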

  1. II: Through the Western Part of the City: Charlottenburg

    Science.gov (United States)

    Hoffmann, Dieter

    Until 1920 the city we now call Berlin was a collection of independent towns and villages — among them Charlottenburg, which was one of the most important and was the proud sister of Berlin, Prussia’s and Germany’s capital, where the wealthy and innovative bourgeoisie lived. Werner von Siemens, Germany’s pioneer in the modern electrical industry, was a prime example of that elite. His castle-like villa was located not far from today’s Ernst-Reuter-Platz at Otto-Suhr-Allee 10-16, and important parts of his enterprise expanded into the “meadows outside of Charlottenburg” during the second half of the 19th century. It was no accident that the efforts to unite Berlin’s two colleges for trade and construction (both founded around 1800) led to the foundation of a modern Technical College in Charlottenburg in 1879, today’s Technical University of Berlin. Its magnificent main building (figure 1), which was opened in 1882 by the German Emperor, was an expression of the great self-confidence of this new institution of higher learning and of Charlottenburg’s bourgeoisie. Although large parts of the building were destroyed by bombs during World War II, you can still get an impression of its monumentality from what survived at number 135 Strasse des 17. Juni.

  2. Estimating Coastal Digital Elevation Model (DEM) Uncertainty

    Science.gov (United States)

    Amante, C.; Mesick, S.

    2017-12-01

    Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.
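Cell-level uncertainty surfaces are often grounded in split-sample validation: hold out a subset of the source soundings, interpolate from the rest, and treat the residual spread as an estimate of the gridding error. The synthetic seabed and the inverse-distance interpolator below are assumptions for illustration, not the NCEI methodology itself:

```python
import math
import random
import statistics

random.seed(9)

# synthetic soundings on a planar seabed (x, y in km; depth in m)
points = [(random.uniform(0.0, 10.0), random.uniform(0.0, 10.0))
          for _ in range(400)]
depth = {p: -20.0 + 1.5 * p[0] + 0.5 * p[1] for p in points}

def idw(x, y, known, power=2.0):
    """Inverse-distance-weighted interpolation from known soundings."""
    num = den = 0.0
    for (px, py), z in known.items():
        d2 = (x - px) ** 2 + (y - py) ** 2
        w = 1.0 / (d2 ** (power / 2.0) + 1e-12)
        num += w * z
        den += w
    return num / den

held_out = points[:50]                      # withheld for validation
known = {p: depth[p] for p in points[50:]}  # used for gridding
residuals = [idw(p[0], p[1], known) - depth[p] for p in held_out]
rmse = math.sqrt(statistics.fmean(r * r for r in residuals))
```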

  3. Uncertainties in extreme precipitation under climate change conditions

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia

    of adaptation strategies, but these changes are subject to uncertainties. The focus of this PhD thesis is the quantification of uncertainties in changes in extreme precipitation. It addresses two of the main sources of uncertainty in climate change impact studies: regional climate models (RCMs) and statistical...... downscaling methods (SDMs). RCMs provide information on climate change at the regional scale. SDMs are used to bias-correct and downscale the outputs of the RCMs to the local scale of interest in adaptation strategies. In the first part of the study, a multi-model ensemble of RCMs from the European ENSEMBLES...... project was used to quantify the uncertainty in RCM projections over Denmark. Three aspects of the RCMs relevant for the uncertainty quantification were first identified and investigated. These are: the interdependency of the RCMs; the performance in current climate; and the change in the performance...

  4. Music in the exercise domain: a review and synthesis (Part II).

    Science.gov (United States)

    Karageorghis, Costas I; Priest, David-Lee

    2012-03-01

    Since a 1997 review by Karageorghis and Terry, which highlighted the state of knowledge and methodological weaknesses, the number of studies investigating musical reactivity in relation to exercise has swelled considerably. In this two-part review paper, the development of conceptual approaches and mechanisms underlying the effects of music are explicated (Part I), followed by a critical review and synthesis of empirical work (spread over Parts I and II). Pre-task music has been shown to optimise arousal, facilitate task-relevant imagery and improve performance in simple motoric tasks. During repetitive, endurance-type activities, self-selected, motivational and stimulative music has been shown to enhance affect, reduce ratings of perceived exertion, improve energy efficiency and lead to increased work output. There is evidence to suggest that carefully selected music can promote ergogenic and psychological benefits during high-intensity exercise, although it appears to be ineffective in reducing perceptions of exertion beyond the anaerobic threshold. The effects of music appear to be at their most potent when it is used to accompany self-paced exercise or in externally valid conditions. When selected according to its motivational qualities, the positive impact of music on both psychological state and performance is magnified. Guidelines are provided for future research and exercise practitioners.

  5. Shrinkage calibration method for μPIM manufactured parts

    DEFF Research Database (Denmark)

    Quagliotti, Danilo; Tosello, Guido; Salaga, J.

    2016-01-01

    Five green and five sintered parts of a micro mechanical component, produced by micro powder injection moulding, were measured using an optical coordinate measuring machine. The aim was to establish a method for quality assurance of the final produced parts. Initially, the so-called “green” parts...... were compared with the sintered parts (final products) by calculating the percentage of shrinkage after sintering. Subsequently, the expanded uncertainty of the measured dimensions was evaluated for each single part as well as for the overall parts. Finally, the estimated uncertainty for the shrinkage...... was evaluated by propagating the expanded uncertainty previously stated and considering the green and sintered parts as correlated. Results showed that the proposed method can be effective in stating tolerances if it is assumed that the variability of the dimensions induced by the shrinkage equals the propagated expanded...
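The propagation step can be sketched with a first-order, GUM-style combination for the linear shrinkage of a single dimension; the numerical values and the correlation coefficient below are illustrative assumptions:

```python
import math

def shrinkage_with_uncertainty(green, u_green, sint, u_sint, r=0.0):
    """Linear shrinkage k = (green - sint)/green with first-order (GUM-style)
    uncertainty propagation; r is the correlation between the two measurements."""
    k = (green - sint) / green
    dk_dgreen = sint / green ** 2   # sensitivity to the green dimension
    dk_dsint = -1.0 / green         # sensitivity to the sintered dimension
    var_k = ((dk_dgreen * u_green) ** 2 + (dk_dsint * u_sint) ** 2
             + 2.0 * r * dk_dgreen * dk_dsint * u_green * u_sint)
    return k, math.sqrt(var_k)

# hypothetical dimensions [mm] with standard uncertainties, correlated (r=0.8)
k, u_k = shrinkage_with_uncertainty(10.00, 0.01, 8.50, 0.01, r=0.8)
k0, u_k0 = shrinkage_with_uncertainty(10.00, 0.01, 8.50, 0.01, r=0.0)
```

Because the two sensitivity coefficients have opposite signs, a positive correlation between the green and sintered measurements reduces the propagated uncertainty of the shrinkage, which is why treating the parts as correlated matters.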

  6. Uncertainties in segmentation and their visualisation

    NARCIS (Netherlands)

    Lucieer, Arko

    2004-01-01

    This thesis focuses on uncertainties in remotely sensed image segmentation and their visualisation. The first part describes a visualisation tool, allowing interaction with the parameters of a fuzzy classification algorithm by visually adjusting fuzzy membership functions of classes in a 3D feature

  7. Summary from the epistemic uncertainty workshop: consensus amid diversity

    International Nuclear Information System (INIS)

    Ferson, Scott; Joslyn, Cliff A.; Helton, Jon C.; Oberkampf, William L.; Sentz, Kari

    2004-01-01

    The 'Epistemic Uncertainty Workshop' sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6-7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster-Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of

  8. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs; (ii) generation of samples from uncertain analysis inputs; (iii) propagation of sampled inputs through an analysis; (iv) presentation of uncertainty analysis results; and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top-down coefficient of concordance, and variance decomposition

  9. Detecting quantum entanglement. Entanglement witnesses and uncertainty relations

    International Nuclear Information System (INIS)

    Guehne, O.

    2004-01-01

    This thesis deals with methods for the detection of entanglement. After recalling some facts and definitions concerning entanglement and separability, we investigate two methods for the detection of entanglement. In the first part of this thesis we consider so-called entanglement witnesses, mainly in view of the detection of multipartite entanglement. Entanglement witnesses are observables for which a negative expectation value indicates entanglement. We first present a simple method to construct these witnesses. Since witnesses are nonlocal observables, they are not easy to measure in a real experiment. However, as we will show, one can circumvent this problem by decomposing the witness into several local observables which can be measured separately. We calculate the local decompositions for several interesting witnesses for two, three and four qubits. Local decompositions can be optimized in the number of measurement settings needed for an experimental implementation. We present a method to prove that a given local decomposition is optimal and use it to discuss the optimality of our decompositions. Then we present another method of designing witnesses which are by construction measurable with local measurements. Finally, we briefly report on experiments where some of the witnesses derived in this part have been used to detect three- and four-partite entanglement of polarized photons. The second part of this thesis deals with separability criteria written in terms of uncertainty relations. There are two different formulations of uncertainty relations, since one can measure the uncertainty of an observable by its variance as well as by entropic quantities. We show that both formulations are useful tools for the derivation of separability criteria for finite-dimensional systems and investigate the resulting criteria. Our results in this part also exhibit some more fundamental properties of entanglement: We show how known separability criteria for

  10. The Uncertainty Test for the MAAP Computer Code

    International Nuclear Information System (INIS)

    Park, S. H.; Song, Y. M.; Park, S. Y.; Ahn, K. I.; Kim, K. R.; Lee, Y. J.

    2008-01-01

    After the Three Mile Island Unit 2 (TMI-2) and Chernobyl accidents, severe accident safety issues have been treated from various perspectives. A major topic of our research is the level 2 PSA. The main difficulty in expanding the level 2 PSA as a risk-informed activity is uncertainty. Earlier efforts emphasized improving the quality of internal-event PSAs, but too little has been done to reduce the phenomenological uncertainty in the level 2 PSA. In our country, the uncertainty in level 2 PSA models remains high, and a model that reduces this uncertainty needs to be established. Since we have little experience with uncertainty assessment technology, assessment systems have so far depended on those of other countries, where severe accident simulators are implemented at the hardware level; in our case, the basic functions can be implemented at the software level. Against this background, similar systems at home and abroad, such as UQM and MELCOR, were surveyed. Drawing on these examples, the SAUNA (Severe Accident UNcertainty Analysis) system is being developed in our project to assess and reduce the uncertainty in the level 2 PSA. The MAAP code was selected for analyzing the uncertainty in a severe accident

  11. Uncertainty evaluation of data and information fusion within the context of the decision loop

    CSIR Research Space (South Africa)

    De Villiers, J Pieter

    2016-07-01

    Full Text Available Here, the uncertainties in the combination and decision parts of the information flow are considered. The objective of this paper is to make explicit how uncertainties that arise during design combine with uncertainties during runtime, as well...

  12. Evaluation of uncertainties in benefit-cost studies of electrical power plants. II. Development and application of a procedure for quantifying environmental uncertainties of a nuclear power plant. Final report

    International Nuclear Information System (INIS)

    Sullivan, W.G.

    1977-07-01

    Steam-electric generation plants are evaluated on a benefit-cost basis. Non-economic factors in the development and application of a procedure for quantifying environmental uncertainties of a nuclear power plant are discussed. By comparing monetary costs of a particular power plant assessed in Part 1 with non-monetary values arrived at in Part 2 and using an evaluation procedure developed in this study, a proposed power plant can be selected as a preferred alternative. This procedure enables policymakers to identify the incremental advantages and disadvantages of different power plants in view of their geographic locations. The report presents the evaluation procedure on a task by task basis and shows how it can be applied to a particular power plant. Because of the lack of objective data, it draws heavily on subjectively-derived inputs of individuals who are knowledgeable about the plant being investigated. An abbreviated study at another power plant demonstrated the transferability of the general evaluation procedure. Included in the appendices are techniques for developing scoring functions and a user's manual for the Fortran IV Program

  13. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design-Part I. Model development

    Energy Technology Data Exchange (ETDEWEB)

    He, L., E-mail: li.he@ryerson.ca [Department of Civil Engineering, Faculty of Engineering, Architecture and Science, Ryerson University, 350 Victoria Street, Toronto, Ontario, M5B 2K3 (Canada); Huang, G.H. [Environmental Systems Engineering Program, Faculty of Engineering, University of Regina, Regina, Saskatchewan, S4S 0A2 (Canada); College of Urban Environmental Sciences, Peking University, Beijing 100871 (China); Lu, H.W. [Environmental Systems Engineering Program, Faculty of Engineering, University of Regina, Regina, Saskatchewan, S4S 0A2 (Canada)

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can usually yield optimal solutions differing from the 'true' ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM) and the associated solution method for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt different from the previous modeling efforts. The previous ones focused on addressing uncertainty in physical parameters (i.e. soil porosity) while this one aims to deal with uncertainty in mathematical simulator (arising from model residuals). Compared to the existing modeling approaches (i.e. only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering confidence level of optimal remediation strategies to system designers, and reducing computational cost in optimization processes.

  14. Uncertainty visualisation in the Model Web

    Science.gov (United States)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    : (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps allow a simpler visualisation by separating value and uncertainty maps for non-experts and a first overview. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistic plots and time series in different windows and sliders to interactively move through time, space and uncertainty (thresholds).

  15. Impact of petrophysical uncertainty on Bayesian hydrogeophysical inversion and model selection

    Science.gov (United States)

    Brunetti, Carlotta; Linde, Niklas

    2018-01-01

    Quantitative hydrogeophysical studies rely heavily on petrophysical relationships that link geophysical properties to hydrogeological properties and state variables. Coupled inversion studies are frequently based on the questionable assumption that these relationships are perfect (i.e., no scatter). Using synthetic examples and crosshole ground-penetrating radar (GPR) data from the South Oyster Bacterial Transport Site in Virginia, USA, we investigate the impact of spatially-correlated petrophysical uncertainty on inferred posterior porosity and hydraulic conductivity distributions and on Bayes factors used in Bayesian model selection. Our study shows that accounting for petrophysical uncertainty in the inversion (I) decreases bias of the inferred variance of hydrogeological subsurface properties, (II) provides more realistic uncertainty assessment and (III) reduces the overconfidence in the ability of geophysical data to falsify conceptual hydrogeological models.

  16. Evaluating Prognostics Performance for Algorithms Incorporating Uncertainty Estimates

    Data.gov (United States)

    National Aeronautics and Space Administration — Uncertainty Representation and Management (URM) are an integral part of the prognostic system development. As capabilities of prediction algorithms evolve, research...

  17. Uncertainty in the environmental modelling process – A framework and guidance

    NARCIS (Netherlands)

    Refsgaard, J.C.; van der Sluijs, J.P.|info:eu-repo/dai/nl/073427489; Hojberg, A.L.; Vanrolleghem, P.

    2007-01-01

    A terminology and typology of uncertainty is presented together with a framework for the modelling process, its interaction with the broader water management process and the role of uncertainty at different stages in the modelling processes. Brief reviews have been made of 14 different (partly

  18. Propagation of nuclear data uncertainties for fusion power measurements

    Directory of Open Access Journals (Sweden)

    Sjöstrand Henrik

    2017-01-01

    Full Text Available Neutron measurements using neutron activation systems are an essential part of the diagnostic system at large fusion machines such as JET and ITER. Nuclear data is used to infer the neutron yield. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are not fully taken into account in uncertainty analysis for neutron yield calibrations using activation foils. This paper investigates the neutron yield uncertainty due to nuclear data using the so-called Total Monte Carlo Method. The work is performed using a detailed MCNP model of the JET fusion machine; the uncertainties due to the cross-sections and angular distributions in JET structural materials, as well as the activation cross-sections in the activation foils, are analysed. It is found that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.
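The Total Monte Carlo idea reduces to: sample the nuclear data from its uncertainty distribution, repeat the whole calculation, and read the spread off the results. The sketch below collapses the MCNP transport step into a single cross-section division; the foil activity, nominal cross-section and 5 % uncertainty are invented numbers:

```python
import random
import statistics

random.seed(7)

# Total Monte Carlo, collapsed to its core: sample the uncertain nuclear data,
# repeat the (here trivial) yield calculation, and read off the spread.
measured_activity = 5.0e4  # foil activity [Bq] (fixed, invented number)
nominal_xs = 1.0e-28       # activation cross-section [m^2] (invented)
xs_rel_unc = 0.05          # assumed 5 % relative cross-section uncertainty

yields = []
for _ in range(2000):
    xs = random.gauss(nominal_xs, xs_rel_unc * nominal_xs)
    yields.append(measured_activity / xs)  # inferred neutron yield per sample

rel_yield_unc = statistics.stdev(yields) / statistics.mean(yields)
```

In the actual study each sample triggers a full transport calculation with perturbed nuclear-data files; the principle of reading the nuclear-data uncertainty off the spread of repeated calculations is the same.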

  19. Quantifying uncertainties in precipitation: a case study from Greece

    Directory of Open Access Journals (Sweden)

    C. Anagnostopoulou

    2008-04-01

    Full Text Available The main objective of the present study was the examination and the quantification of the uncertainties in the precipitation time series over the Greek area, for a 42-year time period. The uncertainty index applied to the rainfall data is a combination (total) of the departures of the rainfall season length, of the median data of the accumulated percentages and of the total amounts of rainfall. Results of the study indicated that all the stations are characterized, on an average basis, by medium to high uncertainty. The stations that presented an increasing rainfall uncertainty were the ones located mainly in the continental parts of the study region. From the temporal analysis of the uncertainty index, it was demonstrated that the greatest percentage of the years, across all station time series, was characterized by low to high uncertainty (intermediate categories of the index). Most of the results of the uncertainty index for the Greek region are similar to the corresponding results of various stations all over the European region.

  20. Study of the Effect of Temporal Sampling Frequency on DSCOVR Observations Using the GEOS-5 Nature Run Results. Part II; Cloud Coverage

    Science.gov (United States)

    Holdaway, Daniel; Yang, Yuekui

    2016-01-01

    This is the second part of a study on how temporal sampling frequency affects satellite retrievals in support of the Deep Space Climate Observatory (DSCOVR) mission. Continuing from Part 1, which looked at Earth's radiation budget, this paper presents the effect of sampling frequency on DSCOVR-derived cloud fraction. The output from NASA's Goddard Earth Observing System version 5 (GEOS-5) Nature Run is used as the "truth". The effect of temporal resolution on potential DSCOVR observations is assessed by subsampling the full Nature Run data. A set of metrics, including uncertainty and absolute error in the subsampled time series, correlation between the original and the subsamples, and Fourier analysis have been used for this study. Results show that, for a given sampling frequency, the uncertainties in the annual mean cloud fraction of the sunlit half of the Earth are larger over land than over ocean. Analysis of correlation coefficients between the subsamples and the original time series demonstrates that even though sampling at certain longer time intervals may not increase the uncertainty in the mean, the subsampled time series drifts progressively further from the "truth" as the sampling interval grows. Fourier analysis shows that the simulated DSCOVR cloud fraction has underlying periodical features at certain time intervals, such as 8, 12, and 24 h. If the data are subsampled at these intervals, the uncertainties in the mean cloud fraction are higher. These results provide helpful insights for the DSCOVR temporal sampling strategy.
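The basic subsampling experiment behind results like these — compare the mean of a series sampled at a given interval against the mean of the full series — can be illustrated with a toy diurnal cycle. A hypothetical Python sketch (the series and numbers are invented, not GEOS-5 output); it shows why sampling at exactly the period of an underlying cycle (here 24 h) inflates the error in the mean, while an interval that does not divide the period lets the phases average out:

```python
import math
import statistics

def subsample_mean_error(series, interval, offset=0):
    """Mean of a subsampled series and its absolute error
    relative to the full-series mean."""
    sub = series[offset::interval]
    full_mean = statistics.fmean(series)
    sub_mean = statistics.fmean(sub)
    return sub_mean, abs(sub_mean - full_mean)

# Hypothetical hourly cloud-fraction-like series with a 24 h diurnal cycle.
series = [0.5 + 0.2 * math.sin(2 * math.pi * h / 24) for h in range(24 * 30)]

# Sampling every 24 h aliases the diurnal cycle: every sample hits the same
# phase, so the subsampled mean inherits that phase's bias.
_, err_24h = subsample_mean_error(series, 24, offset=6)  # always at the peak
_, err_5h = subsample_mean_error(series, 5)              # phases average out
```

Here `err_24h` recovers the full 0.2 amplitude of the cycle as bias, while `err_5h` is essentially zero.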

  1. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design--part I. Model development.

    Science.gov (United States)

    He, L; Huang, G H; Lu, H W

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can usually yield optimal solutions differing from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM) and the associated solution method for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This departs from previous modeling efforts, which focused on uncertainty in physical parameters (e.g. soil porosity), whereas this one addresses uncertainty in the mathematical simulator itself (arising from model residuals). Compared to existing approaches (in which only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering a confidence level of optimal remediation strategies to system designers, and reducing computational cost in optimization processes. 2009 Elsevier B.V. All rights reserved.

  2. Characterization of cDNA for human tripeptidyl peptidase II: The N-terminal part of the enzyme is similar to subtilisin

    International Nuclear Information System (INIS)

    Tomkinson, B.; Jonsson, A-K

    1991-01-01

    Tripeptidyl peptidase II is a high molecular weight serine exopeptidase, which has been purified from rat liver and human erythrocytes. Four clones, representing 4453 bp, or 90% of the mRNA of the human enzyme, have been isolated from two different cDNA libraries. One clone, designated A2, was obtained after screening a human B-lymphocyte cDNA library with a degenerate oligonucleotide mixture. The second cDNA library, obtained from human fibroblasts, was rescreened with a 147 bp fragment from the 5' part of the A2 clone, whereby three different overlapping cDNA clones could be isolated. The deduced amino acid sequence, 1196 amino acid residues, corresponding to the longest open reading frame of the assembled nucleotide sequence, was compared to sequences in current databases. This revealed a 56% similarity between the bacterial enzyme subtilisin and the N-terminal part of tripeptidyl peptidase II. The enzyme was found to be represented by two different mRNAs of 4.2 and 5.0 kilobases, respectively, which probably result from the utilization of two different polyadenylation sites. Furthermore, cDNA corresponding to both the N-terminal and C-terminal parts of tripeptidyl peptidase II hybridized with genomic DNA from mouse, horse, calf, and hen, even under fairly high stringency conditions, indicating that tripeptidyl peptidase II is highly conserved

  3. Benchmarking NLDAS-2 Soil Moisture and Evapotranspiration to Separate Uncertainty Contributions

    Science.gov (United States)

    Nearing, Grey S.; Mocko, David M.; Peters-Lidard, Christa D.; Kumar, Sujay V.; Xia, Youlong

    2016-01-01

    Model benchmarking allows us to separate uncertainty in model predictions caused by model inputs from uncertainty due to model structural error. We extend this method with a large-sample approach (using data from multiple field sites) to measure prediction uncertainty caused by errors in (i) forcing data, (ii) model parameters, and (iii) model structure, and use it to compare the efficiency of soil moisture state and evapotranspiration flux predictions made by the four land surface models in the North American Land Data Assimilation System Phase 2 (NLDAS-2). Parameters dominated uncertainty in soil moisture estimates and forcing data dominated uncertainty in evapotranspiration estimates; however, the models themselves used only a fraction of the information available to them. This means that there is significant potential to improve all three components of the NLDAS-2 system. In particular, continued work toward refining the parameter maps and look-up tables, the forcing data measurement and processing, and also the land surface models themselves, has potential to result in improved estimates of surface mass and energy balances.

  4. Strain gauge measurement uncertainties on hydraulic turbine runner blade

    International Nuclear Information System (INIS)

    Arpin-Pont, J; Gagnon, M; Tahan, S A; Coutu, A; Thibault, D

    2012-01-01

    Strains experimentally measured with strain gauges can differ from those evaluated using the Finite Element (FE) method. This difference is due mainly to the assumptions and uncertainties inherent to each method. To circumvent this difficulty, we developed a numerical method based on Monte Carlo simulations to evaluate measurement uncertainties produced by the behaviour of a unidirectional welded gauge, its position uncertainty and its integration effect. This numerical method uses the displacement fields of the studied part evaluated by an FE analysis. The paper presents a study case using in situ data measured on a hydraulic turbine runner. The FE analysis of the turbine runner blade was computed, and our numerical method used to evaluate uncertainties on strains measured at five locations with welded strain gauges. Then, measured strains and their uncertainty ranges are compared to the estimated strains. The uncertainty ranges obtained extended from 74 με to 165 με. Furthermore, the biases observed between the median of the uncertainty ranges and the FE strains varied from −36 to 36 με. Note that strain gauge measurement uncertainties depend mainly on displacement fields and gauge geometry.
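A Monte Carlo treatment of gauge position uncertainty and the integration (averaging) effect, in the spirit of the method described above, can be sketched as follows. This is an illustrative Python sketch only: the strain field, gauge length, and position spread are invented numbers, not the paper's FE displacement fields or in situ data.

```python
import math
import random
import statistics

def strain_field(x):
    # Hypothetical strain distribution along the blade surface (in microstrain),
    # standing in for strains derived from an FE displacement field.
    return 500 + 300 * math.sin(x)

def measured_strain(x_center, gauge_len, n=50):
    """A welded gauge does not read a point value: it averages the strain
    over its grid length (the integration effect)."""
    xs = [x_center - gauge_len / 2 + gauge_len * i / (n - 1) for i in range(n)]
    return statistics.fmean(strain_field(x) for x in xs)

def mc_uncertainty(x_nominal=1.0, pos_sd=0.05, gauge_len=0.1,
                   n_trials=2000, seed=7):
    """Propagate the gauge position uncertainty by Monte Carlo:
    perturb the gauge centre, re-evaluate the averaged strain,
    and report the mean and spread of the simulated measurements."""
    rng = random.Random(seed)
    samples = [measured_strain(rng.gauss(x_nominal, pos_sd), gauge_len)
               for _ in range(n_trials)]
    return statistics.fmean(samples), statistics.stdev(samples)

mean_strain, u_strain = mc_uncertainty()
```

The paper's method additionally models the unidirectional gauge's angular misalignment; that would enter here as a second perturbed parameter inside the Monte Carlo loop.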

  5. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    Science.gov (United States)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on 2 small catchments in the Swiss Plateau with a lumped conceptual rainfall runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty but in hydrology we used formal Bayesian uncertainty assessment method with 2 different likelihood functions. One was a time-series error model that was able to deal with the complicated statistical properties of hydrological model residuals. The second was a likelihood function for the flow quantiles directly. Due to the better data coverage and smaller hydrological complexity in one of our test catchments we had better performance from the hydrological model and thus could observe that the relative importance of different uncertainty sources varied between sites, boundary conditions and flow indicators. The uncertainty of future climate was important, but not dominant. The deficiencies of the hydrological model were on the same scale, especially for the sites and flow components where model performance for the past observations was further from optimal (Nash-Sutcliffe index = 0.5 - 0.7). The overall uncertainty of predictions was well beyond the expected change signal even for the best performing site and flow indicator.

  6. Rise, fall and resurrection of chromosome territories: a historical perspective Part II. Fall and resurrection of chromosome territories during the 1950s to 1980s. Part III. Chromosome territories and the functional nuclear architecture: experiments and m

    OpenAIRE

    T Cremer; C Cremer

    2009-01-01

    Part II of this historical review on the progress of nuclear architecture studies points out why the original hypothesis of chromosome territories from Carl Rabl and Theodor Boveri (described in part I) was abandoned during the 1950s and finally proven by compelling evidence from laser-UV-microbeam studies and in situ hybridization experiments. Part II also includes a section on the development of advanced light microscopic techniques breaking the classical Abbe limit written for reade...

  7. EBR-II Reactor Physics Benchmark Evaluation Report

    Energy Technology Data Exchange (ETDEWEB)

    Pope, Chad L. [Idaho State Univ., Pocatello, ID (United States); Lum, Edward S [Idaho State Univ., Pocatello, ID (United States); Stewart, Ryan [Idaho State Univ., Pocatello, ID (United States); Byambadorj, Bilguun [Idaho State Univ., Pocatello, ID (United States); Beaulieu, Quinton [Idaho State Univ., Pocatello, ID (United States)

    2017-12-28

    This report provides a reactor physics benchmark evaluation with associated uncertainty quantification for the critical configuration of the April 1986 Experimental Breeder Reactor II Run 138B core configuration.

  8. Part I: quantum fluctuations in chains of Josephson junctions. Part II: directed aggregation on the Bethe lattice

    International Nuclear Information System (INIS)

    Bradley, R.M.

    1985-01-01

    Part I studies the effect of quantum fluctuations of the phase on the low temperature behavior of two models of Josephson junction chains with Coulomb interactions taken into account. The first model, which represents a chain of junctions close to a ground plane, is the Hamiltonian version of the two-dimensional XY model in one space and one time dimension. In the second model, the charging energy for a single junction in the chain is just the parallel-plate capacitor energy. It is shown that quantum fluctuations produce exponential decay of the order parameter correlation function for any finite value of the junction capacitance. Part II deals with two types of directed aggregation on the Bethe lattice - directed diffusion-limited aggregation (DDLA) and ballistic aggregation (BA). In the DDLA problem on finite lattices, an exact nonlinear recursion relation is constructed for the probability distribution of the density. The mean density tends to zero as the lattice size is taken to infinity. Using a mapping between the model with perfect adhesion on contact and another model with a particular value of the adhesion probability, it is shown that the adhesion probability is irrelevant over an interval of values

  9. Design of site specific radiopharmaceuticals for tumor imaging. (Parts I and II)

    International Nuclear Information System (INIS)

    Van Dort, M.E.

    1983-01-01

    Part I. Synthetic methods were developed for the preparation of several iodinated benzoic acid hydrazides as labeling moieties for indirect tagging of carbonyl-containing bio-molecules and potential tumor-imaging agents. Biodistribution studies conducted in mice on the derivatives having the I-125 label ortho to a phenolic OH demonstrated a rapid in vivo deiodination. Part II. The reported high melanin binding affinity of quinoline and other heterocyclic antimalarial drugs led to the development of many analogues of such molecules as potential melanoma-imaging agents. One such analogue, iodochloroquine, does exhibit high melanin binding, but has found limited clinical use due to appreciable accumulation in non-target tissues such as the adrenal cortex and inner ear. This project developed a new series of candidate melanoma imaging agents which would be easier to radio-label, could yield higher specific activity product, and which might demonstrate more favorable pharmacokinetic and dosimetric characteristics compared to iodochloroquine

  10. Developing guidelines for economic evaluation of environmental impacts in EIAs. Part II: Case studies and dose-response literature

    International Nuclear Information System (INIS)

    2005-01-01

    This Part II of the report contains full versions of the case studies for air, water and land (Chapters 2-4), which were only summarised in Part I. In addition, during the work the research team has collected a large amount of literature and information on dose response relationships for air and water pollution relevant to China. This information is included as Chapters 5 and 6

  11. Developing guidelines for economic evaluation of environmental impacts in EIAs. Part II: Case studies and dose-response literature

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    This Part II of the report contains full versions of the case studies for air, water and land (Chapters 2-4), which were only summarised in Part I. In addition, during the work the research team has collected a large amount of literature and information on dose response relationships for air and water pollution relevant to China. This information is included as Chapters 5 and 6.

  12. Repository Planning, Design, and Engineering: Part II-Equipment and Costing.

    Science.gov (United States)

    Baird, Phillip M; Gunter, Elaine W

    2016-08-01

    Part II of this article discusses and provides guidance on the equipment and systems necessary to operate a repository. The various types of storage equipment and monitoring and support systems are presented in detail. While the material focuses on the large repository, the requirements for a small-scale startup are also presented. Cost estimates and a cost model for establishing a repository are presented. The cost model presents an expected range of acquisition costs for the large capital items in developing a repository. A range of 5,000-7,000 ft² constructed has been assumed, with 50 frozen storage units, to reflect a successful operation with growth potential. No design or engineering costs, permit or regulatory costs, or smaller items such as the computers, software, furniture, phones, and barcode readers required for operations have been included.

  13. Plurality of Type A evaluations of uncertainty

    Science.gov (United States)

    Possolo, Antonio; Pintar, Adam L.

    2017-10-01

    The evaluations of measurement uncertainty involving the application of statistical methods to measurement data (Type A evaluations as specified in the Guide to the Expression of Uncertainty in Measurement, GUM) comprise the following three main steps: (i) developing a statistical model that captures the pattern of dispersion or variability in the experimental data, and that relates the data either to the measurand directly or to some intermediate quantity (input quantity) that the measurand depends on; (ii) selecting a procedure for data reduction that is consistent with this model and that is fit for the purpose that the results are intended to serve; (iii) producing estimates of the model parameters, or predictions based on the fitted model, and evaluations of uncertainty that qualify either those estimates or these predictions, and that are suitable for use in subsequent uncertainty propagation exercises. We illustrate these steps in uncertainty evaluations related to the measurement of the mass fraction of vanadium in a bituminous coal reference material, including the assessment of the homogeneity of the material, and to the calibration and measurement of the amount-of-substance fraction of a hydrochlorofluorocarbon in air, and of the age of a meteorite. Our goal is to expose the plurality of choices that can reasonably be made when taking each of the three steps outlined above, and to show that different choices typically lead to different estimates of the quantities of interest, and to different evaluations of the associated uncertainty. In all the examples, the several alternatives considered represent choices that comparably competent statisticians might make, but who differ in the assumptions that they are prepared to rely on, and in their selection of approach to statistical inference. They represent also alternative treatments that the same statistician might give to the same data when the results are intended for different purposes.
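Step (iii) in its most common form — an i.i.d. Gaussian model for the replicate readings, the sample mean as the estimate, and the standard deviation of the mean as the Type A standard uncertainty — can be written in a few lines. A minimal Python sketch with invented replicate readings, not data from the examples cited in the record:

```python
import math
import statistics

def type_a_evaluation(observations):
    """Type A evaluation under an i.i.d. Gaussian model:
    estimate = sample mean, standard uncertainty u = s / sqrt(n)."""
    n = len(observations)
    estimate = statistics.fmean(observations)
    s = statistics.stdev(observations)  # sample standard deviation (n - 1)
    u = s / math.sqrt(n)                # standard uncertainty of the mean
    return estimate, u

# Hypothetical replicate readings (arbitrary units)
obs = [10.2, 10.4, 9.9, 10.1, 10.3]
estimate, u = type_a_evaluation(obs)
```

The point of the record above is precisely that this is only one of several defensible choices: a different model in step (i) (e.g. a random-effects model for between-bottle homogeneity) or a different inference approach in step (ii) leads to a different `u` from the same data.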

  14. Problems due to icing of overhead lines - Part II

    International Nuclear Information System (INIS)

    Havard, D.G.; Pon, C.J.; Krishnasamy, S.G.

    1985-01-01

    A companion paper describes uncertainties in overhead line design due to the variability of ice and wind loads. This paper reviews two other effects due to icing; conductor galloping and torsional instability, which require further study. (author)

  15. Financial development, uncertainty and economic growth

    NARCIS (Netherlands)

    Lensink, B.W.

    By performing a cross-country growth regression for the 1970-1998 period this paper finds evidence for the fact that the impact of policy uncertainty on economic growth depends on the development of the financial sector. It appears that a higher level of financial development partly mitigates the

  16. Three Mile Island: a report to the commissioners and to the public. Volume II, Part 1

    International Nuclear Information System (INIS)

    1979-01-01

    This is part one of three parts of the second volume of the Special Inquiry Group's report to the Nuclear Regulatory Commission on the accident at Three Mile Island. The first volume contained a narrative description of the accident and a discussion of the major conclusions and recommendations. This second volume is divided into three parts. Part 1 of Volume II focuses on the pre-accident licensing and regulatory background. This part includes an examination of the overall licensing and regulatory system for nuclear powerplants viewed from different perspectives: the system as it is set forth in statutes and regulations, as described in Congressional testimony, and an overview of the system as it really works. In addition, Part 1 includes the licensing, operating, and inspection history of Three Mile Island Unit 2, discussions of relevant regulatory matters, a discussion of specific precursor events related to the accident, a case study of the pressurizer design issue, and an analysis of incentives to declare commercial operation

  17. OpenTURNS, an open source uncertainty engineering software

    International Nuclear Information System (INIS)

    Popelin, A.L.; Dufoy, A.

    2013-01-01

    The need to assess robust performance of complex systems has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. EDF has taken part in the development of an Open Source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS for Open source Treatment of Uncertainty, Risk and Statistics. OpenTURNS includes a large variety of qualified algorithms for managing uncertainties in industrial studies, from the uncertainty quantification step (with the possibility of modelling stochastic dependence through copula theory and stochastic processes), to the uncertainty propagation step (with innovative simulation algorithms such as the ziggurat method for normal variables) and the sensitivity analysis step (with sensitivity indices based on the evaluation of means conditioned on the realization of a particular event). It also enables the construction of response surfaces that can include stochastic modelling (with the polynomial chaos method, for example). Generic wrappers to link OpenTURNS to modelling software are provided. Finally, OpenTURNS is extensively documented to support both use and contribution

  18. [Dealing with diagnostic uncertainty in general practice].

    Science.gov (United States)

    Wübken, Magdalena; Oswald, Jana; Schneider, Antonius

    2013-01-01

    In general, the prevalence of diseases is low in primary care. Therefore, the positive predictive value of diagnostic tests is lower than in hospitals where patients are highly selected. In addition, the patients present with milder forms of disease; and many diseases might hide behind the initial symptom(s). These facts lead to diagnostic uncertainty which is somewhat inherent to general practice. This narrative review discusses different sources of and reasons for uncertainty and strategies to deal with it in the context of the current literature. Fear of uncertainty correlates with higher diagnostic activity. The attitude towards uncertainty correlates with the choice of medical speciality by vocational trainees or medical students. An intolerance of uncertainty, which increases even as medicine makes steady progress, might partly explain the growing shortage of general practitioners. The bio-psycho-social context appears to be important to diagnostic decision-making. The effects of intuition and heuristics are investigated by cognitive psychologists. It is still unclear whether these aspects are prone to bias or useful, which might depend on the context of medical decisions. Good communication is of great importance to share uncertainty with the patients in a transparent way and to facilitate shared decision-making. Dealing with uncertainty should be seen as an important core component of general practice and needs to be investigated in more detail to improve the respective medical decisions. Copyright © 2013. Published by Elsevier GmbH.

  19. Market Analysis and Consumer Impacts Source Document. Part II. Review of Motor Vehicle Market and Consumer Expenditures on Motor Vehicle Transportation

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part II consists of studies and review on: motor vehicle sales trends; motor vehicle fleet life and fleet composition; car buying patterns of the busi...

  20. Calibration of tri-axial MEMS accelerometers in the low-frequency range – Part 2: Uncertainty assessment

    Directory of Open Access Journals (Sweden)

    G. D'Emilia

    2018-05-01

    Full Text Available A comparison among three methods for the calibration of tri-axial accelerometers, in particular MEMS, is presented in this paper, paying attention to the uncertainty assessment of each method. The first method is performed according to the ISO 16063 standards. Two innovative methods are analysed, both suitable for in-field application. The effects on the whole uncertainty of the following aspects have been evaluated: the test bench performances in realizing the reference motion, the vibration reference sensor, the geometrical parameters and the data processing techniques. The uncertainty contributions due to the offset and the transverse sensitivity are also studied, by calibrating two different types of accelerometers, a piezoelectric one and a capacitive one, to check their effect on the accuracy of the methods under comparison. The reproducibility of methods is demonstrated. Relative uncertainty of methods ranges from 3 to 5 %, depending on the complexity of the model and of the requested operations. The results appear promising for low-cost calibration of new tri-axial accelerometers of MEMS type.

  1. Uncertainty modelling and analysis of environmental systems: a river sediment yield example

    NARCIS (Netherlands)

    Keesman, K.J.; Koskela, J.; Guillaume, J.H.; Norton, J.P.; Croke, B.; Jakeman, A.

    2011-01-01

    Abstract: Throughout the last decades uncertainty analysis has become an essential part of environmental model building (e.g. Beck 1987; Refsgaard et al., 2007). The objective of the paper is to introduce stochastic and set-membership uncertainty modelling concepts, which basically differ in the

  2. Accounting for Model Uncertainties Using Reliability Methods - Application to Carbon Dioxide Geologic Sequestration System. Final Report

    International Nuclear Information System (INIS)

    Mok, Chin Man; Doughty, Christine; Zhang, Keni; Pruess, Karsten; Kiureghian, Armen; Zhang, Miao; Kaback, Dawn

    2010-01-01

    A new computer code, CALRELTOUGH, which uses reliability methods to incorporate parameter sensitivity and uncertainty analysis into subsurface flow and transport models, was developed by Geomatrix Consultants, Inc. in collaboration with Lawrence Berkeley National Laboratory and University of California at Berkeley. The CALREL reliability code was developed at the University of California at Berkeley for geotechnical applications and the TOUGH family of codes was developed at Lawrence Berkeley National Laboratory for subsurface flow and transport applications. The integration of the two codes provides a new approach to dealing with uncertainties in flow and transport modeling of the subsurface, such as those associated with hydrogeology parameters, boundary conditions, and initial conditions of subsurface flow and transport, using data from site characterization and monitoring for conditioning. The new code enables computation of the reliability of a system and the components that make up the system, instead of calculating the complete probability distributions of model predictions at all locations at all times. The new CALRELTOUGH code has tremendous potential to advance subsurface understanding for a variety of applications including subsurface energy storage, nuclear waste disposal, carbon sequestration, extraction of natural resources, and environmental remediation. The new code was tested on a carbon sequestration problem as part of the Phase I project. Phase II was not awarded.

  3. Quantitative impact of aerosols on numerical weather prediction. Part II: Impacts to IR radiance assimilation

    Science.gov (United States)

    Marquis, J. W.; Campbell, J. R.; Oyola, M. I.; Ruston, B. C.; Zhang, J.

    2017-12-01

    This is part II of a two-part series examining the impacts of aerosol particles on weather forecasts. In this study, the aerosol indirect effects on weather forecasts are explored by examining the temperature and moisture analysis associated with assimilating dust contaminated hyperspectral infrared radiances. The dust induced temperature and moisture biases are quantified for different aerosol vertical distribution and loading scenarios. The overall impacts of dust contamination on temperature and moisture forecasts are quantified over the west coast of Africa, with the assistance of aerosol retrievals from AERONET, MPL, and CALIOP. Finally, methods for improving hyperspectral infrared data assimilation in dust contaminated regions are proposed.

  4. International Working Group on Fast Reactors Eight Annual Meeting, Vienna, Austria, 15-18 April 1975. Summary Report. Part II

    International Nuclear Information System (INIS)

    1975-07-01

    The Eighth Annual Meeting of the IAEA International Working Group on Fast Reactors was held at the IAEA Headquarters in Vienna, Austria, from 15 to 18 April 1975. The Summary Report (Part I) contains the Minutes of the Meeting. The Summary Report (Part II) contains the papers which review the national programmes in the field of LMFBRs and other presentations at the Meeting. The Summary Report (Part III) contains the discussions on the review of the national programmes

  5. GUIDANCE2: accurate detection of unreliable alignment regions accounting for the uncertainty of multiple parameters.

    Science.gov (United States)

    Sela, Itamar; Ashkenazy, Haim; Katoh, Kazutaka; Pupko, Tal

    2015-07-01

    Inference of multiple sequence alignments (MSAs) is a critical part of phylogenetic and comparative genomics studies. However, from the same set of sequences different MSAs are often inferred, depending on the methodologies used and the assumed parameters. Much effort has recently been devoted to improving the ability to identify unreliable alignment regions. Detecting such unreliable regions was previously shown to be important for downstream analyses relying on MSAs, such as the detection of positive selection. Here we developed GUIDANCE2, a new integrative methodology that accounts for: (i) uncertainty in the process of indel formation, (ii) uncertainty in the assumed guide tree and (iii) co-optimal solutions in the pairwise alignments, used as building blocks in progressive alignment algorithms. We compared GUIDANCE2 with seven methodologies to detect unreliable MSA regions using extensive simulations and empirical benchmarks. We show that GUIDANCE2 outperforms all previously developed methodologies. Furthermore, GUIDANCE2 also provides a set of alternative MSAs which can be useful for downstream analyses. The novel algorithm is implemented as a web-server, available at: http://guidance.tau.ac.il. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  6. Social class, political power, and the state: their implications in medicine--parts I and II.

    Science.gov (United States)

    Navarro, V

    1976-01-01

    This three-part article presents an analysis of the distribution of power and of the nature of the state in Western industrialized societies and details their implications in medicine. Part I presents a critique of contemporary theories of the Western system of power; discusses the countervailing pluralist and power elite theories, as well as those of bureaucratic and professional control; and concludes with an examination of the Marxist theories of economic determinism, structural determinism, and corporate statism. Part II presents a Marxist theory of the role, nature, and characteristics of state intervention. Part III (which will appear in the next issue of this journal) focuses on the mode of that intervention and the reasons for its growth, with an added analysis of the attributes of state intervention in the health sector, and of the dialectical relationship between its growth and the current fiscal crisis of the state. In all three parts, the focus is on Western European countries and on North America, with many examples and categories from the area of medicine.

  7. Uncertainty and equipoise: at interplay between epistemology, decision making and ethics.

    Science.gov (United States)

    Djulbegovic, Benjamin

    2011-10-01

    In recent years, various authors have proposed that the concept of equipoise be abandoned because it conflates the practice of clinical care with clinical research. At the same time, the equipoise opponents acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. As equipoise represents just one measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this article, I show how uncertainty (equipoise) lies at the intersection of epistemology, decision making and the ethics of clinical research. In particular, I show how our formulation of responses to uncertainties about hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to dual-processing theory, which postulates that a rational approach to (clinical research) decision making depends on both the analytical, deliberative processes embodied in the scientific method (system II) and good human intuition (system I). Ultimately, our choices can only become wiser if we understand the close and intertwined relationship between irreducible uncertainty, inevitable errors and unavoidable injustice.

  8. Reforming Science Education: Part II. Utilizing Kieran Egan's Educational Metatheory

    Science.gov (United States)

    Schulz, Roland M.

    2009-04-01

    This paper is the second of two parts and continues the conversation which had called for a shift in the conceptual focus of science education towards philosophy of education, with the requirement to develop a discipline-specific “philosophy” of science education. In Part I, conflicting conceptions of science literacy were identified with disparate “visions” tied to competing research programs as well as school-based curricular paradigms. The impasse in the goals of science education and thereto, the contending views of science literacy, were themselves associated with three underlying fundamental aims of education (knowledge-itself; personal development; socialization) which, it was argued, usually undercut the potential of each other. During periods of “crisis-talk” and throughout science educational history these three aims have repeatedly attempted to assert themselves. The inability of science education research to affect long-term change in classrooms was correlated not only to the failure to reach a consensus on the aims (due to competing programs and to the educational ideologies of their social groups), but especially to the failure of developing true educational theories (largely neglected since Hirst). Such theories, especially metatheories, could serve to reinforce science education’s growing sense of academic autonomy and independence from socio-economic demands. In Part II, I offer as a suggestion Egan’s cultural-linguistic theory as a metatheory to help resolve the impasse. I hope to make reformers familiar with his important ideas in general, and more specifically, to show how they can complement HPS rationales and reinforce the work of those researchers who have emphasized the value of narrative in learning science.

  9. Modified Phenomena Identification and Ranking Table (PIRT) for Uncertainty Analysis

    International Nuclear Information System (INIS)

    Gol-Mohamad, Mohammad P.; Modarres, Mohammad; Mosleh, Ali

    2006-01-01

    This paper describes a methodology for characterizing important phenomena, which is also part of a broader research effort by the authors called 'Modified PIRT'. The methodology provides a robust process of phenomena identification and ranking for more precise quantification of uncertainty. It is a two-step identification and ranking methodology based on thermal-hydraulic (TH) importance as well as uncertainty importance. The Analytical Hierarchical Process (AHP) has been used as a formal approach for TH identification and ranking. A formal uncertainty importance technique is used to estimate the degree of credibility of the TH model(s) used to represent the important phenomena. This part uses subjective justification, evaluating available information and data from experiments and code predictions. The proposed methodology was demonstrated by developing a PIRT for a large break loss of coolant accident (LBLOCA) for the LOFT integral facility with highest core power (test LB-1). (authors)
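
The AHP ranking step can be sketched as extracting the principal eigenvector of a pairwise comparison matrix whose entries encode expert judgements of relative thermal-hydraulic importance. The matrix below is hypothetical, and this is a generic power-iteration sketch rather than the authors' implementation:

```python
def ahp_priorities(matrix, iters=100):
    """Priority weights of an AHP pairwise comparison matrix via
    power iteration toward its principal eigenvector, normalized
    so the weights sum to 1."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# hypothetical judgements for three phenomena (Saaty's 1-9 scale):
# entry [i][j] says how much more important phenomenon i is than j
weights = ahp_priorities([[1,   3,   5],
                          [1/3, 1,   3],
                          [1/5, 1/3, 1]])
```

For this matrix the weights converge to roughly 0.64, 0.26 and 0.10, ranking the first phenomenon highest.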

  10. Advances in Knowledge Discovery and Data Mining 21st Pacific Asia Conference, PAKDD 2017 Held in Jeju, South Korea, May 23 26, 2017. Proceedings Part I, Part II.

    Science.gov (United States)

    2017-06-27

    21st Pacific-Asia Conference, PAKDD 2017, Jeju, South Korea, May 23-26, 2017. Proceedings, Part I, Part II. Springer, Switzerland. The Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD) is a leading international conference in the areas of knowledge discovery and data mining (KDD). We had three keynote speeches, delivered by Sang Cha from Seoul National University

  11. (II) complexes

    African Journals Online (AJOL)

    activities of Schiff base tin (II) complexes. Neelofar1 ... Conclusion: All synthesized Schiff bases and their Tin (II) complexes showed high antimicrobial and ...... Singh HL. Synthesis and characterization of tin (II) complexes of fluorinated Schiff bases derived from amino acids. Spectrochim Acta Part A: Molec Biomolec.

  12. Economic uncertainty and its impact on the Croatian economy

    Directory of Open Access Journals (Sweden)

    Petar Soric

    2017-12-01

    The aim of this paper is to quantify institutional (political and fiscal) and non-institutional uncertainty (economic policy uncertainty, the Economists' recession index, natural-disaster-related uncertainty, and several disagreement measures). The stated indicators are based on articles from highly popular Croatian news portals, the repository of law amendments (Narodne novine), and Business and Consumer Surveys. We also introduce a composite uncertainty indicator, obtained by the principal components method. The analysis of a structural VAR model of the Croatian economy (both with fixed and time-varying parameters) has shown that a large part of the analysed indicators are significant predictors of economic activity. It is demonstrated that their impact on industrial production is strongest at the onset of a crisis. On the other hand, the influence of fiscal uncertainty exhibits just the opposite tendency: it strengthens with the intensification of economic activity, which partially exculpates the possible utilization of fiscal expansion as a counter-crisis tool.
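
A composite indicator obtained by the principal components method can be sketched, under the usual convention, as the projection of the standardized indicator series onto the leading eigenvector of their correlation matrix. This stdlib-only illustration is not the authors' code, and the input series are hypothetical:

```python
import math

def standardize(series):
    """Zero-mean, unit-variance version of a series (population sd)."""
    n = len(series)
    mean = sum(series) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return [(x - mean) / sd for x in series]

def composite_indicator(indicators, iters=200):
    """First principal component of several standardized uncertainty
    indicators, used as a single composite uncertainty index."""
    z = [standardize(s) for s in indicators]       # k series of length n
    k, n = len(z), len(z[0])
    # correlation matrix of the standardized series
    corr = [[sum(z[a][t] * z[b][t] for t in range(n)) / n
             for b in range(k)] for a in range(k)]
    # leading eigenvector by power iteration
    w = [1.0] * k
    for _ in range(iters):
        v = [sum(corr[i][j] * w[j] for j in range(k)) for i in range(k)]
        norm = math.sqrt(sum(x * x for x in v))
        w = [x / norm for x in v]
    # project each time point onto the leading eigenvector
    return [sum(w[i] * z[i][t] for i in range(k)) for t in range(n)]
```

With two perfectly correlated toy series the composite simply rescales their common trend; with real indicators it weights each by its loading on the shared uncertainty factor.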

  13. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition ""...a reference for everyone who is interested in knowing and handling uncertainty.""-Journal of Applied Statistics The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  14. HERBICIDAS INIBIDORES DO FOTOSSISTEMA II – PARTE I / PHOTOSYSTEM II INHIBITOR HERBICIDES - PART I

    Directory of Open Access Journals (Sweden)

    ILCA P. DE F. E SILVA

    2013-11-01

    Chemical control has been the method most widely used in large cropped areas, mainly because it is fast and efficient. Photosystem II (PSII) inhibitor herbicides are fundamental to integrated weed management and soil conservation practices. Application is carried out in pre-emergence or early post-emergence of the weeds. Absorption occurs through the roots, with the Casparian strips acting as a barrier, and translocation takes place through the xylem. Absorption and translocation also depend on the characteristics of the product itself, such as its lipophilic and hydrophilic properties, which can be measured by the octanol-water partition coefficient (Kow). Inhibition of photosynthesis occurs through the binding of herbicides of this group to the QB-binding site on the D1 protein of photosystem II, located in the thylakoid membranes of the chloroplasts, blocking electron transport from QA to QB and thereby interrupting CO2 fixation and the production of ATP and NADPH2.
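
The octanol-water partition coefficient mentioned above is the equilibrium ratio of a compound's concentration in octanol to that in water, usually reported as log Kow; positive values indicate a lipophilic product. A minimal helper, with illustrative concentrations:

```python
import math

def log_kow(c_octanol, c_water):
    """log10 of the octanol-water partition coefficient Kow,
    given equilibrium concentrations in each phase (same units)."""
    return math.log10(c_octanol / c_water)

# a compound 100x more concentrated in the octanol phase
lipophilicity = log_kow(100.0, 1.0)   # -> 2.0
```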

  15. The Intolerance of Uncertainty Index: Replication and Extension with an English Sample

    Science.gov (United States)

    Carleton, R. Nicholas; Gosselin, Patrick; Asmundson, Gordon J. G.

    2010-01-01

    Intolerance of uncertainty (IU) is related to anxiety, depression, worry, and anxiety sensitivity. Precedent IU measures were criticized for psychometric instability and redundancy; alternative measures include the novel 45-item measure (Intolerance of Uncertainty Index; IUI). The IUI was developed in French with 2 parts, assessing general…

  16. Societal Planning: Identifying a New Role for the Transport Planner-Part II: Planning Guidelines

    DEFF Research Database (Denmark)

    Khisty, C. Jotin; Leleur, Steen

    1997-01-01

    The paper seeks to formulate planning guidelines based on Habermas's theory of communicative action. Specifically, this has led to the formulation of a set of four planning validity claims connected to four types of planning guidelines concerning adequacy, dependency, suitability and adaptability… vis-à-vis the planning validity claims. Among other things, the contingency of this process is outlined. It is concluded (parts I & II) that transport planners can conveniently utilize the guidelines in their professional practice, tailored to their particular settings.

  17. Comments on Uncertainty in Groundwater Governance in the Volcanic Canary Islands, Spain

    OpenAIRE

    Custodio, Emilio; Cabrera, María; Poncela, Roberto; Cruz-Fuentes, Tatiana; Naranjo, Gema; Miguel, Luis de

    2015-01-01

    The uncertainty associated with natural magnitudes and processes is conspicuous in water resources and groundwater evaluation. This uncertainty has an essential component and a part that can be reduced to some extent by increasing knowledge, improving monitoring coverage, continuous elaboration of data and accuracy and addressing the related economic and social aspects involved. Reducing uncertainty has a cost that may not be justified by the improvement that is obtainable, but that has to be...

  18. Toward a definition of intolerance of uncertainty: a review of factor analytical studies of the Intolerance of Uncertainty Scale.

    Science.gov (United States)

    Birrell, Jane; Meares, Kevin; Wilkinson, Andrew; Freeston, Mark

    2011-11-01

    Since its emergence in the early 1990s, a narrow but concentrated body of research has developed examining the role of intolerance of uncertainty (IU) in worry, and yet we still know little about its phenomenology. In an attempt to clarify our understanding of this construct, this paper traces the way in which our understanding and definition of IU have evolved throughout the literature. This paper also aims to further our understanding of IU by exploring the latent variables measured by the Intolerance of Uncertainty Scale (IUS; Freeston, Rheaume, Letarte, Dugas & Ladouceur, 1994). A review of the literature surrounding IU confirmed that the current definitions are categorical and lack specificity. A critical review of existing factor analytic studies was carried out in order to determine the underlying factors measured by the IUS. Systematic searches yielded 9 papers for review. Two factors with 12 consistent items emerged throughout the exploratory studies, and the stability of models containing these two factors was demonstrated in subsequent confirmatory studies. It is proposed that these factors represent (i) desire for predictability and an active engagement in seeking certainty, and (ii) paralysis of cognition and action in the face of uncertainty. It is suggested that these factors may represent approach and avoidance responses to uncertainty. Further research is required to confirm the construct validity of these factors and to determine the stability of this structure within clinical samples. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called ''conservative'' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the ''reasonable assurance'' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  20. Uncertainty of climate change impacts and consequences on the prediction of future hydrological trends

    International Nuclear Information System (INIS)

    Minville, M.; Brissette, F.; Leconte, R.

    2008-01-01

    In the future, water is very likely to be the resource that will be most severely affected by climate change. It has been shown that small perturbations in precipitation frequency and/or quantity can result in significant impacts on the mean annual discharge. Moreover, modest changes in natural inflows result in larger changes in reservoir storage. There is however great uncertainty linked to changes in both the magnitude and direction of future hydrological trends. This presentation discusses the various sources of this uncertainty and their potential impact on the prediction of future hydrological trends. A companion paper will look at adaptation potential, taking into account some of the sources of uncertainty discussed in this presentation. Uncertainty is separated into two main components: climatic uncertainty and 'model and methods' uncertainty. Climatic uncertainty is linked to uncertainty in future greenhouse gas emission scenarios (GHGES) and to general circulation models (GCMs), whose representation of topography and climate processes is imperfect, in large part due to computational limitations. The uncertainty linked to natural variability (which may or may not increase) is also part of the climatic uncertainty. 'Model and methods' uncertainty regroups the uncertainty linked to the different approaches and models needed to transform climate data so that they can be used by hydrological models (such as downscaling methods) and the uncertainty of the models themselves and of their use in a changed climate. The impacts of the various sources of uncertainty on the hydrology of a watershed are demonstrated on the Peribonka River basin (Quebec, Canada). The results indicate that all sources of uncertainty can be important and outline the importance of taking these sources into account for any impact and adaptation studies. Recommendations are outlined for such studies. (author)

  1. Measurement uncertainties for vacuum standards at Korea Research Institute of Standards and Science

    International Nuclear Information System (INIS)

    Hong, S. S.; Shin, Y. H.; Chung, K. H.

    2006-01-01

    The Korea Research Institute of Standards and Science has three major vacuum systems: an ultrasonic interferometer manometer (UIM) (Sec. II, Figs. 1 and 2) for low vacuum, a static expansion system (SES) (Sec. III, Figs. 3 and 4) for medium vacuum, and an orifice-type dynamic expansion system (DES) (Sec. IV, Figs. 5 and 6) for high and ultrahigh vacuum. For each system, explicit measurement model equations with multiple variables are given. According to ISO standards, all of these system variable errors were used to calculate the expanded uncertainty (U). For each system the expanded uncertainties (k=1, confidence level=95%) and relative expanded uncertainties (expanded uncertainty/generated pressure) are summarized in Table IV and are estimated to be as follows. For the UIM, at 2.5-300 Pa generated pressure, the expanded uncertainty is -2 Pa and the relative expanded uncertainty is -2 ; at 1-100 kPa generated pressure, the expanded uncertainty is -5 . For the SES, at 3-100 Pa generated pressure, the expanded uncertainty is -1 Pa and the relative expanded uncertainty is -3 . For the DES, at 4.6×10⁻³-1.3×10⁻² Pa generated pressure, the expanded uncertainty is -4 Pa and the relative expanded uncertainty is -3 ; at 3.0×10⁻⁶-9.0×10⁻⁴ Pa generated pressure, the expanded uncertainty is -6 Pa and the relative expanded uncertainty is -2 . Within uncertainty limits our bilateral and key comparisons [CCM.P-K4 (10 Pa-1 kPa)] are extensive and in good agreement with those of other nations (Fig. 8 and Table V)
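
As a generic illustration of how such an expanded uncertainty is assembled under the ISO GUM (not KRISS's actual budget), uncorrelated contributions are combined in quadrature and multiplied by a coverage factor. The budget values below are hypothetical, and k = 2 is the factor commonly quoted for a 95 % level:

```python
import math

def combined_standard_uncertainty(contributions):
    """Root-sum-of-squares of sensitivity-weighted standard uncertainties
    (GUM law of propagation for uncorrelated input quantities).

    contributions: list of (sensitivity_coefficient, standard_uncertainty).
    """
    return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

# hypothetical uncertainty budget for one generated pressure point (Pa)
budget = [
    (1.0, 0.012),   # e.g. temperature correction term
    (1.0, 0.008),   # e.g. volume-ratio term
    (1.0, 0.005),   # e.g. gauge resolution
]
u_c = combined_standard_uncertainty(budget)
U = 2 * u_c        # expanded uncertainty with coverage factor k = 2
```

Dividing U by the generated pressure then gives the relative expanded uncertainty reported in such tables.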

  2. Two-loop renormalization in the standard model, part II. Renormalization procedures and computational techniques

    Energy Technology Data Exchange (ETDEWEB)

    Actis, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Passarino, G. [Torino Univ. (Italy). Dipt. di Fisica Teorica; INFN, Sezione di Torino (Italy)

    2006-12-15

    In part I general aspects of the renormalization of a spontaneously broken gauge theory have been introduced. Here, in part II, two-loop renormalization is introduced and discussed within the context of the minimal Standard Model. Therefore, this paper deals with the transition between bare parameters and fields to renormalized ones. The full list of one- and two-loop counterterms is shown and it is proven that, by a suitable extension of the formalism already introduced at the one-loop level, two-point functions suffice in renormalizing the model. The problem of overlapping ultraviolet divergencies is analyzed and it is shown that all counterterms are local and of polynomial nature. The original program of 't Hooft and Veltman is at work. Finite parts are written in a way that allows for a fast and reliable numerical integration with all collinear logarithms extracted analytically. Finite renormalization, the transition between renormalized parameters and physical (pseudo-)observables, are discussed in part III where numerical results, e.g. for the complex poles of the unstable gauge bosons, are shown. An attempt is made to define the running of the electromagnetic coupling constant at the two-loop level. (orig.)

  3. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  4. Uncertainty contributions to low flow projections in Austria

    Science.gov (United States)

    Parajka, J.; Blaschke, A. P.; Blöschl, G.; Haslinger, K.; Hepp, G.; Laaha, G.; Schöner, W.; Trautvetter, H.; Viglione, A.; Zessner, M.

    2015-11-01

    The main objective of the paper is to understand the contributions to the uncertainty in low flow projections resulting from hydrological model uncertainty and climate projection uncertainty. Model uncertainty is quantified by different parameterizations of a conceptual semi-distributed hydrologic model (TUWmodel) using 11 objective functions in three different decades (1976-1986, 1987-1997, 1998-2008), which allows disentangling the effect of modeling uncertainty and temporal stability of model parameters. Climate projection uncertainty is quantified by four future climate scenarios (ECHAM5-A1B, A2, B1 and HADCM3-A1B) using a delta change approach. The approach is tested for 262 basins in Austria. The results indicate that the seasonality of the low flow regime is an important factor affecting the performance of model calibration in the reference period and the uncertainty of Q95 low flow projections in the future period. In Austria, the calibration uncertainty in terms of Q95 is larger in basins with a summer low flow regime than in basins with a winter low flow regime. Using different calibration periods may result in a range of up to 60 % in simulated Q95 low flows. The low flow projections show an increase of low flows in the Alps, typically in the range of 10-30 %, and a decrease in the south-eastern part of Austria, mostly in the range of -5 to -20 %, for the period 2021-2050 relative to the reference period 1976-2008. The change in seasonality varies between scenarios, but there is a tendency for earlier low flows in the Northern Alps and later low flows in Eastern Austria. In 85 % of the basins, the uncertainty in Q95 from model calibration is larger than the uncertainty from different climate scenarios. The total uncertainty of Q95 projections is largest in basins with a winter low flow regime and, in some basins, exceeds 60 %. In basins with summer low flows, the total uncertainty is mostly less than 20 %. While the calibration uncertainty dominates over climate
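
The delta change approach used above can be sketched as scaling an observed reference series by GCM-derived monthly change factors (future-to-reference ratios). The factors and values below are hypothetical:

```python
def delta_change(observed, monthly_factors):
    """Apply multiplicative monthly delta-change factors to an
    observed series of (month, value) pairs, producing the
    perturbed series used to drive a hydrological model."""
    return [(month, value * monthly_factors[month])
            for month, value in observed]
```

For example, a wetter January (factor > 1) and a drier July (factor < 1) from a climate scenario translate directly into the perturbed forcing series.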

  5. Handling Uncertainties within R&D Modules of a developing Technology

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Lindgaard; Perunovic, Zoran

    2004-01-01

    … and an interview conducted have generated the following. First, the further along the process train a module is, the more it accumulates uncertainties from previous modules. Second, with the growth of complexity, uncertainties grew as well, resulting in the necessity for companies to seek knowledge on them externally… Third, the modules that have always been present in the insulin R&D enabled companies to develop mechanisms for internal learning and to master that part of the process. Finally, in the R&D, outsourcing is related to the whole knowledge acquisition, while it seems that minor uncertainties…

  6. Nuclear physics II

    International Nuclear Information System (INIS)

    Elze, T.

    1988-01-01

    This script, consisting of two parts, contains the matter of the courses Nuclear Physics I and II as they were presented in the winter term 1987/88 and the summer term 1988 for students of physics at Frankfurt University. In the present part II the matter of the summer term is summarized. (orig.) [de

  7. Quantification of Uncertainty in Thermal Building Simulation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Haghighat, F.; Frier, Christian

    In order to quantify uncertainty in thermal building simulation, stochastic modelling is applied to a building model. An application of stochastic differential equations is presented in Part 1, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine

  8. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    Science.gov (United States)

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and the influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty in order to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  9. Hierarchical Bayesian analysis to incorporate age uncertainty in growth curve analysis and estimates of age from length: Florida manatee (Trichechus manatus) carcasses

    Science.gov (United States)

    Schwarz, L.K.; Runge, M.C.

    2009-01-01

    Age estimation of individuals is often an integral part of species management research, and a number of age-estimation techniques are commonly employed. Often, the error in these techniques is not quantified or accounted for in other analyses, particularly in growth curve models used to describe physiological responses to environment and human impacts. Also, noninvasive, quick, and inexpensive methods to estimate age are needed. This research aims to provide two Bayesian methods to (i) incorporate age uncertainty into an age-length Schnute growth model and (ii) produce a method from the growth model to estimate age from length. The methods are then employed for Florida manatee (Trichechus manatus) carcasses. After quantifying the uncertainty in the aging technique (counts of ear bone growth layers), we fit age-length data to the Schnute growth model separately by sex and season. Independent prior information about population age structure and the results of the Schnute model are then combined to estimate age from length. Results describing the age-length relationship agree with our understanding of manatee biology. The new methods allow us to estimate age, with quantified uncertainty, for 98% of collected carcasses: 36% from ear bones, 62% from length.
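
The idea of carrying parameter uncertainty through to an age-from-length estimate can be sketched with a Monte Carlo inversion of a growth curve. The sketch below uses a simple von Bertalanffy form with hypothetical parameter distributions, not the Schnute model or the manatee data of the study:

```python
import math
import random

def age_from_length(length, n=5000, seed=1):
    """Monte Carlo inversion of a von Bertalanffy-type growth curve
    L(t) = Linf * (1 - exp(-k * t)): sample growth parameters,
    invert for age, and summarize the resulting age distribution
    as (median, 95% interval). Parameter distributions are hypothetical."""
    rng = random.Random(seed)
    ages = []
    for _ in range(n):
        linf = rng.gauss(350.0, 10.0)   # asymptotic length (cm)
        k = rng.gauss(0.27, 0.03)       # growth rate (1/yr)
        if 0.0 < length < linf and k > 0.0:
            ages.append(-math.log(1.0 - length / linf) / k)
    ages.sort()
    median = ages[len(ages) // 2]
    interval = (ages[int(0.025 * len(ages))], ages[int(0.975 * len(ages))])
    return median, interval
```

The width of the returned interval is the point of the exercise: it makes the age uncertainty explicit instead of reporting a single back-calculated age.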

  10. Roots/Routes: Part II

    Science.gov (United States)

    Swanson, Dalene M.

    2009-01-01

    This narrative acts as an articulation of a journey of many routes. Following Part I of the same research journey of rootedness/routedness, it debates the nature of transformation and transcendence beyond personal and political paradoxes informed by neoliberalism and related repressive globalizing discourses. Through a more personal, descriptive,…

  11. The neurobiology of uncertainty: implications for statistical learning.

    Science.gov (United States)

    Hasson, Uri

    2017-01-05

    The capacity for assessing the degree of uncertainty in the environment relies on estimating statistics of temporally unfolding inputs. This, in turn, allows calibration of predictive and bottom-up processing, and signalling changes in temporally unfolding environmental features. In the last decade, several studies have examined how the brain codes for and responds to input uncertainty. Initial neurobiological experiments implicated frontoparietal and hippocampal systems, based largely on paradigms that manipulated distributional features of visual stimuli. However, later work in the auditory domain pointed to different systems, whose activation profiles have interesting implications for computational and neurobiological models of statistical learning (SL). This review begins by briefly recapping the historical development of ideas pertaining to the sensitivity to uncertainty in temporally unfolding inputs. It then discusses several issues at the interface of studies of uncertainty and SL. Following, it presents several current treatments of the neurobiology of uncertainty and reviews recent findings that point to principles that serve as important constraints on future neurobiological theories of uncertainty, and relatedly, SL. This review suggests it may be useful to establish closer links between neurobiological research on uncertainty and SL, considering particularly mechanisms sensitive to local and global structure in inputs, the degree of input uncertainty, the complexity of the system generating the input, learning mechanisms that operate on different temporal scales and the use of learnt information for online prediction. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  12. Estimation of uncertainty in pKa values determined by potentiometric titration.

    Science.gov (United States)

    Koort, Eve; Herodes, Koit; Pihl, Viljar; Leito, Ivo

    2004-06-01

    A procedure is presented for estimating the uncertainty in the measurement of the pK(a) of a weak acid by potentiometric titration. The procedure is based on the ISO GUM. The core of the procedure is a mathematical model that involves 40 input parameters. A novel approach is used for taking into account the purity of the acid: the impurities are not treated as inert compounds only; their possible acidic dissociation is also taken into account. Application to an example of practical pK(a) determination is presented. Altogether 67 different sources of uncertainty are identified and quantified within the example. The relative importance of the different uncertainty sources is discussed. The most important source of uncertainty (with the experimental set-up of the example) is the uncertainty of the pH measurement, followed by the accuracy of the burette and the uncertainty of weighing. The procedure gives the uncertainty separately for each point of the titration curve. The uncertainty depends on the amount of titrant added, being lowest in the central part of the titration curve. The possibilities of reducing the uncertainty and of interpreting the drift of the pK(a) values obtained from the same curve are discussed.
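
    The combined standard uncertainty described above follows the ISO GUM law of propagation. A minimal sketch, with an invented four-component budget (the sensitivity coefficients and input uncertainties below are illustrative placeholders, not the paper's 40-parameter model):

```python
import math

# Hypothetical uncertainty budget for a pKa determination (illustrative
# values only): each entry is (sensitivity coefficient c_i, standard
# uncertainty u_i of the input quantity).
budget = {
    "pH calibration": (1.00, 0.010),
    "burette volume": (0.80, 0.006),
    "weighing":       (0.50, 0.004),
    "temperature":    (0.30, 0.005),
}

# ISO GUM law of propagation for uncorrelated inputs:
# u_c^2 = sum_i (c_i * u_i)^2
u_c = math.sqrt(sum((c * u) ** 2 for c, u in budget.values()))
print(f"combined standard uncertainty u_c(pKa) = {u_c:.4f}")
```

    As in the paper, the dominant contribution here is the pH term; dropping any single small component changes u_c only marginally, which is why the relative importance of sources matters.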

  13. Programming: an interim report on the SETL project. Part I: generalities. Part II: the SETL language and examples of its use

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, J T

    1975-06-01

    A summary of work during the past several years on SETL, a new programming language drawing its dictions and basic concepts from the mathematical theory of sets, is presented. The work was started with the idea that a programming language modeled after an appropriate version of the formal language of mathematics might allow a programming style with some of the succinctness of mathematics, and that this might ultimately enable one to express and experiment with more complex algorithms than are now within reach. Part I discusses the general approach followed in the work. Part II focuses directly on the details of the SETL language as it is now defined. It describes the facilities of SETL, includes short libraries of miscellaneous algorithms and of code-optimization algorithms illustrating the use of SETL, and gives a detailed description of the manner in which the set-theoretic primitives provided by SETL are currently implemented. (RWR)

  14. THE PROPAGATION OF UNCERTAINTIES IN STELLAR POPULATION SYNTHESIS MODELING. II. THE CHALLENGE OF COMPARING GALAXY EVOLUTION MODELS TO OBSERVATIONS

    International Nuclear Information System (INIS)

    Conroy, Charlie; Gunn, James E.; White, Martin

    2010-01-01

    Models for the formation and evolution of galaxies readily predict physical properties such as star formation rates, metal-enrichment histories, and, increasingly, gas and dust content of synthetic galaxies. Such predictions are frequently compared to the spectral energy distributions of observed galaxies via the stellar population synthesis (SPS) technique. Substantial uncertainties in SPS exist, and yet their relevance to the task of comparing galaxy evolution models to observations has received little attention. In the present work, we begin to address this issue by investigating the importance of uncertainties in stellar evolution, the initial stellar mass function (IMF), and dust and interstellar medium (ISM) properties on the translation from models to observations. We demonstrate that these uncertainties translate into substantial uncertainties in the ultraviolet, optical, and near-infrared colors of synthetic galaxies. Aspects that carry significant uncertainties include the logarithmic slope of the IMF above 1 M_sun, the dust attenuation law, the molecular cloud disruption timescale, the clumpiness of the ISM, the fraction of unobscured starlight, and the treatment of advanced stages of stellar evolution, including blue stragglers, the horizontal branch, and the thermally pulsating asymptotic giant branch. The interpretation of the resulting uncertainties in the derived colors is highly non-trivial because many of the uncertainties are likely systematic, and possibly correlated with the physical properties of galaxies. We therefore urge caution when comparing models to observations.

  15. A GLUE uncertainty analysis of a drying model of pharmaceutical granules

    DEFF Research Database (Denmark)

    Mortier, Séverine Thérèse F.C.; Van Hoey, Stijn; Cierkens, Katrijn

    2013-01-01

    unit, which is part of the full continuous from-powder-to-tablet manufacturing line (Consigma™, GEA Pharma Systems). A validated model describing the drying behaviour of a single pharmaceutical granule in two consecutive phases is used. First of all, the effect of the assumptions at the particle level...... on the prediction uncertainty is assessed. Secondly, the paper focuses on the influence of the most sensitive parameters in the model. Finally, a combined analysis (particle level plus most sensitive parameters) is performed and discussed. To propagate the uncertainty originating from the parameter uncertainty...

  16. Fractional Programming for Communication Systems—Part II: Uplink Scheduling via Matching

    Science.gov (United States)

    Shen, Kaiming; Yu, Wei

    2018-05-01

    This two-part paper develops novel methodologies for using fractional programming (FP) techniques to design and optimize communication systems. Part I of this paper proposes a new quadratic transform for FP and treats its application for continuous optimization problems. In this Part II of the paper, we study discrete problems, such as those involving user scheduling, which are considerably more difficult to solve. Unlike the continuous problems, discrete or mixed discrete-continuous problems normally cannot be recast as convex problems. In contrast to the common heuristic of relaxing the discrete variables, this work reformulates the original problem in an FP form amenable to distributed combinatorial optimization. The paper illustrates this methodology by tackling the important and challenging problem of uplink coordinated multi-cell user scheduling in wireless cellular systems. Uplink scheduling is more challenging than downlink scheduling, because uplink user scheduling decisions significantly affect the interference pattern in nearby cells. Further, the discrete scheduling variable needs to be optimized jointly with continuous variables such as transmit power levels and beamformers. The main idea of the proposed FP approach is to decouple the interaction among the interfering links, thereby permitting a distributed and joint optimization of the discrete and continuous variables with provable convergence. The paper shows that the well-known weighted minimum mean-square-error (WMMSE) algorithm can also be derived from a particular use of FP; but our proposed FP-based method significantly outperforms WMMSE when discrete user scheduling variables are involved, both in terms of run-time efficiency and optimization results.

  17. Noncardiac findings on cardiac CT. Part II: spectrum of imaging findings.

    LENUS (Irish Health Repository)

    Killeen, Ronan P

    2012-02-01

    Cardiac computed tomography (CT) has evolved into an effective imaging technique for the evaluation of coronary artery disease in selected patients. Two distinct advantages over other noninvasive cardiac imaging methods include its ability to directly evaluate the coronary arteries and to provide a unique opportunity to evaluate for alternative diagnoses by assessing the extracardiac structures, such as the lungs and mediastinum, particularly in patients presenting with the chief symptom of acute chest pain. Some centers reconstruct a small field of view (FOV) cropped around the heart but a full FOV (from skin to skin in the area irradiated) is obtainable in the raw data of every scan so that clinically relevant noncardiac findings are identifiable. Debate in the scientific community has centered on the necessity for this large FOV. A review of noncardiac structures provides the opportunity to make alternative diagnoses that may account for the patient's presentation or to detect important but clinically silent problems such as lung cancer. Critics argue that the yield of biopsy-proven cancers is low and that the follow-up of incidental noncardiac findings is expensive, resulting in increased radiation exposure and possibly unnecessary further testing. In this 2-part review we outline the issues surrounding the concept of the noncardiac read, looking for noncardiac findings on cardiac CT. Part I focused on the pros and cons for and against the practice of identifying noncardiac findings on cardiac CT. Part II illustrates the imaging spectrum of cardiac CT appearances of benign and malignant noncardiac pathology.

  18. Treating Uncertainties in a Nuclear Seismic Probabilistic Risk Assessment by Means of the Dempster-Shafer Theory of Evidence

    International Nuclear Information System (INIS)

    Lo, Chungkung; Pedroni, N.; Zio, E.

    2014-01-01

    The analyses carried out within the Seismic Probabilistic Risk Assessments (SPRAs) of Nuclear Power Plants (NPPs) are affected by significant aleatory and epistemic uncertainties. These uncertainties have to be represented and quantified coherently with the data, information and knowledge available, to provide reasonable assurance that related decisions can be taken robustly and with confidence. The amount of data, information and knowledge available for seismic risk assessment is typically limited, so that the analysis must strongly rely on expert judgments. In this paper, a Dempster-Shafer Theory (DST) framework for handling uncertainties in NPP SPRAs is proposed and applied to an example case study. The main contributions of this paper are twofold: (i) applying the complete DST framework to SPRA models, showing how to build the Dempster-Shafer structures of the uncertainty parameters based on industry generic data, and (ii) embedding Bayesian updating based on plant specific data into the framework. The results of the application to a case study show that the approach is feasible and effective in (i) describing and jointly propagating aleatory and epistemic uncertainties in SPRA models and (ii) providing 'conservative' bounds on the safety quantities of interest (i.e. Core Damage Frequency, CDF) that reflect the (limited) state of knowledge of the experts about the system of interest.
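
    Dempster's rule of combination, the core operation of the DST framework mentioned above, can be sketched in a few lines. The frame, focal elements and masses below are hypothetical, not the paper's industry-data-derived structures:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two basic probability assignments whose
    focal elements are frozensets; mass on empty intersections (conflict)
    is renormalized away."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two hypothetical expert assignments over a frame {"low", "high"}
low, high, both = frozenset({"low"}), frozenset({"high"}), frozenset({"low", "high"})
m_expert1 = {low: 0.6, both: 0.4}                 # partial ignorance
m_expert2 = {low: 0.3, high: 0.3, both: 0.4}
m12 = combine(m_expert1, m_expert2)

# Belief/plausibility give the 'conservative' bounds for "low"
bel_low = sum(v for s, v in m12.items() if s <= low)
pl_low = sum(v for s, v in m12.items() if s & low)
print(f"Bel(low) = {bel_low:.3f}, Pl(low) = {pl_low:.3f}")
```

    The gap between belief and plausibility is what the abstract calls the bounds reflecting the experts' limited state of knowledge.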

  19. Treating Uncertainties in a Nuclear Seismic Probabilistic Risk Assessment by Means of the Dempster-Shafer Theory of Evidence

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Chungkung [Chair on Systems Science and the Energetic Challenge, Paris (France); Pedroni, N.; Zio, E. [Politecnico di Milano, Milano (Italy)

    2014-02-15

    The analyses carried out within the Seismic Probabilistic Risk Assessments (SPRAs) of Nuclear Power Plants (NPPs) are affected by significant aleatory and epistemic uncertainties. These uncertainties have to be represented and quantified coherently with the data, information and knowledge available, to provide reasonable assurance that related decisions can be taken robustly and with confidence. The amount of data, information and knowledge available for seismic risk assessment is typically limited, so that the analysis must strongly rely on expert judgments. In this paper, a Dempster-Shafer Theory (DST) framework for handling uncertainties in NPP SPRAs is proposed and applied to an example case study. The main contributions of this paper are twofold: (i) applying the complete DST framework to SPRA models, showing how to build the Dempster-Shafer structures of the uncertainty parameters based on industry generic data, and (ii) embedding Bayesian updating based on plant specific data into the framework. The results of the application to a case study show that the approach is feasible and effective in (i) describing and jointly propagating aleatory and epistemic uncertainties in SPRA models and (ii) providing 'conservative' bounds on the safety quantities of interest (i.e. Core Damage Frequency, CDF) that reflect the (limited) state of knowledge of the experts about the system of interest.

  20. COMMUNICATING THE PARAMETER UNCERTAINTY IN THE IQWIG EFFICIENCY FRONTIER TO DECISION-MAKERS

    Science.gov (United States)

    Stollenwerk, Björn; Lhachimi, Stefan K; Briggs, Andrew; Fenwick, Elisabeth; Caro, Jaime J; Siebert, Uwe; Danner, Marion; Gerber-Grote, Andreas

    2015-01-01

    The Institute for Quality and Efficiency in Health Care (IQWiG) developed—in a consultation process with an international expert panel—the efficiency frontier (EF) approach to satisfy a range of legal requirements for economic evaluation in Germany's statutory health insurance system. The EF approach is distinctly different from other health economic approaches. Here, we evaluate established tools for assessing and communicating parameter uncertainty in terms of their applicability to the EF approach. Among these are tools that perform the following: (i) graphically display overall uncertainty within the IQWiG EF (scatter plots, confidence bands, and contour plots) and (ii) communicate the uncertainty around the reimbursable price. We found that, within the EF approach, most established plots were not always easy to interpret. Hence, we propose the use of price reimbursement acceptability curves—a modification of the well-known cost-effectiveness acceptability curves. Furthermore, it emerges that the net monetary benefit allows an intuitive interpretation of parameter uncertainty within the EF approach. This research closes a gap for handling uncertainty in the economic evaluation approach of the IQWiG methods when using the EF. However, the precise consequences of uncertainty when determining prices are yet to be defined. © 2014 The Authors. Health Economics published by John Wiley & Sons Ltd. PMID:24590819
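
    The net monetary benefit and its acceptability-curve reading of parameter uncertainty can be illustrated with a small Monte Carlo sketch. All distributions and thresholds below are invented for illustration; the IQWiG EF machinery itself is not reproduced here:

```python
import random

random.seed(1)

# Draw joint samples of incremental effect and incremental cost
# (hypothetical normal distributions, purely illustrative).
def simulate(n=10000):
    return [(random.gauss(0.5, 0.2),      # incremental effect (e.g. QALYs)
             random.gauss(10000, 3000))   # incremental cost (e.g. EUR)
            for _ in range(n)]

draws = simulate()

# Acceptability curve: P(NMB > 0) as a function of the willingness-to-pay
# threshold lambda, where NMB = lambda * effect - cost.
probs = {}
for lam in (10000, 20000, 40000):
    probs[lam] = sum(1 for e, c in draws if lam * e - c > 0) / len(draws)
    print(f"P(NMB > 0 | lambda = {lam}) = {probs[lam]:.2f}")
```

    Reading the curve at a candidate reimbursable price plays the role the abstract assigns to price reimbursement acceptability curves: the probability of a positive net benefit rises with the threshold.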

  1. Uncertainty quantification in reactor physics using adjoint/perturbation techniques and adaptive spectral methods

    NARCIS (Netherlands)

    Gilli, L.

    2013-01-01

    This thesis presents the development and the implementation of an uncertainty propagation algorithm based on the concept of spectral expansion. The first part of the thesis is dedicated to the study of uncertainty propagation methodologies and to the analysis of spectral techniques. The concepts

  2. The Historiography of British Imperial Education Policy, Part II: Africa and the Rest of the Colonial Empire

    Science.gov (United States)

    Whitehead, Clive

    2005-01-01

    Part II of this historiographical study examines British education policy in Africa, and in the many crown colonies, protectorates, and mandated territories around the globe. Up until 1920, the British government took far less interest in the development of schooling in Africa and the rest of the colonial empire than it did in India, and education was…

  3. Precise Wavelengths and Energy Levels for the Spectra of Cr I, Mn I, and Mn III, and Branching Fractions for the Spectra of Fe II and Cr II

    Science.gov (United States)

    Nave, Gillian

    I propose to measure wavelengths and energy levels for the spectra of Cr I, Mn I, and Mn III covering the wavelength range 80 nm to 5500 nm, and oscillator strengths for Fe II and Cr II in the region 120 nm to 2500 nm. I shall also produce intensity calibrated atlases and linelists of the iron-neon and chromium-neon hollow cathode lamps that can be compared with astrophysical spectra. The spectra will be obtained from archival data from spectrometers at NIST and Kitt Peak National Observatory and additional experimental observations as necessary from Fourier transform (FT) and grating spectrometers at NIST. The wavelength uncertainty of the strong lines will be better than 1 part in 10^7. The radiometric calibration of the spectra will be improved in order to reduce the uncertainty of measured oscillator strengths in the near UV region and extend the wavelength range of these measurements down to 120 nm. These will complement and support the measurements of lifetimes and branching fractions by J. E. Lawler in the near UV region. An intensive effort by NIST and Imperial College London that was partly funded by previous NASA awards has resulted in comprehensive analyses of the spectra of Fe II, Cr II and Cu II, with similar analyses of Mn II, Ni II, and Sc II underway. The species included in this proposal will complete the analysis of the first two ionization stages of the elements titanium through nickel using the same techniques, and add the spectrum of Mn III - one of the most important doubly-ionized elements. The elements Cr I and Mn I give large numbers of spectral lines in spectra of cool stars and important absorption lines in the interstellar medium. The spectrum of Mn III is important in chemically peculiar stars and can often only be studied in the UV region. Analyses of many stellar spectra depend on comprehensive analyses of iron-group elements and are hampered by incomplete spectroscopic data. As a result of many decades of work by the group at the

  4. DOE program guide for universities and other research groups. Part I. DOE Research and Development Programs; Part II. DOE Procurement and Assistance Policies/Procedures

    Energy Technology Data Exchange (ETDEWEB)

    1980-03-01

    This guide addresses the DOE responsibility for fostering advanced research and development of all energy resources, both current and potential. It is intended to provide, in a single publication, all the fundamental information needed by an institution to develop a potential working relationship with DOE. Part I describes DOE research and development programs and facilities, and identifies areas of additional research needs and potential areas for new research opportunities. It also summarizes budget data and identifies the DOE program information contacts for each program. Part II provides researchers and research administrators with an introduction to the DOE administrative policies and procedures for submission and evaluation of proposals and the administration of resulting grants, cooperative agreements, and research contracts. (RWR)

  5. Assessing and addressing moral distress and ethical climate Part II: neonatal and pediatric perspectives.

    Science.gov (United States)

    Sauerland, Jeanie; Marotta, Kathleen; Peinemann, Mary Anne; Berndt, Andrea; Robichaux, Catherine

    2015-01-01

    Moral distress remains a pervasive and, at times, contested concept in nursing and other health care disciplines. Ethical climate, the conditions and practices in which ethical situations are identified, discussed, and decided, has been shown to exacerbate or ameliorate perceptions of moral distress. The purpose of this mixed-methods study was to explore perceptions of moral distress, moral residue, and ethical climate among registered nurses working in an academic medical center. Two versions of the Moral Distress Scale in addition to the Hospital Ethical Climate Survey were used, and participants were invited to respond to 2 open-ended questions. Part I reported the findings among nurses working in adult acute and critical care units. Part II presents the results from nurses working in pediatric/neonatal units. Significant differences in findings between the 2 groups are discussed. Subsequent interventions developed are also presented.

  6. Uncertainties in the simulation of groundwater recharge at different scales

    Directory of Open Access Journals (Sweden)

    H. Bogena

    2005-01-01

    Full Text Available Digital spatial data always imply some kind of uncertainty. The source of this uncertainty can be found in their compilation as well as the conceptual design that causes a more or less exact abstraction of the real world, depending on the scale under consideration. Within the framework of hydrological modelling, in which numerous data sets from diverse sources of uneven quality are combined, the various uncertainties are accumulated. In this study, the GROWA model is taken as an example to examine the effects of different types of uncertainties on the calculated groundwater recharge. Distributed input errors are determined for the parameters slope and aspect using a Monte Carlo approach. Landcover classification uncertainties are analysed by using the conditional probabilities of a remote sensing classification procedure. The uncertainties of data ensembles at different scales and study areas are discussed. The present uncertainty analysis showed that the Gaussian error propagation method is a useful technique for analysing the influence of input data on the simulated groundwater recharge. The uncertainties involved in the land use classification procedure and the digital elevation model can be significant in some parts of the study area. However, for the specific model used in this study it was shown that the precipitation uncertainties have the greatest impact on the total groundwater recharge error.
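
    The Monte Carlo treatment of a distributed input error, as applied above to slope and aspect, can be sketched as follows. The recharge function is a toy stand-in, not the GROWA model, and all parameter values are invented:

```python
import random
import statistics

random.seed(0)

# Toy recharge model: steeper terrain diverts more precipitation to runoff.
def recharge(precip_mm, slope_deg):
    runoff_frac = min(0.9, 0.1 + 0.02 * slope_deg)
    return precip_mm * (1.0 - runoff_frac)

precip = 800.0                    # mm/yr, assumed exactly known here
slope_mean, slope_sd = 5.0, 1.5   # degrees; hypothetical DEM-derived error

# Propagate the slope error by repeated model evaluation
samples = [recharge(precip, random.gauss(slope_mean, slope_sd))
           for _ in range(20000)]
print(f"recharge = {statistics.mean(samples):.1f} "
      f"+/- {statistics.stdev(samples):.1f} mm/yr")
```

    For this linear toy model the Monte Carlo spread matches the Gaussian error propagation result (here roughly 800 * 0.02 * 1.5 = 24 mm/yr), which is the consistency the study exploits.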

  7. Transformation & uncertainty : some thoughts on quantum probability theory, quantum statistics, and natural bundles

    NARCIS (Netherlands)

    Janssens, B.

    2010-01-01

    This PHD thesis is concerned partly with uncertainty relations in quantum probability theory, partly with state estimation in quantum stochastics, and partly with natural bundles in differential geometry. The laws of quantum mechanics impose severe restrictions on the performance of measurement.

  8. Mineral resources of parts of the Departments of Antioquia and Caldas, Zone II, Colombia

    Science.gov (United States)

    Hall, R.B.; Feininger, Tomas; Barrero, L.; Dario, Rico H.; ,; Alvarez, A.

    1970-01-01

    The mineral resources of an area of 40,000 sq km, principally in the Department of Antioquia, but including small parts of the Departments of Caldas, Córdoba, Risaralda, and Tolima, were investigated during the period 1964-68. The area is designated Zone II by the Colombian Inventario Minero Nacional (IMN). The geology of approximately 45 percent of this area, or 18,000 sq km, has been mapped by IMN. Zone II has been a gold producer for centuries, and still produces 75 percent of Colombia's gold. Silver is recovered as a byproduct. Ferruginous laterites have been investigated as potential sources of iron ore but are not commercially exploitable. Nickeliferous laterite on serpentinite near Ure in the extreme northwest corner of the Zone is potentially exploitable, although less promising than similar laterites at Cerro Matoso, north of the Zone boundary. Known deposits of mercury, chromium, manganese, and copper are small and have limited economic potential. Cement raw materials are important among nonmetallic resources, and four companies are engaged in the manufacture of portland cement. The eastern half of Zone II contains large carbonate rock reserves, but poor accessibility is a handicap to greater development at present. Dolomite near Amalfi is quarried for the glass-making and other industries. Clay saprolite is abundant and widely used in making brick and tiles in backyard kilns. Kaolin of good quality near La Union is used by the ceramic industry. Subbituminous coal beds of Tertiary age are an important resource in the western part of the zone and have good potential for greater development. Aggregate materials for construction are varied and abundant. Deposits of sodic feldspar, talc, decorative stone, and silica are exploited on a small scale. Chrysotile asbestos deposits north of Campamento are being developed to supply fiber for Colombia's thriving asbestos-cement industry, which is presently dependent upon imported fiber. Wollastonite and andalusite are

  9. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
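
    As a minimal illustration of the interval-analysis alternative discussed above: when an epistemically uncertain input is known only by its bounds, the output is reported as an interval rather than a distribution. The margin function and bounds below are illustrative, not the notional QMU examples of the presentation:

```python
# Bound a 1-D response over an input interval by dense sampling.
# (Rigorous interval arithmetic would instead track bounds through each
# operation; sampling suffices for this one-dimensional sketch.)
def interval_eval(f, lo, hi, n=1000):
    vals = [f(lo + (hi - lo) * i / n) for i in range(n + 1)]
    return min(vals), max(vals)

# Hypothetical performance margin as a function of an uncertain load x,
# with x known only to lie in [0.5, 1.2]
margin = lambda x: 3.0 - x ** 2
lo, hi = interval_eval(margin, 0.5, 1.2)
print(f"margin lies in [{lo:.3f}, {hi:.3f}]")
```

    A positive lower bound here would certify the margin under pure epistemic uncertainty, with no probability distribution assumed on x.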

  10. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
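
    Two of the propagation options the guide lists, series approximation and Monte Carlo, can be compared on a toy product model; the inputs and their uncertainties below are invented for illustration:

```python
import math
import random
import statistics

random.seed(42)

# Toy model: y = x1 * x2, with independent input uncertainties
x1, u1 = 10.0, 0.5
x2, u2 = 4.0, 0.2

# First-order series (linearized) propagation:
# u_y^2 = (dy/dx1 * u1)^2 + (dy/dx2 * u2)^2 = (x2*u1)^2 + (x1*u2)^2
u_series = math.sqrt((x2 * u1) ** 2 + (x1 * u2) ** 2)

# Monte Carlo propagation: sample the inputs and take the output spread
ys = [random.gauss(x1, u1) * random.gauss(x2, u2) for _ in range(50000)]
u_mc = statistics.stdev(ys)

print(f"series: {u_series:.3f}, Monte Carlo: {u_mc:.3f}")
```

    For this mildly nonlinear model the two estimates agree closely; the small residual difference is the second-order term (u1*u2) that the linearization drops.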

  11. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  12. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. The FIDUCEO project (www.fiduceo.eu) is

  13. Optimizing production under uncertainty

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses potential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept...... of state-contingent production functions and a definition of inputs including both sort of input, activity and allocation technology. It also analyses production decisions where production is combined with trading in state-contingent claims such as insurance contracts. The final part discusses...

  14. A statistical approach to determining the uncertainty of peat thickness

    Directory of Open Access Journals (Sweden)

    J. Torppa

    2011-06-01

    Full Text Available This paper presents statistical studies of peat thickness to define its expected maximum variation ∆dm(∆r) as a function of separation distance ∆r. The aim was to provide an estimate of the observational uncertainty in peat depth due to positioning error, and the prediction uncertainty of the computed model. The data were GPS position and ground penetrating radar depth measurements of six mires in different parts of Finland. The calculated observational uncertainty for Finnish mires in general, caused for example by a 20 m positioning error, is 43 cm in depth with 95 % confidence. The peat depth statistics differed among the six mires, and it is recommended that the mire-specific function ∆dm(∆r) be defined for each individual mire to obtain the best estimate of observational uncertainty. Knowledge of the observational error and the function ∆dm(∆r) should be used in peat depth modelling to define the uncertainty of depth predictions.
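
    The mire-specific function ∆dm(∆r) can be estimated empirically from paired depth observations; a sketch with synthetic transect data (hypothetical values, not the Finnish GPR data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical GPR survey: positions (m) along a transect, peat depths (cm)
x = np.sort(rng.uniform(0, 500, 400))
depth = 150 + 60 * np.sin(x / 80) + rng.normal(0, 10, x.size)

def max_variation(x, depth, dr, q=0.95):
    """Empirical ∆dm(∆r): the q-quantile of |depth difference| over all
    observation pairs separated by at most dr metres."""
    dx = np.abs(x[:, None] - x[None, :])
    dd = np.abs(depth[:, None] - depth[None, :])
    mask = (dx > 0) & (dx <= dr)
    return np.quantile(dd[mask], q)

# Observational depth uncertainty implied by a 20 m positioning error
print(f"∆dm(20 m) ≈ {max_variation(x, depth, 20.0):.0f} cm (95% level)")
```

    With real survey data the same pairwise statistic would be evaluated per mire, as the record recommends.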

  15. Control of uncertain systems by feedback linearization with neural networks augmentation. Part II. Controller validation by numerical simulation

    Directory of Open Access Journals (Sweden)

    Adrian TOADER

    2010-09-01

    Full Text Available The paper was conceived in two parts. Part I, previously published in this journal, highlighted the main steps of adaptive output feedback control for non-affine uncertain systems having a known relative degree. The main paradigm of this approach was feedback linearization (dynamic inversion) with neural network augmentation. Meanwhile, based on new contributions of the authors, a new paradigm, that of the robust servomechanism problem solution, has been added to the controller architecture. The current Part II of the paper presents the validation of the controller hereby obtained, using the longitudinal channel of a hovering VTOL-type aircraft as a mathematical model.

  16. A Quantum Field Approach for Advancing Optical Coherence Tomography Part I: First Order Correlations, Single Photon Interference, and Quantum Noise.

    Science.gov (United States)

    Brezinski, M E

    2018-01-01

    Optical coherence tomography has become an important imaging technology in cardiology and ophthalmology, with other applications under investigation. Major advances in optical coherence tomography (OCT) imaging are likely to occur through a quantum field approach to the technology. In this paper, the first part in a series on the topic, the quantum basis of OCT first order correlations is expressed in terms of full field quantization. Specifically, first order correlations are treated as the linear sum of single photon interferences along indistinguishable paths. Photons and the electromagnetic (EM) field are described in terms of quantum harmonic oscillators. While the author believes that the study of quantum second order correlations, addressed in part II, will lead to greater paradigm shifts in the field, advances from the study of quantum first order correlations are given here. In particular, ranging errors are discussed (with remedies) arising from vacuum fluctuations through the detector port, photon counting errors, and position probability amplitude uncertainty. In addition, the principles of quantum field theory and first order correlations are needed for studying second order correlations in part II.

  18. Mission Plan for the Civilian Radioactive Waste Management Program. Volume I. Part I. Overview and current program plans; Part II. Information required by the Nuclear Waste Policy Act of 1982

    International Nuclear Information System (INIS)

    1985-06-01

    The Mission Plan is divided into two parts. Part I describes the overall goals, objectives, and strategy for the disposal of spent nuclear fuel and high-level waste. It explains that, to meet the directives of the Nuclear Waste Policy Act, the DOE intends to site, design, construct, and start operating a mined geologic repository by January 31, 1998. The Act specifies that the costs of these activities will be borne by the owners and generators of the waste received at the repository. Part I further describes the other components of the waste-management program - monitored retrievable storage, Federal interim storage, and transportation - as well as systems integration activities. Also discussed are institutional plans and activities as well as the program-management system being implemented by the Office of Civilian Radioactive Waste Management. Part II of the Mission Plan presents the detailed information required by Section 301(a) of the Act - key issues and information needs; plans for obtaining the necessary information; potential financial, institutional, and legal issues; plans for the test and evaluation facility; the principal results obtained to date from site investigations; information on the site-characterization programs; information on the waste package; schedules; costs; and socioeconomic impacts. In accordance with Section 301(a) of the Act, Part II is concerned primarily with the repository program.

  19. Learning about Measurement Uncertainties in Secondary Education: A Model of the Subject Matter

    Science.gov (United States)

    Priemer, Burkhard; Hellwig, Julia

    2018-01-01

    Estimating measurement uncertainties is important for experimental scientific work. However, this is very often neglected in school curricula and teaching practice, even though experimental work is seen as a fundamental part of teaching science. In order to call attention to the relevance of measurement uncertainties, we developed a comprehensive…

  20. Review of studies related to uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Rish, W.R.; Marnicio, R.J.

    1988-08-01

    The Environmental Protection Agency's Office of Radiation Programs (ORP) is responsible for regulating, on a national level, the risks associated with technological sources of ionizing radiation in the environment. A critical activity of the ORP is analyzing and evaluating risk. The ORP believes that the analysis of uncertainty should be an integral part of any risk assessment; therefore, the ORP has initiated a project to develop a framework for the treatment of uncertainty in risk analysis. Summaries of recent studies in five areas are presented.

  1. Limited entropic uncertainty as new principle of quantum physics

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.

    2001-01-01

    The Uncertainty Principle (UP) of quantum mechanics discovered by Heisenberg, which constitutes the cornerstone of quantum physics, asserts that there is an irreducible lower bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables. In order to avoid this state-dependence, many authors have proposed using the information entropy as a measure of uncertainty instead of the above standard quantitative formulation of the Heisenberg uncertainty principle. In this paper the Principle of Limited Entropic Uncertainty (LEU-Principle), as a new principle in quantum physics, is proved. Then, consistent experimental tests of the LEU-Principle, obtained by using the available 49 sets of pion-nucleus phase shifts, are presented for both the extensive (q=1) and nonextensive (q=0.5 and q=2.0) cases. Some results obtained by applying the LEU-Principle to diffraction phenomena are also discussed. The main results and conclusions of our paper can be summarized as follows: (i) We introduced a new principle in quantum physics, namely the Principle of Limited Entropic Uncertainty (LEU-Principle). This new principle not only includes in a more general and exact form the old Heisenberg uncertainty principle but also introduces an upper limit on the magnitude of the uncertainty in quantum physics. The LEU-Principle asserts that 'there is an irreducible lower bound as well as an upper bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables for any extensive and nonextensive (q ≥ 0) quantum systems'; (ii) Two important concrete realizations of the LEU-Principle are explicitly obtained in this paper, namely: (a) the LEU-inequalities for the quantum scattering of spinless particles and (b) the LEU-inequalities for diffraction on a single slit of width 2a. In particular, from our general results, in the limit y → +1 we recover in an exact form all the results previously reported. In our paper an...
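
    The entropic reformulation mentioned in this record can be stated compactly; a standard form of the entropic lower bound (the Maassen–Uffink relation; the record's own LEU bounds are not reproduced here) is:

```latex
% Entropic uncertainty relation (Maassen–Uffink form), for observables
% A and B with eigenbases {|a_i>} and {|b_j>}:
H(A) + H(B) \;\ge\; -2\ln c ,
\qquad c = \max_{i,j} \bigl|\langle a_i \mid b_j \rangle\bigr| .
```

    The LEU-Principle described above additionally posits an upper bound on the entropic uncertainty for extensive and nonextensive (q ≥ 0) systems.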

  2. THE SPECTRUM OF Fe II

    Energy Technology Data Exchange (ETDEWEB)

    Nave, Gillian [National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States); Johansson, Sveneric, E-mail: gillian.nave@nist.gov [Lund Observatory, University of Lund, Box 43, SE-22100 Lund (Sweden)

    2013-01-15

    The spectrum of singly ionized iron (Fe II) has been recorded using high-resolution Fourier transform (FT) and grating spectroscopy over the wavelength range 900 Å to 5.5 μm. The spectra were observed in high-current continuous and pulsed hollow cathode discharges using FT spectrometers at the Kitt Peak National Observatory, Tucson, AZ and Imperial College, London and with the 10.7 m Normal Incidence Spectrograph at the National Institute of Standards and Technology. Roughly 12,900 lines were classified using 1027 energy levels of Fe II that were optimized to measured wavenumbers. The wavenumber uncertainties of lines in the FT spectra range from 10⁻⁴ cm⁻¹ for strong lines around 4 μm to 0.05 cm⁻¹ for weaker lines around 1500 Å. The wavelength uncertainty of lines in the grating spectra is 0.005 Å. The ionization energy of (130,655.4 ± 0.4) cm⁻¹ was estimated from the 3d⁶(⁵D)5g and 3d⁶(⁵D)6h levels.
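
    A small sketch converting the quoted wavenumber uncertainties to wavelength uncertainty via λ = 1/ν̃ (vacuum wavelengths assumed; example values approximate, not from the paper's line list):

```python
def wavelength_uncertainty(wavenumber_cm, u_wavenumber_cm):
    """Propagate a wavenumber uncertainty (cm^-1) to wavelength (Å).
    With λ = 1/ν̃, first-order propagation gives u(λ) = u(ν̃)/ν̃²;
    the factor 1e8 converts cm to Å."""
    return u_wavenumber_cm / wavenumber_cm**2 * 1e8

# Strong FT line near 4 µm (ν̃ ≈ 2500 cm^-1), u ≈ 1e-4 cm^-1
print(wavelength_uncertainty(2500.0, 1e-4))   # ≈ 1.6e-3 Å
# Weaker FT line near 1500 Å (ν̃ ≈ 66667 cm^-1), u ≈ 0.05 cm^-1
print(wavelength_uncertainty(66667.0, 0.05))  # ≈ 1.1e-3 Å
```

    The two regimes end up with comparable wavelength uncertainties because the 1/ν̃² factor compensates for the larger wavenumber uncertainty of the short-wavelength lines.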

  3. Quantifying reactor safety margins: Part 1: An overview of the code scaling, applicability, and uncertainty evaluation methodology

    International Nuclear Information System (INIS)

    Boyack, B.E.; Duffey, R.B.; Griffith, P.

    1988-01-01

    In August 1988, the Nuclear Regulatory Commission (NRC) approved the final version of a revised rule on the acceptance of emergency core cooling systems (ECCS) entitled ''Emergency Core Cooling System; Revisions to Acceptance Criteria.'' The revised rule states that an alternate ECCS performance analysis, based on best-estimate methods, may be used to provide more realistic estimates of plant safety margins, provided the licensee quantifies the uncertainty of the estimates and includes that uncertainty when comparing the calculated results with prescribed acceptance limits. To support the revised ECCS rule, the NRC and its contractors and consultants have developed and demonstrated a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology. It is an auditable, traceable, and practical method for combining quantitative analyses and expert opinions to arrive at computed values of uncertainty. This paper provides an overview of the CSAU evaluation methodology and its application to a postulated cold-leg, large-break loss-of-coolant accident in a Westinghouse four-loop pressurized water reactor with 17 × 17 fuel. The code selected for this demonstration of the CSAU methodology was TRAC-PF1/MOD1, Version 14.3. 23 refs., 5 figs., 1 tab

  4. Comparison of the GUM and Monte Carlo methods on the flatness uncertainty estimation in coordinate measuring machine

    Directory of Open Access Journals (Sweden)

    Jalid Abdelilah

    2016-01-01

    Full Text Available In engineering industry, control of manufactured parts is usually done on a coordinate measuring machine (CMM), where a sensor mounted at the end of the machine probes a set of points on the surface to be inspected. Data processing is performed subsequently using software, and the result of this measurement process either confirms or rejects the conformity of the part. Measurement uncertainty is a crucial parameter for making the right decisions, and not taking this parameter into account can therefore sometimes lead to aberrant decisions. The determination of measurement uncertainty on a CMM is a complex task owing to the variety of influencing factors. Through this study, we aim to check whether the uncertainty propagation model developed according to the Guide to the Expression of Uncertainty in Measurement (GUM) is valid; we present here a comparison of the GUM and Monte Carlo methods. This comparison is made to estimate the flatness deviation of a surface belonging to an industrial part and the uncertainty associated with the measurement result.
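
    A toy version of the GUM-versus-Monte-Carlo comparison (hypothetical probed points, not the paper's data; flatness simplified to the max-minus-min of probed heights):

```python
import numpy as np

rng = np.random.default_rng(1)
z = np.array([0.010, 0.014, 0.008, 0.013, 0.011])  # probed heights (mm)
u_z = 0.002                                         # per-point uncertainty (mm)

# GUM-style: d = z_max - z_min with sensitivity coefficients ±1,
# so u(d) = sqrt(2) * u_z if the extreme points are taken as fixed.
u_gum = np.sqrt(2) * u_z

# Monte Carlo (GUM Supplement 1 spirit): perturb every point and
# recompute the deviation in each trial, letting the extremes change.
trials = z + rng.normal(0.0, u_z, size=(100_000, z.size))
d_mc = trials.max(axis=1) - trials.min(axis=1)
u_mc = d_mc.std(ddof=1)

print(f"u_GUM = {u_gum:.4f} mm, u_MC = {u_mc:.4f} mm")
```

    The two estimates differ precisely because the Monte Carlo run lets the identity of the extreme points change between trials, which the simple GUM linearization ignores; that is the kind of discrepancy such comparisons probe.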

  5. Nuclear power plant simulators for operator licensing and training. Part I. The need for plant-reference simulators. Part II. The use of plant-reference simulators

    International Nuclear Information System (INIS)

    Rankin, W.L.; Bolton, P.A.; Shikiar, R.; Saari, L.M.

    1984-05-01

    Part I of this report presents technical justification for the use of plant-reference simulators in the licensing and training of nuclear power plant operators and examines alternatives to the use of plant-reference simulators. The technical rationale is based on research on the use of simulators in other industries, psychological learning and testing principles, expert opinion and user opinion. Part II discusses the central considerations in using plant-reference simulators for licensing examination of nuclear power plant operators and for incorporating simulators into nuclear power plant training programs. Recommendations are presented for the administration of simulator examinations in operator licensing that reflect the goal of maximizing both reliability and validity in the examination process. A series of organizational tasks that promote the acceptance, use, and effectiveness of simulator training as part of the onsite training program is delineated

  6. Dead layer and active volume determination for GERDA Phase II detectors

    Energy Technology Data Exchange (ETDEWEB)

    Lehnert, Bjoern [TU Dresden (Germany); Collaboration: GERDA-Collaboration

    2013-07-01

    The GERDA experiment investigates the neutrinoless double beta decay of ⁷⁶Ge and is currently running Phase I of its physics program. Using the same isotope as the Heidelberg–Moscow (HDM) experiment, GERDA aims to directly test the claim of observation by a subset of the HDM collaboration. For the upgrade to Phase II of the experiment in 2013, the collaboration organized the production of 30 new Broad Energy Germanium (BEGe) type detectors from 35 kg of enriched material and tested their performance in the low background laboratory HADES at SCK.CEN, Belgium. With an additional 20 kg of detectors, GERDA aims to probe the degenerate hierarchy scenario. One of the crucial detector parameters is the active volume (AV) fraction, which directly enters into all physics analyses. This talk presents the methodology of dead layer and AV determination with different calibration sources such as ²⁴¹Am, ¹³³Ba, ⁶⁰Co and ²²⁸Th, and the results obtained for the new Phase II detectors. Furthermore, the AV fraction turned out to be the largest systematic uncertainty in the analysis of Phase I data, which makes it imperative to reduce its uncertainty for Phase II. This talk addresses the major contributions to the AV uncertainty and gives an outlook for improvements in Phase II analysis.

  7. The accountability imperative for quantifying the uncertainty of emission forecasts: evidence from Mexico

    DEFF Research Database (Denmark)

    Puig, Daniel; Morales-Nápoles, Oswaldo; Bakhtiari, Fatemeh

    2017-01-01

    forecasting approaches can reflect prevailing uncertainties. We apply a transparent and replicable method to quantify the uncertainty associated with projections of gross domestic product growth rates for Mexico, a key driver of GHG emissions in the country. We use those projections to produce probabilistic...... forecasts of GHG emissions for Mexico. We contrast our probabilistic forecasts with Mexico’s governmental deterministic forecasts. We show that, because they fail to reflect such key uncertainty, deterministic forecasts are ill-suited for use in target-setting processes. We argue that (i) guidelines should...... be agreed upon, to ensure that governmental forecasts meet certain minimum transparency and quality standards, and (ii) governments should be held accountable for the appropriateness of the forecasting approach applied to prepare governmental forecasts, especially when those forecasts are used to derive...

  8. Uncertainty quantification for proton–proton fusion in chiral effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Acharya, B. [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996 (United States); Carlsson, B.D. [Department of Physics, Chalmers University of Technology, SE-412 96 Göteborg (Sweden); Ekström, A. [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996 (United States); Physics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Department of Physics, Chalmers University of Technology, SE-412 96 Göteborg (Sweden); Forssén, C. [Department of Physics, Chalmers University of Technology, SE-412 96 Göteborg (Sweden); Platter, L., E-mail: lplatter@utk.edu [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996 (United States); Physics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States)

    2016-09-10

    We compute the S-factor of the proton–proton (pp) fusion reaction using chiral effective field theory (χEFT) up to next-to-next-to-leading order (NNLO) and perform a rigorous uncertainty analysis of the results. We quantify the uncertainties due to (i) the computational method used to compute the pp cross section in momentum space, (ii) the statistical uncertainties in the low-energy coupling constants of χEFT, (iii) the systematic uncertainty due to the χEFT cutoff, and (iv) systematic variations in the database used to calibrate the nucleon–nucleon interaction. We also examine the robustness of the polynomial extrapolation procedure, which is commonly used to extract the threshold S-factor and its energy derivatives. By performing a statistical analysis of the polynomial fit of the energy-dependent S-factor at several different energy intervals, we eliminate a systematic uncertainty that can arise from the choice of the fit interval in our calculations. In addition, we explore the statistical correlations between the S-factor and few-nucleon observables such as the binding energies and point-proton radii of ²,³H and ³He as well as the D-state probability and quadrupole moment of ²H, and the β-decay of ³H. We find that, with the state-of-the-art optimization of the nuclear Hamiltonian, the statistical uncertainty in the threshold S-factor cannot be reduced beyond 0.7%.
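
    The fit-interval robustness check described here can be sketched with a toy S-factor curve (hypothetical coefficients and noise level, not the χEFT results):

```python
import numpy as np

rng = np.random.default_rng(2)
S0, S1, S2 = 4.0e-23, 1.1e-23, 3.0e-24   # hypothetical S-factor coefficients
E = np.linspace(0.01, 1.0, 200)           # energies (MeV)
S = S0 + S1 * E + S2 * E**2 + rng.normal(0, 1e-26, E.size)  # noisy "data"

# Fit a quadratic over several energy intervals and compare the
# extrapolated threshold value S(0); a stable intercept across
# intervals indicates the extrapolation is robust to interval choice.
for emax in (0.2, 0.5, 1.0):
    sel = E <= emax
    coef = np.polynomial.polynomial.polyfit(E[sel], S[sel], 2)
    print(f"fit up to {emax} MeV -> S(0) ≈ {coef[0]:.3e}")
```

    In the actual analysis the spread of such intercepts over intervals is folded into the uncertainty budget rather than ignored.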

  9. Uncertainty Assessment of Space-Borne Passive Soil Moisture Retrievals

    Science.gov (United States)

    Quets, Jan; De Lannoy, Gabrielle; Reichle, Rolf; Cosh, Michael; van der Schalie, Robin; Wigneron, Jean-Pierre

    2017-01-01

    The uncertainty associated with passive soil moisture retrieval is hard to quantify, and known to be underlain by various, diverse, and complex causes. Factors affecting space-borne retrieved soil moisture estimation include: (i) the optimization or inversion method applied to the radiative transfer model (RTM), such as e.g. the Single Channel Algorithm (SCA), or the Land Parameter Retrieval Model (LPRM), (ii) the selection of the observed brightness temperatures (Tbs), e.g. polarization and incidence angle, (iii) the definition of the cost function and the impact of prior information in it, and (iv) the RTM parameterization (e.g. parameterizations officially used by the SMOS L2 and SMAP L2 retrieval products, ECMWF-based SMOS assimilation product, SMAP L4 assimilation product, and perturbations from those configurations). This study aims at disentangling the relative importance of the above-mentioned sources of uncertainty, by carrying out soil moisture retrieval experiments, using SMOS Tb observations in different settings, of which some are mentioned above. The ensemble uncertainties are evaluated at 11 reference CalVal sites, over a time period of more than 5 years. These experimental retrievals were inter-compared, and further confronted with in situ soil moisture measurements and operational SMOS L2 retrievals, using commonly used skill metrics to quantify the temporal uncertainty in the retrievals.
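
    The commonly used skill metrics mentioned at the end of the record typically include bias, RMSD, unbiased RMSD, and Pearson correlation; a minimal sketch with synthetic data (not SMOS retrievals):

```python
import numpy as np

def skill_metrics(retrieved, in_situ):
    """Common soil-moisture skill metrics between a retrieval time
    series and collocated in situ measurements."""
    r = np.asarray(retrieved, float)
    s = np.asarray(in_situ, float)
    bias = np.mean(r - s)
    rmsd = np.sqrt(np.mean((r - s) ** 2))
    # unbiased RMSD: RMSD after removing each series' mean
    ubrmsd = np.sqrt(np.mean(((r - r.mean()) - (s - s.mean())) ** 2))
    corr = np.corrcoef(r, s)[0, 1]
    return {"bias": bias, "RMSD": rmsd, "ubRMSD": ubrmsd, "R": corr}

rng = np.random.default_rng(3)
truth = 0.25 + 0.05 * np.sin(np.linspace(0, 6, 300))       # m^3/m^3
retr = truth + 0.02 + rng.normal(0, 0.03, truth.size)       # biased, noisy
print(skill_metrics(retr, truth))
```

    Comparing such metrics across retrieval configurations is how the ensemble uncertainty described above is quantified at the CalVal sites.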

  10. Transferring diffractive optics from research to commercial applications: Part II - size estimations for selected markets

    Science.gov (United States)

    Brunner, Robert

    2014-04-01

    In a series of two contributions, decisive business-related aspects of the current process status to transfer research results on diffractive optical elements (DOEs) into commercial solutions are discussed. In part I, the focus was on the patent landscape. Here, in part II, market estimations concerning DOEs for selected applications are presented, comprising classical spectroscopic gratings, security features on banknotes, DOEs for high-end applications, e.g., for the semiconductor manufacturing market and diffractive intra-ocular lenses. The derived market sizes are referred to the optical elements, itself, rather than to the enabled instruments. The estimated market volumes are mainly addressed to scientifically and technologically oriented optical engineers to serve as a rough classification of the commercial dimensions of DOEs in the different market segments and do not claim to be exhaustive.

  11. Quantitative Analysis of Uncertainty in Medical Reporting: Part 3: Customizable Education, Decision Support, and Automated Alerts.

    Science.gov (United States)

    Reiner, Bruce I

    2017-12-18

    In order to better elucidate and understand the causative factors and clinical implications of uncertainty in medical reporting, one must first create a referenceable database which records a number of standardized metrics related to uncertainty language, clinical context, technology, and provider and patient data. The resulting analytics can in turn be used to create context and user-specific reporting guidelines, real-time decision support, educational resources, and quality assurance measures. If this technology can be directly integrated into reporting technology and workflow, the goal is to proactively improve clinical outcomes at the point of care.

  12. Structural and parametric uncertainty quantification in cloud microphysics parameterization schemes

    Science.gov (United States)

    van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.; Martinkus, C.

    2017-12-01

    Atmospheric model parameterization schemes employ approximations to represent the effects of unresolved processes. These approximations are a source of error in forecasts, caused in part by considerable uncertainty about the optimal value of parameters within each scheme -- parametric uncertainty. Furthermore, there is uncertainty regarding the best choice of the overarching structure of the parameterization scheme -- structural uncertainty. Parameter estimation can constrain the first, but may struggle with the second because structural choices are typically discrete. We address this problem in the context of cloud microphysics parameterization schemes by creating a flexible framework wherein structural and parametric uncertainties can be simultaneously constrained. Our scheme makes no assumptions about drop size distribution shape or the functional form of parameterized process rate terms. Instead, these uncertainties are constrained by observations using a Markov chain Monte Carlo sampler within a Bayesian inference framework. Our scheme, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), has the flexibility to predict various sets of prognostic drop size distribution moments as well as varying complexity of process rate formulations. We compare idealized probabilistic forecasts from versions of BOSS with varying levels of structural complexity. This work has applications in ensemble forecasts with model physics uncertainty, data assimilation, and cloud microphysics process studies.
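
    The Bayesian MCMC idea behind BOSS can be illustrated with a one-parameter Metropolis sampler on a toy process-rate model (a sketch of the inference machinery only, not the BOSS scheme):

```python
import numpy as np

rng = np.random.default_rng(4)
theta_true = 1.5
x = np.arange(1, 11)
y = theta_true * x + rng.normal(0, 0.5, x.size)  # noisy observations

def log_post(theta):
    """Log-posterior: flat prior, Gaussian likelihood with sigma = 0.5."""
    resid = y - theta * x
    return -0.5 * np.sum(resid**2) / 0.5**2

# Metropolis random walk constraining theta against the observations
theta, chain = 0.5, []
lp = log_post(theta)
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.05)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    chain.append(theta)

post = np.array(chain[5_000:])                  # discard burn-in
print(f"theta ≈ {post.mean():.2f} ± {post.std():.2f}")
```

    In BOSS the same machinery runs over many parameters at once, and structural choices (e.g. how many moments to predict) are compared via the resulting probabilistic forecasts.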

  13. Good Modeling Practice for PAT Applications: Propagation of Input Uncertainty and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Eliasson Lantz, Anna

    2009-01-01

    Uncertainty and sensitivity analysis are evaluated for their usefulness as part of model-building within Process Analytical Technology applications. A mechanistic model describing a batch cultivation of Streptomyces coelicolor for antibiotic production was used as case study. The input...... compared to the large uncertainty observed in the antibiotic and off-gas CO2 predictions. The output uncertainty was observed to be lower during the exponential growth phase, while higher in the stationary and death phases - meaning the model describes some periods better than others. To understand which...... promising for helping to build reliable mechanistic models and to interpret the model outputs properly. These tools form part of good modeling practice, which can contribute to successful PAT applications for increased process understanding, operation and control purposes. © 2009 American Institute...

  14. Adaptive Core Simulation Employing Discrete Inverse Theory - Part II: Numerical Experiments

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Turinsky, Paul J.

    2005-01-01

    Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. The companion paper, ''Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory,'' describes in detail the theoretical background of the proposed adaptive techniques. This paper, Part II, demonstrates several computational experiments conducted to assess the fidelity and robustness of the proposed techniques. The intent is to check the ability of the adapted core simulator model to predict future core observables that are not included in the adaption, or core observables that are recorded at core conditions that differ from those at which adaption is completed. Also, this paper demonstrates successful utilization of an efficient sensitivity analysis approach to calculate the sensitivity information required to perform the adaption for millions of input core parameters. Finally, this paper illustrates a useful application for adaptive simulation - reducing the inconsistencies between two different core simulator code systems, where the multitude of input data to one code is adjusted to enhance the agreement between both codes for important core attributes, i.e., core reactivity and power distribution. Also demonstrated is the robustness of such an application.

  15. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
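
    The quadrature combination of purity, weighing, and volume uncertainties described here can be sketched as follows (illustrative numbers; a simplified relative-uncertainty model, not any vendor's certification procedure):

```python
import numpy as np

def certified_concentration(mass_mg, u_mass_mg, purity, u_purity,
                            volume_ml, u_volume_ml):
    """Concentration of a reference solution prepared from neat material,
    with combined standard uncertainty from purity, weighing, and volume
    (relative uncertainties added in quadrature, per the GUM)."""
    c = mass_mg * purity / volume_ml  # mg/mL
    u_rel = np.sqrt((u_mass_mg / mass_mg) ** 2
                    + (u_purity / purity) ** 2
                    + (u_volume_ml / volume_ml) ** 2)
    return c, c * u_rel

# 10 mg weighed to 0.02 mg, 98.5 % purity known to 0.5 %, 10 mL to 0.02 mL
c, u = certified_concentration(10.0, 0.02, 0.985, 0.005, 10.0, 0.02)
print(f"c = {c:.4f} ± {u:.4f} mg/mL")
```

    In this toy budget the purity term dominates, which mirrors the record's point that understanding which factors a vendor actually included is critical to interpreting a certified value.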

  16. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently, renewed interest has appeared in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs

  17. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded. Estimating measurement uncertainty is often not trivial; several strategies have been developed for this purpose and are briefly described in this chapter. In addition, the different ways of taking uncertainty into account in compliance assessment are explained.
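
    One common way to take uncertainty into account in compliance assessment is a guard-banded decision rule; a minimal sketch (the rule and coverage factor are illustrative, not prescribed by this chapter):

```python
def complies(result, uncertainty, limit, k=2.0):
    """Guarded acceptance against an upper specification limit: declare
    compliance only if result + k*u stays below the limit
    (k = 2 corresponds to roughly 95 % coverage)."""
    return result + k * uncertainty < limit

print(complies(48.0, 0.5, 50.0))  # True:  48.0 + 1.0 < 50.0
print(complies(49.5, 0.5, 50.0))  # False: 49.5 + 1.0 >= 50.0
```

    Other conventions (e.g. shared-risk rules, or guard bands on the rejection side) shift where the decision threshold sits relative to the limit.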

  18. Assessment of SFR Wire Wrap Simulation Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Delchini, Marc-Olivier G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Popov, Emilian L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Swiler, Laura P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-30

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis to three turbulence models was conducted for a turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STAR-CCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results.
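
    The propagation workflow described above (characterize input uncertainties, run the model over samples, summarize the output) can be sketched generically in a few lines. The toy model and input distributions below are stand-ins for illustration only and have nothing to do with the actual Nek5000 or Dakota interfaces:

    ```python
    import random
    import statistics

    def toy_model(friction, roughness):
        # Stand-in for an expensive CFD evaluation (e.g. a pressure-drop metric).
        return friction * (1.0 + 0.5 * roughness)

    random.seed(0)  # reproducible sampling
    # Assumed input characterizations: Gaussian friction factor, uniform roughness.
    outputs = [toy_model(random.gauss(0.02, 0.002), random.uniform(0.0, 0.1))
               for _ in range(10_000)]

    print(f"mean = {statistics.fmean(outputs):.5f}, "
          f"std = {statistics.stdev(outputs):.5f}")
    ```

    Frameworks like Dakota automate exactly this loop (sampling, job dispatch, and output statistics), which is why the coupling interface and post-processing scripts occupy a large part of the report.
    
    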

  19. Probabilistic Mass Growth Uncertainties

    Science.gov (United States)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBEs) of the masses of space instruments as well as spacecraft, for both earth-orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost-driving metrics, on an annual basis. It provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  20. Coal-fired power materials - Part II

    Energy Technology Data Exchange (ETDEWEB)

    Viswanathan, V.; Purgert, R.; Rawls, P. [Electric Power Research Institute, Palo Alto, CA (United States)

    2008-09-15

    Part I discussed some general considerations in the selection of alloys for advanced ultra-supercritical (USC) coal-fired power plant boilers. This second part covers results reported by the US project consortium, which has extensively evaluated the steamside oxidation, fireside corrosion, and fabricability of the alloys selected for USC plants. 3 figs.

  1. TREATING UNCERTAINTIES IN A NUCLEAR SEISMIC PROBABILISTIC RISK ASSESSMENT BY MEANS OF THE DEMPSTER-SHAFER THEORY OF EVIDENCE

    Directory of Open Access Journals (Sweden)

    CHUNG-KUNG LO

    2014-02-01

    The analyses carried out within the Seismic Probabilistic Risk Assessments (SPRAs) of Nuclear Power Plants (NPPs) are affected by significant aleatory and epistemic uncertainties. These uncertainties have to be represented and quantified coherently with the data, information and knowledge available, to provide reasonable assurance that related decisions can be taken robustly and with confidence. The amount of data, information and knowledge available for seismic risk assessment is typically limited, so that the analysis must strongly rely on expert judgments. In this paper, a Dempster-Shafer Theory (DST) framework for handling uncertainties in NPP SPRAs is proposed and applied to an example case study. The main contributions of this paper are twofold: (i) applying the complete DST framework to SPRA models, showing how to build the Dempster-Shafer structures of the uncertainty parameters based on industry generic data, and (ii) embedding Bayesian updating based on plant-specific data into the framework. The results of the application to a case study show that the approach is feasible and effective in (i) describing and jointly propagating aleatory and epistemic uncertainties in SPRA models and (ii) providing 'conservative' bounds on the safety quantities of interest (i.e. Core Damage Frequency, CDF) that reflect the (limited) state of knowledge of the experts about the system of interest.
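
    For readers unfamiliar with DST, the basic building block of such frameworks is Dempster's rule for combining two mass functions over a frame of discernment. A minimal sketch with hypothetical expert masses on a binary frame (the masses are invented, not taken from the paper):

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Dempster's rule: intersect focal elements, multiply masses,
        renormalize by 1 - K, where K is the mass landing on empty sets."""
        combined, conflict = {}, 0.0
        for (a, x), (b, y) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + x * y
            else:
                conflict += x * y
        if conflict >= 1.0:
            raise ValueError("total conflict: combination undefined")
        return {s: v / (1.0 - conflict) for s, v in combined.items()}

    F = frozenset  # frame of discernment: {'fail', 'ok'}
    expert1 = {F({'fail'}): 0.6, F({'fail', 'ok'}): 0.4}
    expert2 = {F({'fail'}): 0.5, F({'ok'}): 0.2, F({'fail', 'ok'}): 0.3}

    fused = dempster_combine(expert1, expert2)
    for focal, mass in sorted(fused.items(), key=lambda kv: -kv[1]):
        print(set(focal), round(mass, 3))
    ```

    Here the unnormalized mass 0.12 assigned to the empty intersection measures the conflict between the two sources; mass assigned to the full frame {'fail', 'ok'} represents epistemic ignorance, which is what distinguishes DST from a purely probabilistic treatment.
    
    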

  2. Review of studies related to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.; Marnicio, R.J.

    1988-08-01

    The Environmental Protection Agency's Office of Radiation Programs (ORP) is responsible for regulating, on a national level, the risks associated with technological sources of ionizing radiation in the environment. A critical activity of the ORP is analyzing and evaluating risk. The ORP believes that the analysis of uncertainty should be an integral part of any risk assessment; therefore, the ORP has initiated a project to develop a framework for the treatment of uncertainty in risk analysis. Summaries of recent studies in five areas are presented.

  3. Dialogue act recognition under uncertainty using Bayesian networks

    NARCIS (Netherlands)

    Keizer, S.; op den Akker, Hendrikus J.A.

    2007-01-01

    In this paper we discuss the task of dialogue act recognition as a part of interpreting user utterances in context. To deal with the uncertainty that is inherent in natural language processing in general and dialogue act recognition in particular we use machine learning techniques to train

  4. Expert opinions on carbon dioxide capture and storage-A framing of uncertainties and possibilities

    International Nuclear Information System (INIS)

    Hansson, Anders; Bryngelsson, Marten

    2009-01-01

    There are many uncertainties and knowledge gaps regarding the development of carbon dioxide capture and storage (CCS)-e.g., when it comes to costs, life-cycle effects, storage capacity and permanence. In spite of these uncertainties and barriers, the CCS research community is generally very optimistic regarding CCS' development. The discrepancy between the uncertainties and the optimism is the point of departure in this study, which is based on interviews with 24 CCS experts. The aim is to analyse experts' framings of CCS with focus on two key aspects: (i) the function and potential of CCS and (ii) uncertainties. The optimism among the CCS experts is tentatively explained. The interpretative flexibility of CCS is claimed to be an essential explanation for the optimism. CCS is promoted from a wide variety of perspectives, e.g., solidarity and peace, bridge to a sustainable energy system, sustaining the modern lifestyle and compatibility with the fossil fuel lock-in. Awareness of the uncertainties and potential over-optimism is warranted within policy and decision making as they often rely on scientific forecasts and experts' judgements.

  5. Expert opinions on carbon dioxide capture and storage-A framing of uncertainties and possibilities

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Anders [Linkoeping University, Department of Technology and Social Change, SE-58183 Linkoeping (Sweden); Linkoeping University, Centre for Climate Science and Policy Research, SE-60174 Norrkoeping (Sweden); Bryngelsson, Marten [KTH, School of Chemical Sciences, Teknikringen 50, SE-10044 Stockholm (Sweden)], E-mail: mrtn@kth.se

    2009-06-15

    There are many uncertainties and knowledge gaps regarding the development of carbon dioxide capture and storage (CCS)-e.g., when it comes to costs, life-cycle effects, storage capacity and permanence. In spite of these uncertainties and barriers, the CCS research community is generally very optimistic regarding CCS' development. The discrepancy between the uncertainties and the optimism is the point of departure in this study, which is based on interviews with 24 CCS experts. The aim is to analyse experts' framings of CCS with focus on two key aspects: (i) the function and potential of CCS and (ii) uncertainties. The optimism among the CCS experts is tentatively explained. The interpretative flexibility of CCS is claimed to be an essential explanation for the optimism. CCS is promoted from a wide variety of perspectives, e.g., solidarity and peace, bridge to a sustainable energy system, sustaining the modern lifestyle and compatibility with the fossil fuel lock-in. Awareness of the uncertainties and potential over-optimism is warranted within policy and decision making as they often rely on scientific forecasts and experts' judgements.

  6. Safety aspects of advanced fuels irradiations in EBR-II

    International Nuclear Information System (INIS)

    Lehto, W.K.

    1975-09-01

    Basic safety questions such as MFCI, loss-of-Na bond, pin behavior during design basis transients, and failure propagation were evaluated as they pertain to advanced fuels in EBR-II. With the exception of pin response to the unlikely loss-of-flow transient, the study indicates that irradiation of significant numbers of advanced fueled subassemblies in EBR-II should pose no safety problems. The analysis predicts, however, that Na boiling may occur during the postulated design basis unlikely loss-of-flow transient in subassemblies containing He-bonded fuel pins with the larger fuel-clad gaps. The calculations indicate that coolant temperatures at the top of the core in the limiting S/A's, containing the He-bonded pins, would reach approximately 1480 °F during the transient without application of uncertainty factors. Inclusion of uncertainties could result in temperature predictions which approach coolant boiling temperatures (1640 °F). Further analysis of He-bonded pins is being done in this potential problem area, e.g., to apply best estimates of uncertainty factors and to determine the sensitivity of the preliminary results to gap conductance.

  7. Treatment of uncertainties in the IPCC: a philosophical analysis

    Science.gov (United States)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics, confidence and likelihood, be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors have the choice to present either an assigned level of confidence or a quantified measure of likelihood. But these two kinds of uncertainty assessment express distinct and conflicting methodologies, so the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories, which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory, are relevant for our purpose because they are commonly used to assess the rationality of common collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify

  8. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Science.gov (United States)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contribution from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to overall uncertainty in streamflow projections using analysis of variance (ANOVA) approach. Generally, most of the impact assessment studies are carried out with unchanging hydrologic model parameters in future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression based methodology is presented to obtain the hydrologic model parameters with changing land use and climate scenarios in future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set-up over the basin, under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that model stationarity assumption and GCMs along with their interactions with emission scenarios, act as dominant sources of uncertainty. 
This paper provides a generalized framework for hydrologists to examine the stationarity assumption of models before considering them
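
    The ANOVA-based segregation of uncertainty sources can be illustrated on a toy two-factor case (GCM × emission scenario) with invented streamflow values; the actual study partitions variance over five sources and their interactions:

    ```python
    import statistics

    # Hypothetical streamflow projections (mm) indexed by [GCM][scenario].
    flows = {
        "GCM-A": {"RCP4.5": 410.0, "RCP8.5": 395.0},
        "GCM-B": {"RCP4.5": 455.0, "RCP8.5": 430.0},
        "GCM-C": {"RCP4.5": 480.0, "RCP8.5": 470.0},
    }
    gcms = list(flows)
    scens = ["RCP4.5", "RCP8.5"]
    grand = statistics.fmean(flows[g][s] for g in gcms for s in scens)

    # Main-effect sums of squares (two-way ANOVA, one value per cell).
    ss_gcm = len(scens) * sum(
        (statistics.fmean(flows[g][s] for s in scens) - grand) ** 2 for g in gcms)
    ss_scen = len(gcms) * sum(
        (statistics.fmean(flows[g][s] for g in gcms) - grand) ** 2 for s in scens)
    ss_total = sum((flows[g][s] - grand) ** 2 for g in gcms for s in scens)
    ss_inter = ss_total - ss_gcm - ss_scen  # interaction / residual

    for name, ss in [("GCM", ss_gcm), ("scenario", ss_scen), ("interaction", ss_inter)]:
        print(f"{name}: {100 * ss / ss_total:.1f}% of variance")
    ```

    In this made-up example the GCM factor dominates the variance, mirroring the study's finding that GCMs (together with the stationarity assumption) are the dominant sources of uncertainty.
    
    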

  9. Tobacco control and gender in south-east Asia. Part II: Singapore and Vietnam.

    Science.gov (United States)

    Morrow, Martha; Barraclough, Simon

    2003-12-01

    In the World Health Organization's Western Pacific Region, being born male is the single greatest risk marker for tobacco use. While the literature demonstrates that risks associated with tobacco use may vary according to sex, gender refers to the socially determined roles and responsibilities of men and women, who initiate, continue and quit using tobacco for complex and often different reasons. Cigarette advertising frequently appeals to gender roles. Yet tobacco control policy tends to be gender-blind. Using a broad, gender-sensitivity framework, this contradiction is explored in four Western Pacific countries. Part I of the study presented the rationale, methodology and design of the study, discussed issues surrounding gender and tobacco, and analysed developments in Malaysia and the Philippines (see the previous issue of this journal). Part II deals with Singapore and Vietnam. In all four countries gender was salient for the initiation and maintenance of smoking. Yet, with a few exceptions, gender was largely unrecognized in control policy. Suggestions for overcoming this weakness in order to enhance tobacco control are made.

  10. [Education in our time: competency or aptitude? The case for medicine. Part II].

    Science.gov (United States)

    Viniegra-Velázquez, Leonardo

    Part II is focused on participatory education (PE), a distinctive way to understand and practice education, in contrast to passive education. The core of PE is to develop each person's own cognitive potentialities, which are frequently mutilated, neglected or ignored. The epistemological and experiential bases of PE are defined: the concept of incisive and creative criticism, the idea of knowledge as each person's own construct, and life experience as the main focus of reflection and cognition. PE aims towards individuals with unprecedented cognitive and creative faculties, capable of approaching a more inclusive and hospitable world. The last part criticizes the fact that medical education has remained within the passive education paradigm. The key role of cognitive aptitudes, both methodological and practical (clinical aptitude), in the progress of medical education and practice is emphasized. As a conclusion, the know-how of education is discussed, aiming towards a better world away from human and planetary degradation. Copyright © 2017 Hospital Infantil de México Federico Gómez. Published by Masson Doyma México S.A. All rights reserved.

  11. Stiffnites. Part II

    Directory of Open Access Journals (Sweden)

    Maria Teresa Pareschi

    2011-06-01

    The dynamics of a stiffnite are here inferred. A stiffnite is a sheet-shaped, gravity-driven submarine sediment flow, with a fabric made up of marine ooze. To infer stiffnite dynamics, order-of-magnitude estimations are used. Field deposits and experiments on materials taken from the literature are also used. Stiffnites can be tens or hundreds of kilometers wide, and a few centimeters/meters thick. They move on the sea slopes over hundreds of kilometers, reaching submarine velocities as high as 100 m/s. Hard grain friction favors grain fragmentation and formation of triboelectrically electrified particles and triboplasma (i.e., ions + electrons). Marine lipids favor isolation of electrical charges. At first, two basic assumptions are introduced, and checked a posteriori: (a) in a flowing stiffnite, magnetic dipole moments develop, with the magnetization proportional to the shear rate (I have named those dipoles Ambigua); (b) Ambigua are 'vertically frozen' along stiffnite streamlines. From (a) and (b), it follows that: (i) Ambigua create a magnetic field (at peak, >1 T). (ii) Lorentz forces sort stiffnite particles into two superimposed sheets. The lower sheet, L+, has a sandy granulometry and a net positive electrical charge density. The upper sheet, L–, has a silty muddy granulometry and a net negative electrical charge density; the grains of sheet L– become finer upwards. (iii) Faraday forces push ferromagnetic grains towards the base of a stiffnite, so that a peak of magnetic susceptibility characterizes a stiffnite deposit. (iv) Stiffnites harden considerably during their motion, due to magnetic confinement. Stiffnite deposits and inferred stiffnite characteristics are compatible with a stable flow behavior against bending, pinch, or other macro instabilities. In the present report, a consistent hypothesis about the nature of Ambigua is provided.

  12. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    International Nuclear Information System (INIS)

    Kirchner, G.; Peterson, R.

    1996-11-01

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test user's influence on model predictions on a more systematic basis than has been done before. The main goals were as follows: To compare differences between predictions from different people all using the same model and the same scenario description with the statistical uncertainties calculated by the model. To investigate the main reasons for different interpretations by users. To create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  13. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Kirchner, G. [Univ. of Bremen (Germany)]; Peterson, R. [AECL, Chalk River, ON (Canada)] [and others]

    1996-11-01

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test user's influence on model predictions on a more systematic basis than has been done before. The main goals were as follows: To compare differences between predictions from different people all using the same model and the same scenario description with the statistical uncertainties calculated by the model. To investigate the main reasons for different interpretations by users. To create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  14. The Tölz Temporal Topography Study: Mapping the visual field across the life span. Part II: Cognitive factors shaping visual field maps

    OpenAIRE

    Poggel, Dorothe A.; Treutwein, Bernhard; Calmanti, Claudia; Strasburger, Hans

    2012-01-01

    Part I described the topography of visual performance over the life span. Performance decline was explained only partly by deterioration of the optical apparatus. Part II therefore examines the influence of higher visual and cognitive functions. Visual field maps of static perimetry, double-pulse resolution (DPR), reaction times, and contrast thresholds for 95 healthy observers were correlated with measures of visual attention (alertness, divided attention, spatial cueing), visual search, an...

  15. Influence of measurement uncertainty on classification of thermal environment in buildings according to European Standard EN 15251

    DEFF Research Database (Denmark)

    Kolarik, Jakub; Olesen, Bjarne W.

    2015-01-01

    European Standard EN 15251 in its current version does not provide any guidance on how to handle uncertainty of long-term measurements of indoor environmental parameters used for classification of buildings. The objective of the study was to analyse the uncertainty for field measurements...... measurements of operative temperature at two measuring points (south/south-west and north/northeast orientation). Results of the present study suggest that measurement uncertainty needs to be considered during assessment of thermal environment in existing buildings. When expanded standard uncertainty was taken...... into account in categorization of thermal environment according to EN 15251, the difference in prevalence of exceeded category limits was up to 17.3%, 8.3% and 2% of occupied hours for categories I, II and III respectively....
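
    How expanded measurement uncertainty changes a categorization count can be shown with a small sketch. The hourly temperatures, the expanded uncertainty U, and the category I band below are invented for illustration; they are not the limits tabulated in EN 15251:

    ```python
    # Hypothetical hourly operative temperatures (deg C), expanded
    # uncertainty U (k = 2), and an assumed category I comfort band.
    temps = [24.0, 24.8, 25.4, 25.6, 26.1, 23.2, 24.9, 25.0]
    U = 0.5
    low, high = 23.5, 25.5

    def exceeded(t, lo, hi, u=0.0):
        # Widen the band by u: count only hours that exceed the limits
        # even after allowing for measurement uncertainty.
        return t < lo - u or t > hi + u

    plain = sum(exceeded(t, low, high) for t in temps)
    with_u = sum(exceeded(t, low, high, U) for t in temps)
    print(f"hours outside category I: {plain} ignoring U, {with_u} accounting for U")
    ```

    Readings within U of a limit stop counting as exceedances once uncertainty is acknowledged, which is the mechanism behind the differences in prevalence reported in the abstract.
    
    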

  16. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    Science.gov (United States)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert

    2016-06-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if other factors are involved. With added experience, uncertainty could be expected to diminish, but perhaps more influential are things like physician confidence, belief in the veracity of what is published, and even one's religious beliefs. In addition, it is plausible that the kind of practice a physician works in can affect the experience of uncertainty. Practicing physicians may not be immediately aware of these effects on how uncertainty is experienced in their clinical decision-making. We asked: (1) Do uncertainty and overconfidence bias decrease with years of practice? (2) What sociodemographic factors are independently associated with less recognition of uncertainty, in particular belief in God or other deity or deities, and how is atheism associated with recognition of uncertainty? (3) Do confidence bias (confidence that one's skill is greater than it actually is), degree of trust in the orthopaedic evidence, and degree of statistical sophistication correlate independently with recognition of uncertainty? We created a survey to establish an overall recognition of uncertainty score (four questions), trust in the orthopaedic evidence base (four questions), confidence bias (three questions), and statistical understanding (six questions). Seven hundred six members of the Science of Variation Group, a collaboration that aims to study variation in the definition and treatment of human illness, were approached to complete our survey. This group represents mainly orthopaedic surgeons specializing in trauma or hand and wrist surgery, practicing in Europe and North America, of whom the majority is involved in teaching. 
Approximately half of the group has more than 10 years

  17. The design of reliability data bases, part II: competing risk and data compression

    International Nuclear Information System (INIS)

    Cooke, Roger M.

    1996-01-01

    Models for analyzing dependent competing risk data are presented. These models are designed to represent interactions of critical failure and maintenance mechanisms responsible for intercepting incipient and degraded failures, and they are fashioned such that the (constant) critical failure rate is identifiable from dependent competing risk data. Uncertainty bounds for the critical failure rate which take modeling uncertainty and statistical fluctuations into account are given.
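
    For contrast with the dependent-risk models the paper develops, the naive estimate of a constant critical failure rate under independent exponential competing risks is simply the number of critical failures divided by the total time on test. A sketch with hypothetical observations:

    ```python
    # Hypothetical reliability-database records: (time in service, cause of
    # removal). Under independent exponential competing risks, the MLE of
    # the critical failure rate is n_critical / total_time_on_test.
    observations = [
        (120.0, "maintenance"), (340.0, "critical"), (95.0, "maintenance"),
        (410.0, "critical"), (60.0, "maintenance"), (510.0, "critical"),
    ]

    total_time = sum(t for t, _ in observations)
    n_critical = sum(1 for _, cause in observations if cause == "critical")
    lam = n_critical / total_time
    print(f"critical failure rate = {lam:.5f} per hour")
    ```

    The paper's point is precisely that this independence assumption fails when maintenance intercepts incipient failures, so the models are built to keep the critical rate identifiable despite the dependence.
    
    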

  18. Human-like behavior of robot arms: general considerations and the handwriting task-part II: The robot arm in handwriting

    NARCIS (Netherlands)

    Potkonjak, V.; Kostic, D.; Tzafestas, S.; Popovic, M.; Lazarevic, M.; Djordjevic, G.

    2001-01-01

    This paper (Part II) investigates the motion of a redundant anthropomorphic arm during the writing task. Two approaches are applied. The first is based on the concept of distributed positioning which is suitable to model the "writing" task before the occurrence of fatigue symptoms. The second

  19. Short stack modeling of degradation in solid oxide fuel cells. Part II. Sensitivity and interaction analysis

    Science.gov (United States)

    Gazzarri, J. I.; Kesler, O.

    In the first part of this two-paper series, we presented a numerical model of the impedance behaviour of a solid oxide fuel cell (SOFC) aimed at simulating the change in the impedance spectrum induced by contact degradation at the interconnect-electrode, and at the electrode-electrolyte interfaces. The purpose of that investigation was to develop a non-invasive diagnostic technique to identify degradation modes in situ. In the present paper, we appraise the predictive capabilities of the proposed method in terms of its robustness to uncertainties in the input parameters, many of which are very difficult to measure independently. We applied this technique to the degradation modes simulated in Part I, in addition to anode sulfur poisoning. Electrode delamination showed the highest robustness to input parameter variations, followed by interconnect oxidation and interconnect detachment. The most sensitive degradation mode was sulfur poisoning, due to strong parameter interactions. In addition, we simulate several simultaneous two-degradation-mode scenarios, assessing the method's capabilities and limitations for the prediction of electrochemical behaviour of SOFC's undergoing multiple simultaneous degradation modes.

  20. Short stack modeling of degradation in solid oxide fuel cells. Part II. Sensitivity and interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gazzarri, J.I. [Department of Mechanical Engineering, University of British Columbia, 2054-6250 Applied Science Lane, Vancouver, BC V6T 1Z4 (Canada); Kesler, O. [Department of Mechanical and Industrial Engineering, University of Toronto, 5 King' s College Road, Toronto, ON M5S 3G8 (Canada)

    2008-01-21

    In the first part of this two-paper series, we presented a numerical model of the impedance behaviour of a solid oxide fuel cell (SOFC) aimed at simulating the change in the impedance spectrum induced by contact degradation at the interconnect-electrode, and at the electrode-electrolyte interfaces. The purpose of that investigation was to develop a non-invasive diagnostic technique to identify degradation modes in situ. In the present paper, we appraise the predictive capabilities of the proposed method in terms of its robustness to uncertainties in the input parameters, many of which are very difficult to measure independently. We applied this technique to the degradation modes simulated in Part I, in addition to anode sulfur poisoning. Electrode delamination showed the highest robustness to input parameter variations, followed by interconnect oxidation and interconnect detachment. The most sensitive degradation mode was sulfur poisoning, due to strong parameter interactions. In addition, we simulate several simultaneous two-degradation-mode scenarios, assessing the method's capabilities and limitations for the prediction of electrochemical behaviour of SOFC's undergoing multiple simultaneous degradation modes. (author)

  1. A Framework for Understanding Uncertainty in Seismic Risk Assessment.

    Science.gov (United States)

    Foulser-Piggott, Roxane; Bowman, Gary; Hughes, Martin

    2017-10-11

    A better understanding of the uncertainty that exists in models used for seismic risk assessment is critical to improving risk-based decisions pertaining to earthquake safety. Current models estimating the probability of collapse of a building do not consider comprehensively the nature and impact of uncertainty. This article presents a model framework to enhance seismic risk assessment and thus gives decisionmakers a fuller understanding of the nature and limitations of the estimates. This can help ensure that risks are not over- or underestimated and the value of acquiring accurate data is appreciated fully. The methodology presented provides a novel treatment of uncertainties in input variables, their propagation through the model, and their effect on the results. The study presents ranges of possible annual collapse probabilities for different case studies on buildings in different parts of the world, exposed to different levels of seismicity, and with different vulnerabilities. A global sensitivity analysis was conducted to determine the significance of uncertain variables. Two key outcomes are (1) that the uncertainty in ground-motion conversion equations has the largest effect on the uncertainty in the calculation of annual collapse probability; and (2) the vulnerability of a building appears to have an effect on the range of annual collapse probabilities produced, i.e., the level of uncertainty in the estimate of annual collapse probability, with less vulnerable buildings having a smaller uncertainty. © 2017 Society for Risk Analysis.

  2. Absolute Kr I and Kr II transition probabilities

    International Nuclear Information System (INIS)

    Brandt, T.; Helbig, V.; Nick, K.P.

    1982-01-01

    Transition probabilities for 11 KrI and 9 KrII lines between 366.5 and 599.3 nm were obtained from measurements with a wall-stabilised arc at atmospheric pressure in pure krypton. The population densities of the excited krypton levels were calculated under the assumption of LTE from electron densities measured by laser interferometry. The uncertainties for the KrI and the KrII data are 15 and 25%, respectively. (author)

  3. Calibration of a neutron log in partially saturated media. Part II. Error analysis

    International Nuclear Information System (INIS)

    Hearst, J.R.; Kasameyer, P.W.; Dreiling, L.A.

    1981-01-01

    Four sources of error (uncertainty) are studied in water content obtained from neutron logs calibrated in partially saturated media for holes up to 3 m. For this calibration a special facility was built, and an algorithm for a commercial epithermal neutron log was developed that obtains water content from count rate, bulk density, and the gap between the neutron sonde and the borehole wall. The algorithm contained errors due to the calibration and lack of fit, while the field measurements included uncertainties in the count rate (caused by statistics and a short time constant), gap, and density. There can also be inhomogeneity in the material surrounding the borehole. Under normal field conditions the hole-size-corrected water content obtained from such neutron logs can have an uncertainty as large as 15% of its value.

  4. Towards quantifying uncertainty in predictions of Amazon 'dieback'.

    Science.gov (United States)

    Huntingford, Chris; Fisher, Rosie A; Mercado, Lina; Booth, Ben B B; Sitch, Stephen; Harris, Phil P; Cox, Peter M; Jones, Chris D; Betts, Richard A; Malhi, Yadvinder; Harris, Glen R; Collins, Mat; Moorcroft, Paul

    2008-05-27

    Simulations with the Hadley Centre general circulation model (HadCM3), including carbon cycle model and forced by a 'business-as-usual' emissions scenario, predict a rapid loss of Amazonian rainforest from the middle of this century onwards. The robustness of this projection to both uncertainty in physical climate drivers and the formulation of the land surface scheme is investigated. We analyse how the modelled vegetation cover in Amazonia responds to (i) uncertainty in the parameters specified in the atmosphere component of HadCM3 and their associated influence on predicted surface climate. We then enhance the land surface description and (ii) implement a multilayer canopy light interception model and compare with the simple 'big-leaf' approach used in the original simulations. Finally, (iii) we investigate the effect of changing the method of simulating vegetation dynamics from an area-based model (TRIFFID) to a more complex size- and age-structured approximation of an individual-based model (ecosystem demography). We find that the loss of Amazonian rainforest is robust across the climate uncertainty explored by perturbed physics simulations covering a wide range of global climate sensitivity. The introduction of the refined light interception model leads to an increase in simulated gross plant carbon uptake for the present day, but, with altered respiration, the net effect is a decrease in net primary productivity. However, this does not significantly affect the carbon loss from vegetation and soil as a consequence of future simulated depletion in soil moisture; the Amazon forest is still lost. The introduction of the more sophisticated dynamic vegetation model reduces but does not halt the rate of forest dieback. The potential for human-induced climate change to trigger the loss of Amazon rainforest appears robust within the context of the uncertainties explored in this paper. Some further uncertainties should be explored, particularly with respect to the

  5. Risk, unexpected uncertainty, and estimation uncertainty: Bayesian learning in unstable settings.

    Directory of Open Access Journals (Sweden)

    Elise Payzan-LeNestour

    Full Text Available Recently, evidence has emerged that humans approach learning using Bayesian updating rather than (model-free) reinforcement algorithms in a six-arm restless bandit problem. Here, we investigate what this implies for human appreciation of uncertainty. In our task, a Bayesian learner distinguishes three equally salient levels of uncertainty. First, the Bayesian learner perceives irreducible uncertainty or risk: even knowing the payoff probabilities of a given arm, the outcome remains uncertain. Second, there is (parameter) estimation uncertainty or ambiguity: payoff probabilities are unknown and need to be estimated. Third, the outcome probabilities of the arms change: the sudden jumps are referred to as unexpected uncertainty. We document how the three levels of uncertainty evolved during the course of our experiment and how they affected the learning rate. We then zoom in on estimation uncertainty, which has been suggested to be a driving force in exploration, in spite of evidence of widespread aversion to ambiguity. Our data corroborate the latter. We discuss neural evidence that foreshadowed the ability of humans to distinguish between the three levels of uncertainty. Finally, we investigate the boundaries of human capacity to implement Bayesian learning. We repeat the experiment with different instructions, reflecting varying levels of structural uncertainty. Under this fourth notion of uncertainty, choices were no better explained by Bayesian updating than by (model-free) reinforcement learning. Exit questionnaires revealed that participants remained unaware of the presence of unexpected uncertainty and failed to acquire the right model with which to implement Bayesian updating.
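
The three levels of uncertainty can be made concrete with a minimal Beta-Bernoulli sketch of one restless-bandit arm (an illustration, not the learner fitted in the study; the `hazard` leak toward the prior is just one simple way to accommodate unexpected uncertainty):

```python
class BayesArm:
    """Beta-Bernoulli tracker for one restless bandit arm (illustrative sketch).

    Three uncertainty levels from the abstract:
      - risk: outcome variance p*(1-p), irreducible even if p were known
      - estimation uncertainty: posterior variance of p
      - unexpected uncertainty: handled here by leaking the posterior toward
        the prior with hazard h, so sudden jumps in p can be re-learned
    """

    def __init__(self, hazard=0.05):
        self.a, self.b = 1.0, 1.0   # Beta(1, 1) prior
        self.h = hazard             # assumed per-trial jump probability

    def mean(self):
        return self.a / (self.a + self.b)

    def risk(self):
        p = self.mean()
        return p * (1.0 - p)  # irreducible outcome variance

    def estimation_uncertainty(self):
        n = self.a + self.b
        return self.a * self.b / (n * n * (n + 1.0))  # variance of Beta posterior

    def update(self, reward):
        # With probability h the arm may have jumped: partially reset toward prior.
        self.a = (1 - self.h) * self.a + self.h * 1.0
        self.b = (1 - self.h) * self.b + self.h * 1.0
        if reward:
            self.a += 1.0
        else:
            self.b += 1.0
```

With `hazard=0` the tracker reduces to standard Bayesian updating: estimation uncertainty shrinks with observations while risk stays irreducible.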

  6. Evaluating variability and uncertainty in radiological impact assessment using SYMBIOSE

    International Nuclear Information System (INIS)

    Simon-Cornu, M.; Beaugelin-Seiller, K.; Boyer, P.; Calmon, P.; Garcia-Sanchez, L.; Mourlon, C.; Nicoulaud, V.; Sy, M.; Gonze, M.A.

    2015-01-01

    SYMBIOSE is a modelling platform that accounts for variability and uncertainty in radiological impact assessments, when simulating the environmental fate of radionuclides and assessing doses to human populations. The default database of SYMBIOSE is partly based on parameter values that are summarized within International Atomic Energy Agency (IAEA) documents. To characterize uncertainty on the transfer parameters, 331 Probability Distribution Functions (PDFs) were defined from the summary statistics provided within the IAEA documents (i.e. sample size, minimal and maximum values, arithmetic and geometric means, standard and geometric standard deviations) and are made available as spreadsheet files. The methods used to derive the PDFs without complete data sets, but merely the summary statistics, are presented. Then, a simple case-study illustrates the use of the database in a second-order Monte Carlo calculation, separating parametric uncertainty and inter-individual variability. - Highlights: • Parametric uncertainty in radioecology was derived from IAEA documents. • 331 Probability Distribution Functions were defined for transfer parameters. • Parametric uncertainty and inter-individual variability were propagated
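
As an illustration of the approach, a lognormal transfer parameter can be parameterized directly from the two summary statistics typically tabulated (geometric mean and geometric standard deviation), and a second-order Monte Carlo run then separates parametric uncertainty (outer loop) from inter-individual variability (inner loop). All numerical values below are invented for the sketch:

```python
import math
import random

def lognormal_from_summary(geo_mean, gsd):
    """Return a sampler for a lognormal defined by its geometric mean and
    geometric standard deviation, the summary statistics typically reported
    for transfer parameters."""
    mu, sigma = math.log(geo_mean), math.log(gsd)
    return lambda rng: rng.lognormvariate(mu, sigma)

def second_order_mc(outer=200, inner=500, seed=1):
    """Two-level Monte Carlo: the outer loop samples an uncertain transfer
    parameter (parametric uncertainty); the inner loop samples individual
    intake (inter-individual variability). Returns a 95% interval on the
    population-mean dose. Numbers are illustrative only."""
    rng = random.Random(seed)
    transfer = lognormal_from_summary(geo_mean=0.02, gsd=2.0)  # assumed values
    mean_doses = []
    for _ in range(outer):
        tf = transfer(rng)                       # one realization of the parameter
        doses = [tf * rng.lognormvariate(math.log(100.0), 0.3)
                 for _ in range(inner)]          # variable individual intakes
        mean_doses.append(sum(doses) / inner)
    mean_doses.sort()
    return mean_doses[int(0.025 * outer)], mean_doses[int(0.975 * outer)]
```

The width of the returned interval reflects parametric uncertainty only; variability has been averaged out in the inner loop, which is the point of keeping the two dimensions separate.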

  7. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    error distribution, and the propagation of the uncertainty to the geophysical variable in the CDR accounting for its error correlation properties. Uncertainty estimates can and should be validated as part of CDR validation when possible. These principles are quite general, but the approach to providing uncertainty information appropriate to different ECVs is varied, as confirmed by a brief review across different ECVs in the CCI. User requirements for uncertainty information can conflict with each other, and a variety of solutions and compromises are possible. The concept of an ensemble CDR as a simple means of communicating rigorous uncertainty information to users is discussed. Our review concludes by providing eight concrete recommendations for good practice in providing and communicating uncertainty in EO-based climate data records.

  8. FNR demonstration experiments Part II: Subcadmium neutron flux measurements

    International Nuclear Information System (INIS)

    Wehe, D.K.; King, J.S.

    1983-01-01

    The FNR HEU-LEU Demonstration Experiments include a comprehensive set of experiments to identify and quantify significant operational differences between two nuclear fuel enrichments. One aspect of these measurements, the subcadmium flux profiling, is the subject of this paper. The flux profiling effort has been accomplished through foil and wire activations, and by rhodium self-powered neutron detector (SPND) mappings. Within the experimental limitations discussed, the program to measure subcadmium flux profiles led to the following conclusions: (1) Replacement of a single fresh HEU element by a fresh LEU element at the center of an equilibrium HEU core produces a local flux depression. The ratio of HEU to LEU local flux is 1.19 ± .036, which is, well within experimental uncertainty, equal to the inverse ratio of the U-235 masses of the two elements. (2) Whole-core replacement of a large 38-element equilibrium HEU core by a fresh or nearly unburned LEU core reduces the core flux and raises the flux in both D2O and H2O reflectors. The reduction in the central core region is 4.0% to 10.0% for the small fresh 29-element LEU core, and 16% to 18% for a 31-element LEU core (482) with low average burnup. (3) D2O reflector fluxes relative to core fluxes, as measured by SPND with a fixed value of sensitivity S, are in gross disagreement with the same flux ratios measured by Fe and Rh wire activations. Space-dependent refinements of S are calculated to give some improvement in the discrepancy, but the major part of the correction remains to be resolved

  9. Tunable, Flexible and Efficient Optimization of Control Pulses for Superconducting Qubits, part II - Applications

    Science.gov (United States)

    Assémat, Elie; Machnes, Shai; Tannor, David; Wilhelm-Mauch, Frank

    In part I, we presented the theoretical foundations of the GOAT algorithm for the optimal control of quantum systems. Here in part II, we focus on several applications of GOAT to superconducting qubit architectures. First, we consider a control-Z gate on Xmon qubits with an Erf parametrization of the optimal pulse. We show that a fast and accurate gate can be obtained with only 16 parameters, as compared to hundreds of parameters required in other algorithms. We present numerical evidence that such a parametrization should allow an efficient in-situ calibration of the pulse. Next, we consider the flux-tunable coupler by IBM. We show that optimization can be carried out in a more realistic model of the system than was employed in the original study, which is expected to further simplify the calibration process. Moreover, GOAT reduced the complexity of the optimal pulse to only 6 Fourier components, composed with analytic wrappers.

  10. THE GREEN BANK TELESCOPE H II REGION DISCOVERY SURVEY. III. KINEMATIC DISTANCES

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, L. D. [Department of Physics, West Virginia University, Morgantown, WV 26506 (United States); Bania, T. M. [Institute for Astrophysical Research, Department of Astronomy, Boston University, 725 Commonwealth Avenue, Boston, MA 02215 (United States); Balser, Dana S. [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903-2475 (United States); Rood, Robert T., E-mail: Loren.Anderson@mail.wvu.edu [Astronomy Department, University of Virginia, P.O. Box 3818, Charlottesville, VA 22903-0818 (United States)

    2012-07-20

    Using the H I emission/absorption method, we resolve the kinematic distance ambiguity and derive distances for 149 of 182 (82%) H II regions discovered by the Green Bank Telescope H II Region Discovery Survey (GBT HRDS). The HRDS is an X-band (9 GHz, 3 cm) GBT survey of 448 previously unknown H II regions in radio recombination line and radio continuum emission. Here, we focus on HRDS sources from 67° ≥ l ≥ 18°, where kinematic distances are more reliable. The 25 HRDS sources in this zone that have negative recombination line velocities are unambiguously beyond the orbit of the Sun, up to 20 kpc distant. They are the most distant H II regions yet discovered. We find that 61% of HRDS sources are located at the far distance, 31% at the tangent-point distance, and only 7% at the near distance. 'Bubble' H II regions are not preferentially located at the near distance (as was assumed previously) but average 10 kpc from the Sun. The HRDS nebulae, when combined with a large sample of H II regions with previously known distances, show evidence of spiral structure in two circular arc segments of mean Galactocentric radii of 4.25 and 6.0 kpc. We perform a thorough uncertainty analysis to analyze the effect of using different rotation curves, streaming motions, and a change to the solar circular rotation speed. The median distance uncertainty for our sample of H II regions is only 0.5 kpc, or 5%. This is significantly less than the median difference between the near and far kinematic distances, 6 kpc. The basic Galactic structure results are unchanged after considering these sources of uncertainty.
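
For a flat rotation curve, the near/far ambiguity arises directly from the geometry; below is a hedged sketch of the textbook kinematic-distance construction (the survey itself also tests other rotation curves, streaming motions, and solar parameters, and the default constants here are assumptions):

```python
import math

def kinematic_distances(l_deg, v_lsr, R0=8.5, V0=220.0):
    """Near and far kinematic distances (kpc) for Galactic longitude l (deg)
    and LSR velocity v_lsr (km/s), assuming a flat rotation curve V(R) = V0.

    From v_lsr = R0 sin(l) * (V0/R - V0/R0):
        R = V0 * R0 * sin(l) / (v_lsr + V0 * sin(l))
    and the line of sight intersects the circle of radius R at
        d_near, d_far = R0 cos(l) -/+ sqrt(R^2 - R0^2 sin^2(l)).
    """
    sl = math.sin(math.radians(l_deg))
    cl = math.cos(math.radians(l_deg))
    R = V0 * R0 * sl / (v_lsr + V0 * sl)   # galactocentric radius of the source
    disc = R * R - (R0 * sl) ** 2
    if disc < 0:   # velocity slightly beyond the tangent point: place it there
        disc = 0.0
    root = math.sqrt(disc)
    return R0 * cl - root, R0 * cl + root
```

The two roots coincide at the tangent point, which is why sources near the tangent velocity escape the ambiguity while all others need extra information such as H I absorption.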

  11. Sensitivity and uncertainty analysis for the annual phosphorus loss estimator model.

    Science.gov (United States)

    Bolster, Carl H; Vadas, Peter A

    2013-07-01

    Models are often used to predict phosphorus (P) loss from agricultural fields. Although it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predictions of annual P loss by the Annual P Loss Estimator (APLE) model. Our objectives were (i) to conduct a sensitivity analysis for all APLE input variables to determine which variables the model is most sensitive to, (ii) to determine whether the relatively easy-to-implement first-order approximation (FOA) method provides accurate estimates of model prediction uncertainties by comparing results with the more accurate Monte Carlo simulation (MCS) method, and (iii) to evaluate the performance of the APLE model against measured P loss data when uncertainties in model predictions and measured data are included. Our results showed that for low to moderate uncertainties in APLE input variables, the FOA method yields reasonable estimates of model prediction uncertainties, although for cases where manure solid content is between 14 and 17%, the FOA method may not be as accurate as the MCS method due to a discontinuity in the manure P loss component of APLE at a manure solid content of 15%. The estimated uncertainties in APLE predictions based on assumed errors in the input variables ranged from ±2 to 64% of the predicted value. Results from this study highlight the importance of including reasonable estimates of model uncertainty when using models to predict P loss. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
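
The FOA-versus-MCS comparison can be sketched generically: propagate input standard deviations through numerical partial derivatives (FOA) and through direct sampling (MCS), then compare. The toy function below stands in for the APLE equations and is purely illustrative:

```python
import math
import random

def foa_uncertainty(f, x0, sigmas, h=1e-5):
    """First-order approximation: sd(f) from numerical partials at x0,
    var(f) ~= sum_i (df/dx_i)^2 * sigma_i^2."""
    var = 0.0
    for i, s in enumerate(sigmas):
        xp, xm = list(x0), list(x0)
        xp[i] += h
        xm[i] -= h
        dfdx = (f(xp) - f(xm)) / (2 * h)   # central difference
        var += (dfdx * s) ** 2
    return math.sqrt(var)

def mcs_uncertainty(f, x0, sigmas, n=20000, seed=0):
    """Monte Carlo simulation: sample normal inputs, take the sd of outputs."""
    rng = random.Random(seed)
    ys = [f([rng.gauss(m, s) for m, s in zip(x0, sigmas)]) for _ in range(n)]
    mean = sum(ys) / n
    return math.sqrt(sum((y - mean) ** 2 for y in ys) / (n - 1))

# Toy smooth loss function (illustrative, not the APLE equations)
f = lambda x: 0.5 * x[0] * x[1] + 0.1 * x[0] ** 2
x0, sigmas = [10.0, 2.0], [0.5, 0.2]
```

For this smooth function the two estimates agree to within a few percent; near a discontinuity (such as the manure solid content threshold noted above) the derivative-based FOA breaks down while MCS does not.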

  12. Reducing uncertainty in wind turbine blade health inspection with image processing techniques

    Science.gov (United States)

    Zhang, Huiyi

    Structural health inspection has been widely applied in the operation of wind farms to find early cracks in wind turbine blades (WTBs). Increased numbers of turbines and expanded rotor diameters are driving up the workloads and safety risks for site employees. Therefore, it is important to automate the inspection process as well as to minimize the uncertainties involved in routine blade health inspection. In addition, crack documentation and trending are vital to assessing rotor blade and turbine reliability over the 20-year design life span. A new crack recognition and classification algorithm is described that can support automated structural health inspection of the surface of large composite WTBs. The first part of the study investigated the feasibility of digital image processing in WTB health inspection and defined the capability of numerically detecting cracks as small as hairline thickness. The second part of the study identified and analyzed the uncertainty of the digital image processing method. A self-learning algorithm was proposed to recognize and classify cracks without comparing a blade image to a library of crack images. The last part of the research quantified the uncertainty in the field conditions and the image processing methods.

  13. Reproduction in the space environment: Part II. Concerns for human reproduction

    Science.gov (United States)

    Jennings, R. T.; Santy, P. A.

    1990-01-01

    Long-duration space flight and eventual colonization of our solar system will require successful control of reproductive function and a thorough understanding of factors unique to space flight and their impact on gynecologic and obstetric parameters. Part II of this paper examines the specific environmental factors associated with space flight and the implications for human reproduction. Space environmental hazards discussed include radiation, alteration in atmospheric pressure and breathing gas partial pressures, prolonged toxicological exposure, and microgravity. The effects of countermeasures necessary to reduce cardiovascular deconditioning, calcium loss, muscle wasting, and neurovestibular problems are also considered. In addition, the impact of microgravity on male fertility and gamete quality is explored. Due to current constraints, human pregnancy is now contraindicated for space flight. However, a program to explore effective countermeasures to current constraints and develop the required health care delivery capability for extended-duration space flight is suggested. A program of Earth- and space-based research to provide further answers to reproductive questions is suggested.

  14. Mixed ligand complexes of alkaline earth metals: Part XII. Mg(II), Ca(II), Sr(II) and Ba(II) complexes with 5-chlorosalicylaldehyde and salicylaldehyde or hydroxyaromatic ketones

    Directory of Open Access Journals (Sweden)

    MITHLESH AGRAWAL

    2002-04-01

    Full Text Available The reactions of alkaline earth metal chlorides with 5-chlorosalicylaldehyde and salicylaldehyde, 2-hydroxyacetophenone or 2-hydroxypropiophenone have been carried out in 1 : 1 : 1 mole ratio, and the mixed ligand complexes of the type MLL'(H2O)2 (where M = Mg(II), Ca(II), Sr(II) and Ba(II), HL = 5-chlorosalicylaldehyde and HL' = salicylaldehyde, 2-hydroxyacetophenone or 2-hydroxypropiophenone) have been isolated. These complexes were characterized by TLC, conductance measurements, IR and 1H-NMR spectra.

  15. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    Science.gov (United States)

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty, with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger and opportunity appraisals were below the midpoints. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  16. A real-time assessment of measurement uncertainty in the experimental characterization of sprays

    International Nuclear Information System (INIS)

    Panão, M R O; Moreira, A L N

    2008-01-01

    This work addresses the estimation of the measurement uncertainty of discrete probability distributions used in the characterization of sprays. A real-time assessment of this measurement uncertainty is further investigated, particularly concerning the informative quality of the measured distribution and the influence of acquiring additional information on the knowledge retrieved from statistical analysis. The informative quality is associated with the entropy concept as understood in information theory (Shannon entropy), normalized by the entropy of the most informative experiment. A new empirical correlation is derived between the error accuracy of a discrete cumulative probability distribution and the normalized Shannon entropy. The results include case studies using: (i) spray impingement measurements to study the applicability of the real-time assessment of measurement uncertainty, and (ii) the simulation of discrete probability distributions of unknown shape or function to test the applicability of the new correlation
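
A sketch of the entropy normalization described above, assuming the "most informative experiment" is taken as the uniform distribution over the measured classes (the paper's exact normalization may differ):

```python
import math

def normalized_shannon_entropy(counts):
    """Shannon entropy of a measured discrete distribution (e.g. droplet
    counts per size class), normalized by the entropy of a uniform
    distribution over the same number of classes, so the result lies in
    [0, 1]: 1 for a flat histogram, 0 for a single occupied class."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]   # empty classes carry no entropy
    h = -sum(p * math.log(p) for p in probs)
    h_max = math.log(len(counts))                  # uniform-distribution entropy
    return h / h_max if h_max > 0 else 0.0
```

In the spirit of the paper, this single number could be monitored during acquisition to judge in real time whether additional samples still add information.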

  17. The Precautionary Principle and statistical approaches to uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2004-01-01

    is unhelpful, because lack of significance can be due either to uninformative data or to genuine lack of effect (the Type II error problem). Its inversion, bioequivalence testing, might sometimes be a model for the Precautionary Principle in its ability to "prove the null hypothesis". Current procedures...... for setting safe exposure levels are essentially derived from these classical statistical ideas, and we outline how uncertainties in the exposure and response measurements affect the no observed adverse effect level, the Benchmark approach and the "Hockey Stick" model. A particular problem concerns model...

  18. Managing uncertainty in flood protection planning with climate projections

    Directory of Open Access Journals (Sweden)

    B. Dittes

    2018-04-01

    Full Text Available Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise methodology to account for uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. The recommended planning is

  19. Managing uncertainty in flood protection planning with climate projections

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Schoppa, Lukas; Straub, Daniel

    2018-04-01

    Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise methodology to account for uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. 
The recommended planning is robust to moderate changes in
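
A minimal sketch of the visible/hidden decomposition: the spread of a finite projection ensemble supplies the visible part, an assumed `hidden_sd` supplies the hidden part, and the two combine additively in variance (a deliberate simplification of the paper's Bayesian treatment; the function name and all numbers are illustrative):

```python
import math

def required_protection(ensemble, hidden_sd, quantile_z=1.645):
    """Combine visible uncertainty (sample spread of a finite projection
    ensemble of design discharges) with an estimate of hidden uncertainty
    (sources not represented in the ensemble), and return a protection
    level covering the upper quantile (z = 1.645 for ~95%).
    The additive-variance combination is an assumption of this sketch."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    visible_var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    total_sd = math.sqrt(visible_var + hidden_sd ** 2)
    return mean + quantile_z * total_sd
```

Because variances add, the same `hidden_sd` raises the recommendation less when the ensemble spread is already large, which mirrors the diminishing-impact result stated in the abstract.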

  20. Part II. Population

    International Nuclear Information System (INIS)

    2004-01-01

    This monograph deals with the assessment of radiological health effects of the Chernobyl accident for emergency workers (part 1) and the population of the contaminated areas in Russia (part 2). The Chernobyl emergency workers and people living in the contaminated areas of Russia received much lower doses than the population of Hiroshima and Nagasaki, and it was unclear whether risks of radiation-induced cancers derived from the Japanese data could be extrapolated to the low dose range. However, it was predicted as early as 1990 that the thyroid cancer incidence might increase due to incorporated 131I irradiation. What conclusions can be drawn regarding cancer incidence among emergency workers and residents of the contaminated areas in Russia, and the role of the radiation factor, on the basis of the registry data? Leukemia incidence. Leukemia incidence is known to be one of the principal indications of radiation effects. The radiation risk for leukemias is 3-4 times higher than for solid cancers, and its latent period is estimated to be 2-3 years after exposure. Results of the radiation epidemiological studies discussed in this book show that in the worst contaminated Bryansk region the leukemia incidence rate is not higher than in the country in general. Even though some evidence exists for a dose-response relationship, the radiation risks appear to be not statistically significant. Since risks of leukemia are known to be higher for those who were children at exposure, long-term epidemiological studies need to be continued. The study of leukemias among emergency workers strongly suggests the existence of a dose-response relationship. In those who received external doses of more than 0.15 Gy the leukemia incidence rate is two times higher, and these emergency workers should be referred to as a group of increased radiation risk. Solid cancers. The obtained results provide no evidence of a radiation-induced increase in solid cancers among residents of the contaminated areas

  1. A comparison of numerical solutions of partial differential equations with probabilistic and possibilistic parameters for the quantification of uncertainty in subsurface solute transport.

    Science.gov (United States)

    Zhang, Kejiang; Achari, Gopal; Li, Hua

    2009-11-03

    Traditionally, uncertainties in parameters are represented as probabilistic distributions and incorporated into groundwater flow and contaminant transport models. With the advent of newer uncertainty theories, it is now understood that stochastic methods cannot properly represent non-random uncertainties. In the groundwater flow and contaminant transport equations, uncertainty in some parameters may be random, whereas that of others may be non-random. The objective of this paper is to develop a fuzzy-stochastic partial differential equation (FSPDE) model to simulate conditions where both random and non-random uncertainties are involved in groundwater flow and solute transport. Three potential solution techniques, namely (a) transforming a probability distribution to a possibility distribution (Method I), whereby the FSPDE becomes a fuzzy partial differential equation (FPDE); (b) transforming a possibility distribution to a probability distribution (Method II), whereby the FSPDE becomes a stochastic partial differential equation (SPDE); and (c) combining Monte Carlo methods with FPDE solution techniques (Method III), are proposed and compared. The effects of these three methods on the predictive results are investigated using two case studies. The results show that the prediction obtained from Method II is a specific case of that obtained from Method I. When an exact probabilistic result is needed, Method II is suggested. As the loss or gain of information during a probability-possibility (or vice versa) transformation cannot be quantified, its influence on the predictive results is not known. Thus, Method III should probably be preferred for risk assessments.
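
The Method I direction can be illustrated with the standard Dubois-Prade probability-to-possibility transformation for a discrete distribution (a generic sketch of the transformation family the abstract refers to; ties are broken arbitrarily here):

```python
def probability_to_possibility(probs):
    """Dubois-Prade transformation: for outcomes sorted by decreasing
    probability p1 >= p2 >= ... >= pn, the possibility of outcome i is the
    tail sum over all j >= i. The most probable outcome gets possibility 1,
    the preference order is preserved, and poss[i] >= probs[i] for all i
    (the consistency principle). Assumes probs is a discrete distribution."""
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    poss = [0.0] * len(probs)
    tail = sum(probs)   # 1.0 for a normalized distribution
    for i in order:
        poss[i] = tail
        tail -= probs[i]
    return poss
```

The transformation is lossy, as the abstract notes: distinct probability distributions can map to the same possibility distribution, which is one reason the information loss of Methods I and II cannot be quantified.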

  2. A systematic framework for effective uncertainty assessment of severe accident calculations; Hybrid qualitative and quantitative methodology

    International Nuclear Information System (INIS)

    Hoseyni, Seyed Mohsen; Pourgol-Mohammad, Mohammad; Tehranifard, Ali Abbaspour; Yousefpour, Faramarz

    2014-01-01

    This paper describes a systematic framework for characterizing important phenomena and quantifying the degree of contribution of each parameter to the output in severe accident uncertainty assessment. The proposed methodology comprises qualitative as well as quantitative phases. The qualitative part, the so-called Modified PIRT, is a more rigorous version of the PIRT process for more precise quantification of uncertainties: a two-step process for identifying severe accident phenomena and ranking them by uncertainty importance. In this process, identified severe accident phenomena are ranked according to their effect on the figure of merit and their level of knowledge. The Analytic Hierarchy Process (AHP) serves here as a systematic approach to ranking severe accident phenomena. A formal uncertainty importance technique is used to estimate the degree of credibility of the severe accident model(s) used to represent the important phenomena. For this step, the methodology relies on subjective judgment, evaluating available information and data from experiments and code predictions. The quantitative part uses uncertainty importance measures to quantify the effect of each input parameter on the output uncertainty. A response surface fitting approach is proposed to estimate the associated uncertainties at lower computational cost. The quantitative results are then used to plan the reduction of epistemic uncertainty in the output variable(s). The application of the proposed methodology is demonstrated for the ACRR MP-2 severe accident test facility. - Highlights: • A two-stage framework for severe accident uncertainty analysis is proposed. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • An uncertainty importance measure quantifies the effect of each uncertainty source. • The methodology is applied successfully to the ACRR MP-2 severe accident test facility
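The response-surface step can be illustrated with a minimal sketch. The "code" below is a hypothetical stand-in for an expensive severe accident simulation, and the linear-plus-interaction surface and variance shares are illustrative, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an expensive severe-accident code run (hypothetical response).
def code(x1, x2):
    return 3.0 * x1 + 0.5 * x2 + 0.1 * x1 * x2

# Small design of code runs over the uncertain inputs.
n = 50
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.normal(0.0, 1.0, n)
y = code(x1, x2)

# Fit a cheap response surface (linear terms plus one interaction).
A = np.column_stack([np.ones(n), x1, x2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Uncertainty importance from the surrogate: for independent standard-normal
# inputs the variance contributions of the fitted terms are additive
# (Var(x1*x2) = 1 for independent N(0,1) inputs).
var_x1 = coef[1] ** 2
var_x2 = coef[2] ** 2
total_var = var_x1 + var_x2 + coef[3] ** 2
share_x1 = var_x1 / total_var  # main-effect variance share of input 1
share_x2 = var_x2 / total_var  # main-effect variance share of input 2
```

Because the surrogate is cheap to evaluate, the importance measures cost almost nothing once the handful of code runs is done, which is the point of the response-surface approach.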

  3. Medical Humanities: The Rx for Uncertainty?

    Science.gov (United States)

    Ofri, Danielle

    2017-12-01

    While medical students often fear the avalanche of knowledge they are required to learn during training, it is learning to translate that knowledge into wisdom that is the greatest challenge of becoming a doctor. Part of that challenge is learning to tolerate ambiguity and uncertainty, a difficult feat for doctors who are taught to question anything that is not evidence based or peer reviewed. The medical humanities specialize in this ambiguity and uncertainty, which are hallmarks of actual clinical practice but rarely addressed in medical education. The humanities also force reflection and contemplation-skills that are crucial to thoughtful decision making and to personal wellness. Beyond that, the humanities add a dose of joy and beauty to a training process that is notoriously frugal in these departments. Well integrated, the humanities can be the key to transforming medical knowledge into clinical wisdom.

  4. Evaluation of uncertainties in irradiated hardware characterization: Final report, September 30, 1986-March 31, 1987

    International Nuclear Information System (INIS)

    Bedore, N.; Levin, A.; Tuite, P.

    1987-10-01

    Waste Management Group, Inc. has evaluated the techniques used by industry to characterize and classify irradiated hardware components for disposal. This report describes the current practices used to characterize the radionuclide content of hardware components, identifies the uncertainties associated with the techniques and practices considered, and recommends areas for improvement that could reduce uncertainty. Industry uses two different characterization methods. The first uses a combination of gamma scanning, direct sampling, underwater radiation profiling and radiochemical analysis to determine radionuclide content, while the second uses a form of activation analysis in conjunction with underwater radiation profiling. Both methods rely on determining Cobalt-60 content and on scaling factors for hard-to-detect Part 61 radionuclides. Accurate determination of Cobalt-60 is critical, since the activation product radionuclides that affect Part 61 classification are scaled from Cobalt-60. Current uncertainties in Cobalt-60 determination can be reduced by improving underwater radiation profiling equipment and techniques. The calculational techniques used for activation analysis can also be refined to reduce the uncertainties in Cobalt-60 determination. 33 refs., 11 figs., 10 tabs

  5. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist

    2013-01-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty...
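A minimal sketch of the Monte Carlo propagation step, assuming a toy power-law crystal growth model with hypothetical parameter distributions (not the paper's kinetics):

```python
import random

random.seed(42)

# Toy crystal growth model: final size after batch time t at supersaturation S.
def final_size(kg, g, S=1.5, t=3600.0, L0=1e-6):
    G = kg * S ** g          # power-law growth rate (m/s), illustrative
    return L0 + G * t        # size after time t, neglecting nucleation

# Monte Carlo propagation of input (kinetic parameter) uncertainty.
n = 5000
sizes = []
for _ in range(n):
    kg = random.gauss(1e-9, 1e-10)   # growth rate constant (hypothetical)
    g = random.gauss(1.5, 0.1)       # growth order (hypothetical)
    sizes.append(final_size(kg, g))

sizes.sort()
p05 = sizes[int(0.05 * n)]           # 5th percentile of final size
p95 = sizes[int(0.95 * n)]           # 95th percentile of final size
mean_size = sum(sizes) / n
```

The resulting percentile band on the predicted crystal size is the kind of output-uncertainty information a PAT design framework would feed back into sensor selection and control design.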

  6. 30 CFR Appendix II to Subpart D of... - Appendix II to Subpart D of Part 18

    Science.gov (United States)

    2010-07-01

    ... LABOR TESTING, EVALUATION, AND APPROVAL OF MINING PRODUCTS ELECTRIC MOTOR-DRIVEN MINE EQUIPMENT AND ACCESSORIES Machines Assembled With Certified or Explosion-Proof Components, Field Modifications of Approved Machines, and Permits To Use Experimental Equipment Pt. 18, Subpt. D, App. II Appendix II to Subpart D of...

  7. Uncertainty Assessment: What Good Does it Do? (Invited)

    Science.gov (United States)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    The scientific community has devoted considerable time and energy to understanding, quantifying and articulating the uncertainties related to anthropogenic climate change. However, informed decision-making and good public policy arguably rely far more on a central core of understanding of matters that are scientifically well established than on detailed understanding and articulation of all relevant uncertainties. Advocates of vaccination, for example, stress its overall efficacy in preventing morbidity and mortality--not the uncertainties over how long the protective effects last. Advocates for colonoscopy for cancer screening stress its capacity to detect polyps before they become cancerous, with relatively little attention paid to the fact that many, if not most, polyps would not become cancerous even if left unremoved. So why has the climate science community spent so much time focused on uncertainty? One reason, of course, is that articulation of uncertainty is a normal and appropriate part of scientific work. However, we argue that there is another reason that involves the pressure that the scientific community has experienced from individuals and groups promoting doubt about anthropogenic climate change. Specifically, doubt-mongering groups focus public attention on scientific uncertainty as a means to undermine scientific claims, equating uncertainty with untruth. Scientists inadvertently validate these arguments by agreeing that much of the science is uncertain, thus seemingly implying that our knowledge is insecure. The problem goes further, as the scientific community attempts to articulate more clearly, and reduce, those uncertainties, thus seemingly further agreeing that the knowledge base is insufficient to warrant public and governmental action. 
We refer to this effect as 'seepage,' as the effects of doubt-mongering seep into the scientific community and the scientific agenda, despite the fact that addressing these concerns does little to alter

  8. Orientation and uncertainties

    International Nuclear Information System (INIS)

    Peters, H.P.; Hennen, L.

    1990-01-01

    The authors report the results of three representative surveys that inquired more closely into perceptions and valuations of information and information sources concerning Chernobyl. It turns out that the information sources are generally considered little trustworthy. This was largely attributable to the interpretation of the events being tied to attitudes on the atomic energy issue. The greatest credit was given to television broadcasting. The authors summarize their discourse as follows: There is good reason to interpret the widespread uncertainty after Chernobyl as proof of the fact that large parts of the population are prepared and willing to assume a critical stance towards information, and prefer to draw their information from various sources representing different positions. (orig.) [de

  9. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and deficiencies in phenomena understanding. This report provides several methodologies for estimating an instrument's uncertainty when it is used in experimental work. Methods are shown to predict both pre-test and post-test uncertainty
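For independent error sources, a common pre-test estimate combines the individual components in quadrature (root-sum-square). The thermocouple components below are hypothetical:

```python
import math

# Root-sum-square combination of independent 1-sigma uncertainty components,
# a standard sketch for a pre-test instrument uncertainty estimate.
def combined_uncertainty(components):
    return math.sqrt(sum(c ** 2 for c in components))

# Hypothetical thermocouple budget: calibration, readout, installation effects.
components = [0.5, 0.2, 0.3]   # all in kelvin, 1-sigma
u = combined_uncertainty(components)
```

Note that the quadrature result is always smaller than the simple sum of the components, which is why treating correlated errors as independent understates the uncertainty.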

  10. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
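The proposed Monte Carlo approach can be sketched for one signature. The records, rainfall-multiplier error, and rating-curve parameters below are synthetic placeholders, not the Mahurangi or Brue data:

```python
import random

random.seed(7)

# Synthetic daily rainfall (mm) and river stage (m) records for one year.
rain = [random.expovariate(1 / 5.0) for _ in range(365)]
stage = [random.uniform(0.2, 1.0) for _ in range(365)]

def discharge(h, a, b):
    return a * h ** b          # rating curve Q = a * h^b

# Monte Carlo over data uncertainties: areal rainfall error and
# rating-curve parameter uncertainty (all magnitudes illustrative).
n = 2000
runoff_ratios = []
for _ in range(n):
    mult = random.gauss(1.0, 0.05)   # catchment-average rainfall multiplier
    a = random.gauss(10.0, 1.0)      # rating-curve parameters
    b = random.gauss(1.8, 0.1)
    total_rain = mult * sum(rain)
    total_flow = sum(discharge(h, a, b) for h in stage)
    runoff_ratios.append(total_flow / total_rain)

runoff_ratios.sort()
# 95% uncertainty interval for the runoff-ratio signature.
lo, hi = runoff_ratios[int(0.025 * n)], runoff_ratios[int(0.975 * n)]
```

The same resampling loop applies to any signature computed from the data (flow percentiles, recession constants, and so on), which is what makes the method generally applicable.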

  11. CERN scientists take part in the Tevatron Run II performance review committee

    CERN Multimedia

    Maximilien Brice

    2002-01-01

    Tevatron Run II is under way at Fermilab, exploring the high-energy frontier with upgraded detectors that will address some of the biggest questions in particle physics. Until CERN's LHC switches on, the Tevatron proton-antiproton collider is the world's only source of top quarks. It is the only place where we can search for supersymmetry, for the Higgs boson, and for signatures of additional dimensions of space-time. The US Department of Energy (DOE) recently convened a high-level international review committee to examine Fermilab experts' first-phase plans for the accelerator complex. Pictured here with a dipole magnet in CERN's LHC magnet test facility are the four CERN scientists who took part in the DOE's Tevatron review. Left to right: Francesco Ruggiero, Massimo Placidi, Flemming Pedersen, and Karlheinz Schindl. Further information: CERN Courier 43 (1)

  12. Implementing AORN recommended practices for a safe environment of care, part II.

    Science.gov (United States)

    Kennedy, Lynne

    2014-09-01

    Construction in and around a working perioperative suite is a challenge beyond merely managing traffic patterns and maintaining the sterile field. The AORN "Recommended practices for a safe environment of care, part II" provides guidance on building design; movement of patients, personnel, supplies, and equipment; environmental controls; safety and security; and control of noise and distractions. Whether the OR suite evolves through construction, reconstruction, or remodeling, a multidisciplinary team of construction experts and health care professionals should create a functional plan and communicate at every stage of the project to maintain a safe environment and achieve a well-designed outcome. Emergency preparedness, a facility-wide security plan, and minimization of noise and distractions in the OR also help enhance the safety of the perioperative environment. Copyright © 2014 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  13. Uncertainty measurement in the homogenization and sample reduction in the physical classification of rice and beans

    Directory of Open Access Journals (Sweden)

    Dieisson Pivoto

    2016-04-01

    ABSTRACT: The study aimed to (i) quantify the measurement uncertainty in the physical tests of rice and beans for a hypothetical defect, (ii) verify whether homogenization and sample reduction in the physical classification tests of rice and beans are effective in reducing the measurement uncertainty of the process, and (iii) determine whether increasing the size of the bean sample significantly increases accuracy and reduces measurement uncertainty. Hypothetical defects in rice and beans with different damage levels were simulated according to the testing methodology determined by the Normative Ruling of each product. The homogenization and sample reduction in the physical classification of rice and beans are not effective, transferring a high measurement uncertainty to the final test result. The sample size indicated by the Normative Ruling did not allow appropriate homogenization and should be increased.

  14. Variability and Uncertainties of Key Hydrochemical Parameters for SKB Sites

    Energy Technology Data Exchange (ETDEWEB)

    Bath, Adrian [Intellisci Ltd, Willoughby on the Wolds, Loughborough (United Kingdom); Hermansson, Hans-Peter [Studsvik Nuclear AB, Nykoeping (Sweden)

    2006-12-15

    The work described in this report develops SKI's capability for reviewing and evaluating data that will constitute part of SKB's case for selecting a suitable site and applying to construct a geological repository for spent nuclear fuel. The aim has been to integrate a number of different approaches to interpreting and evaluating hydrochemical data, especially with respect to the parameters that matter most in assessing the suitability of a site and in understanding the geochemistry and groundwater conditions at a site. The work focused on taking an independent view of overall uncertainties in reported data, accounting for analytical, sampling and other random and systematic sources of error. This evaluation was carried out initially with a compilation and general inspection of data from the Simpevarp, Forsmark and Laxemar sites, plus data from older 'historical' boreholes in the Aespoe area. That was followed by a more specific interpretation by means of geochemical calculations that test the robustness of certain parameters, namely pH and redox/Eh. Geochemical model calculations were carried out with widely available computer software. Data sources and their handling were also considered, especially access to SKB's SICADA database. In preparation for the use of geochemical modelling programs, and to establish comparability of model results with those reported by SKB, the underlying thermodynamic databases were compared with each other and with other generally accepted databases. Comparisons of log K data for selected solid phases and solution complexes from the different thermodynamic databases were made. In general, there is a large degree of comparability between the databases, but there are some significant, and in a few cases large, differences. The present situation is however adequate for present purposes. The interpretation of redox equilibria is dependent on identifying the relevant solid phases and

  15. Confronting the Uncertainty in Aerosol Forcing Using Comprehensive Observational Data

    Science.gov (United States)

    Johnson, J. S.; Regayre, L. A.; Yoshioka, M.; Pringle, K.; Sexton, D.; Lee, L.; Carslaw, K. S.

    2017-12-01

    The effect of aerosols on cloud droplet concentrations and radiative properties is the largest uncertainty in the overall radiative forcing of climate over the industrial period. In this study, we take advantage of a large perturbed parameter ensemble of simulations from the UK Met Office HadGEM-UKCA model (the aerosol component of the UK Earth System Model) to comprehensively sample uncertainty in aerosol forcing. Uncertain aerosol and atmospheric parameters cause substantial aerosol forcing uncertainty in climatically important regions. As the aerosol radiative forcing itself is unobservable, we investigate the potential for observations of aerosol and radiative properties to act as constraints on the large forcing uncertainty. We test how eight different theoretically perfect aerosol and radiation observations can constrain the forcing uncertainty over Europe. We find that the achievable constraint is weak unless many diverse observations are used simultaneously. This is due to the complex relationships between model output responses and the multiple interacting parameter uncertainties: compensating model errors mean there are many ways to produce the same model output (known as model equifinality), which impacts the achievable constraint. However, using all eight observable quantities together, we show that the aerosol forcing uncertainty can potentially be reduced by around 50%. This reduction comes from narrowing a large sample of model variants (over 1 million) covering the full parametric uncertainty down to the roughly 1% that are observationally plausible. Constraining the forcing uncertainty using real observations is a more complex undertaking, in which we must account for multiple further uncertainties, including measurement uncertainties, structural model uncertainties and the model's discrepancy from reality. 
Here, we make a first attempt to determine the true potential constraint on the forcing uncertainty from our model that is achievable using a comprehensive
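The constraint mechanism can be sketched with a toy perturbed-parameter ensemble: retain only the variants whose simulated observable matches a synthetic observation within its uncertainty, then compare the spread of the unobservable forcing before and after. All relationships and numbers below are illustrative, not HadGEM-UKCA output.

```python
import random

random.seed(3)

# Toy ensemble: each "model variant" has one parameter theta; it produces an
# observable quantity (e.g. an aerosol optical property) and a forcing.
def observable(theta):
    return 0.15 + 0.05 * theta + random.gauss(0.0, 0.005)  # structural noise

def forcing(theta):
    return -1.0 - 0.6 * theta  # unobservable quantity of interest

variants = [random.uniform(-1.0, 1.0) for _ in range(100_000)]
prior_forcings = [forcing(t) for t in variants]

# Constrain: keep only variants whose simulated observable is consistent with
# a synthetic "observation" within 2 sigma of its measurement uncertainty.
obs, obs_sigma = 0.16, 0.01
kept = [t for t in variants if abs(observable(t) - obs) < 2 * obs_sigma]
post_forcings = [forcing(t) for t in kept]

def spread(xs):
    xs = sorted(xs)
    return xs[int(0.95 * len(xs))] - xs[int(0.05 * len(xs))]

# Fractional reduction of the 5-95% forcing range after the constraint.
reduction = 1.0 - spread(post_forcings) / spread(prior_forcings)
```

The structural-noise term is what limits the constraint: even a perfect observation cannot reject variants whose observable differs only by noise, which is the equifinality effect described above.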

  16. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.

  17. Measurements of downwelling far-infrared radiance during the RHUBC-II campaign at Cerro Toco, Chile and comparisons with line-by-line radiative transfer calculations

    Science.gov (United States)

    Mast, Jeffrey C.; Mlynczak, Martin G.; Cageao, Richard P.; Kratz, David P.; Latvakoski, Harri; Johnson, David G.; Turner, David D.; Mlawer, Eli J.

    2017-09-01

    Downwelling radiances at the Earth's surface measured by the Far-Infrared Spectroscopy of the Troposphere (FIRST) instrument in an environment with integrated precipitable water (IPW) as low as 0.03 cm are compared with calculated spectra in the far-infrared and mid-infrared. FIRST (a Fourier transform spectrometer) was deployed from August through October 2009 at 5.38 km MSL on Cerro Toco, a mountain in the Atacama Desert of Chile. There FIRST took part in the Radiative Heating in Unexplored Bands Campaign Part 2 (RHUBC-II), the goal of which is the assessment of water vapor spectroscopy. Radiosonde water vapor and temperature vertical profiles are input into the Atmospheric and Environmental Research (AER) Line-by-Line Radiative Transfer Model (LBLRTM) to compute modeled radiances. The LBLRTM minus FIRST residual spectrum is calculated to assess agreement. Uncertainties (1-σ) in both the measured and modeled radiances are also determined. Measured and modeled radiances nearly all agree to within combined (total) uncertainties. Features exceeding the uncertainties can be brought within the combined uncertainty by increasing water vapor and model continuum absorption; however, this may not be necessary given the 1-σ (68% confidence) uncertainties. Furthermore, the uncertainty in the measurement-model residual is very large, and no additional information on the adequacy of current water vapor spectral line or continuum absorption parameters may be derived. Similar future experiments in similarly cold and dry environments will require absolute accuracy of 0.1% of a 273 K blackbody in radiance and water vapor accuracy of ∼3% in the profile layers contributing to downwelling radiance at the surface.

  18. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
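The entropy measures can be computed directly from model realisations. The two "locations" below hold synthetic categorical outcomes (e.g. a geological unit id) across ten realisations; the values are illustrative only:

```python
import math
from collections import Counter

# Outcomes at two spatial locations over ten simulated model realisations.
loc_a = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0]
loc_b = [0, 0, 1, 1, 0, 1, 1, 1, 1, 0]

def entropy(xs):
    # Shannon entropy (bits) of the empirical distribution of xs.
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in Counter(xs).values())

def joint_entropy(xs, ys):
    # Entropy of the paired outcomes (joint distribution).
    return entropy(list(zip(xs, ys)))

h_a = entropy(loc_a)
h_b = entropy(loc_b)
h_ab = joint_entropy(loc_a, loc_b)
mutual_info = h_a + h_b - h_ab   # information shared between the locations
cond_b_given_a = h_ab - h_a      # uncertainty remaining at B once A is known
```

`cond_b_given_a` is exactly the "how much would additional information reduce uncertainty" quantity: it is the residual uncertainty at location B after an observation at location A.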

  19. Uncertainty, financial development and economic growth : an empirical analysis

    NARCIS (Netherlands)

    Lensink, Robert

    1999-01-01

    This paper examines whether financial sector development may partly undo growth-reducing effects of policy uncertainty. By performing a cross-country growth regression for the 1970-1995 period I find evidence that countries with a more developed financial sector are better able to nullify the

  20. Reducing uncertainty based on model fitness: Application to a ...

    African Journals Online (AJOL)

    A weakness of global sensitivity and uncertainty analysis methodologies is the often subjective definition of prior parameter probability distributions, especially ... The reservoir representing the central part of the wetland, where flood waters separate into several independent distributaries, is a keystone area within the model.

  1. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  2. All About Dowels - A Review Part II Considerations After Cementation

    Directory of Open Access Journals (Sweden)

    Zishan Dangra

    2017-10-01

    Full Text Available The present review summarizes the published literature examining cementation of the dowel and factors related to it. The peer reviewed English language literature was reviewed from the period 1990 to 2015. Articles were searched in Pubmed/ Medline for the relevant terms. Additional manual searches of some dental journals were also carried out. The original key terms resulted in 228 articles. After applying inclusion criteria, 64 articles remained to be included in part II of this review. Article search indicates that most published literature on dowels are in the form of in vitro analysis. Literature on prefabricated dowel systems far exceeds than the custom cast dowel and newer fibre dowels. Clinical evidence is not sufficient and cannot be used to inform practice confidently. However, within the limitations of this review it is suggested that adhesive fixation is preferred in case of short dowel. Dowel width should be as small as possible. A ferrule of 2 mm has to be provided. Composites have proven to be a good core material provided that adequate tooth structure remained for bonding. Dowel should be inserted if endodontically treated tooth is to be used as abutment for removable partial dentures.

  3. Thinking in nursing education. Part II. A teacher's experience.

    Science.gov (United States)

    Ironside, P M

    1999-01-01

    Across academia, educators are investigating teaching strategies that facilitate students' abilities to think critically. Because many of these strategies require low teacher-student ratios or sustained involvement over time, efforts to implement them are often constrained by diminishing resources for education, faculty reductions, and increasing numbers of part-time teachers and students. In nursing, the challenges of teaching and learning critical thinking are compounded by the demands of providing care to patients with increasingly acute and complex problems in a wide variety of settings. To meet these challenges, nurse teachers have commonly used a variety of strategies to teach critical thinking (1). For instance, they often provide students with case studies or simulated clinical situations in classroom and laboratory settings (2). At other times, students are taught a process of critical thinking and given structured clinical assignments, such as care plans or care maps, where they apply this process in anticipating the care a particular patient will require. Accompanying students onto clinical units, teachers typically evaluate critical thinking ability by reviewing a student's preparation prior to the experience and discussing it with the student during the course of the experience. The rationales students provide for particular nursing interventions are taken as evidence of their critical thinking ability. While this approach is commonly thought to be effective, the evolving health care system has placed increased emphasis on community nursing (3,4), where it is often difficult to prespecify learning experiences or to anticipate patient care needs. In addition, teachers are often not able to accompany each student to the clinical site. Thus, the traditional strategies for teaching and learning critical thinking common to hospital-based clinical courses are being challenged, transformed, and extended (5). 
Part II of this article describes findings that suggest

  4. Socioeconomic Implications of Achieving 2.0 °C and 1.5 °C Climate Targets under Scientific Uncertainties

    Science.gov (United States)

    Su, X.; Takahashi, K.; Fujimori, S.; Hasegawa, T.; Tanaka, K.; Shiogama, H.; Emori, S.; LIU, J.; Hanasaki, N.; Hijioka, Y.; Masui, T.

    2017-12-01

    Large uncertainty exists in temperature projections, including contributions from the carbon cycle, the climate system and aerosols. For integrated assessment models (IAMs) like DICE, FUND and PAGE, however, the scientific uncertainties mainly rest on the distribution of (equilibrium) climate sensitivity. This study aims at evaluating the emission pathways that limit temperature increase below 2.0 ºC or 1.5 ºC after 2100 considering scientific uncertainties, and exploring how socioeconomic indicators are affected by such scientific uncertainties. We use a stochastic version of SCM4OPT, with an uncertainty measurement based on alternative ranges of key parameters. Three climate cases, namely (i) the base case of SSP2, (ii) limiting temperature increase below 2.0 ºC after 2100 and (iii) limiting temperature increase below 1.5 ºC after 2100, and three types of probabilities - (i) >66% probability or likely, (ii) >50% probability or more likely than not and (iii) the mean of the probability distribution - are considered in the study. The results show that: (i) for the 2.0 ºC case, the likely CO2 reduction rate in 2100 ranges from 75.5%-102.4%, with a mean value of 88.1%, and 93.0%-113.1% (mean 102.5%) for the 1.5 ºC case; (ii) a likely range of forcing effect is found for the 2.0 ºC case (2.7-3.9 Wm-2) due to scientific uncertainty, and 1.9-3.1 Wm-2 for the 1.5 ºC case; (iii) the carbon prices within the 50% confidence interval may differ by a factor of 3 for both the 2.0 ºC case and the 1.5 ºC case; (iv) the abatement costs within the 50% confidence interval may differ by a factor of 4 for both the 2.0 ºC case and the 1.5 ºC case. Nine C4MIP carbon cycle models and nineteen CMIP3 AOGCMs are used to account for the scientific uncertainties, following MAGICC 6.0. These uncertainties will result in a likely radiative forcing range of 6.1-7.5 Wm-2 and a likely temperature increase of 3.1-4.5 ºC in 2100 for the base case of SSP2. If we evaluate the 2 ºC target by limiting the
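The "likely" (>66%) and "more likely than not" (>50%) ranges used above can be sketched as central intervals of a sampled distribution. The Gaussian sample below is purely illustrative, not SCM4OPT output:

```python
import random

random.seed(0)

# Synthetic distribution of 2100 CO2 reduction rates (%) under parameter
# uncertainty; mean/sd are illustrative placeholders.
sample = [random.gauss(88.1, 8.0) for _ in range(100_000)]
sample.sort()

def central_interval(xs, prob):
    # Central interval covering `prob` of the sorted sample.
    n = len(xs)
    tail = (1.0 - prob) / 2.0
    return xs[int(tail * n)], xs[int((1.0 - tail) * n)]

likely = central_interval(sample, 0.66)               # ">66%" style range
more_likely_than_not = central_interval(sample, 0.50) # ">50%" style range
```

By construction the 66% interval contains the 50% interval, mirroring how the likely ranges quoted in the abstract are wider than the more-likely-than-not ones.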

  5. Mammalian Toxicity of Munition Compounds. Phase II. Effects of Multiple Doses. Part III. 2,6-Dinitrotoluene

    Science.gov (United States)

    1976-07-01

    and the neuromuscular effects in these dogs were not due to hypocalcemia. The lowest serum calcium concentration in these dogs was 4.2 meq/liter...motor end plate might produce a local hypocalcemia. Such a mechanism is purely speculative. Qualitatively and quantitatively, most of the effects of 2,6...MIDWEST RESEARCH INSTITUTE / MAMMALIAN TOXICITY OF MUNITIONS COMPOUNDS, PHASE II: EFFECTS OF MULTIPLE DOSES, PART

  6. Eigenspace perturbations for structural uncertainty estimation of turbulence closure models

    Science.gov (United States)

    Jofre, Lluis; Mishra, Aashwin; Iaccarino, Gianluca

    2017-11-01

    With the present state of computational resources, a purely numerical resolution of turbulent flows encountered in engineering applications is not viable. Consequently, investigations into turbulence rely on various degrees of modeling. Archetypal amongst these variable-resolution approaches would be RANS models with two-equation closures, and subgrid-scale models in LES. However, owing to the simplifications introduced during model formulation, the fidelity of all such models is limited, and therefore the explicit quantification of the predictive uncertainty is essential. In such a scenario, the ideal uncertainty estimation procedure must be agnostic to modeling resolution, methodology, and the nature or level of the model filter. The procedure should be able to give reliable prediction intervals for different Quantities of Interest, over varied flows and flow conditions, and at diametric levels of modeling resolution. In this talk, we present and substantiate the Eigenspace perturbation framework as an uncertainty estimation paradigm that meets these criteria. Commencing from a broad overview, we outline the details of this framework at different modeling resolutions. Then, using benchmark flows along with engineering problems, the efficacy of this procedure is established. This research was partially supported by NNSA under the Predictive Science Academic Alliance Program (PSAAP) II, and by DARPA under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo).
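
    The eigenvalue-perturbation idea behind the framework can be sketched numerically. The following is a minimal illustration, assuming the common formulation in which the eigenvalues of the normalized Reynolds stress anisotropy tensor are shifted toward the limiting states of the barycentric (Lumley) triangle; the function name, the corner values used, and the sample numbers are illustrative, not taken from this talk:

```python
import numpy as np

def perturb_anisotropy(tau, k, delta, target="1C"):
    """Perturb the eigenvalues of the Reynolds stress anisotropy tensor
    toward a limiting state; delta in [0, 1] sets the perturbation strength."""
    # Normalized anisotropy: b_ij = tau_ij / (2k) - delta_ij / 3
    b = tau / (2.0 * k) - np.eye(3) / 3.0
    lam, v = np.linalg.eigh(b)           # eigenvalues, ascending
    lam = lam[::-1]; v = v[:, ::-1]      # sort descending
    # Limiting-state eigenvalues (corners of the barycentric triangle)
    corners = {"1C": np.array([2/3, -1/3, -1/3]),
               "2C": np.array([1/6, 1/6, -1/3]),
               "3C": np.array([0.0, 0.0, 0.0])}
    lam_p = (1.0 - delta) * lam + delta * corners[target]
    b_p = v @ np.diag(lam_p) @ v.T
    # Reconstruct the perturbed Reynolds stress tensor (trace 2k preserved)
    return 2.0 * k * (b_p + np.eye(3) / 3.0)

# Example: an isotropic stress state perturbed halfway toward the 1C limit
k = 1.5
tau = 2.0 * k * np.eye(3) / 3.0
tau_p = perturb_anisotropy(tau, k, delta=0.5)
print(tau_p)
```

Because only the eigenvalues of the anisotropy are moved, the turbulent kinetic energy (the trace) is unchanged, and realizability is preserved as long as the perturbed eigenvalues stay inside the triangle.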

  7. Modelo computacional para suporte à decisão em áreas irrigadas. Parte II: testes e aplicação [Computer model for decision support in irrigated areas. Part II: tests and application]

    Directory of Open Access Journals (Sweden)

    Paulo A. Ferreira

    2006-12-01

    Full Text Available Part I of this research presented the development of a decision support model, called MCID, for planning and managing irrigation and/or drainage projects. Part II is aimed at testing and applying MCID. In a comparative test with the DRAINMOD model, drain spacings obtained with MCID were slightly larger or identical. The spacings obtained with MCID and DRAINMOD were considerably larger than those obtained through traditional methodologies for the design of drainage systems. The total relative crop yield (YRT) obtained with MCID was, in general, lower than that obtained with DRAINMOD, due to differences in the methodology for estimating crop yield response to water deficit. In comparison with CROPWAT, very close results were obtained for YRT and for actual evapotranspiration. The model was applied to the conditions of the Jaíba Project, MG, for perennial and annual crops cultivated in different seasons. The results of the tests and applications indicated the potential of MCID as a decision-support tool for irrigation and/or drainage projects.

  8. Impact of monovalent cations on soil structure. Part II. Results of two Swiss soils

    Science.gov (United States)

    Farahani, Elham; Emami, Hojat; Keller, Thomas

    2018-01-01

    In this study, we investigated the impact of adding solutions with different potassium and sodium concentrations on dispersible clay, water retention characteristics, air permeability, and soil shrinkage behaviour, using two agricultural soils from Switzerland with different clay contents but a similar organic carbon to clay ratio. Three different solutions (only Na, only K, and a combination of both) were added to soil samples at three different levels of the cation ratio of soil structural stability, and the soil samples were incubated for one month. Our findings showed that the amount of readily dispersible clay increased with increasing Na concentration and with increasing cation ratio of soil structural stability. The treatment with the maximum Na concentration resulted in the highest water retention and the lowest shrinkage capacity; this was associated with high amounts of readily dispersible clay. Air permeability generally increased during incubation due to moderate wetting and drying cycles, but the increase was negatively correlated with readily dispersible clay. In the Swiss soils, readily dispersible clay decreased with increasing K, whereas it increased with increasing K in the Iranian soil (Part I of our study). This can be attributed to the different clay mineralogy of the studied soils (muscovite in Part I and illite in Part II).

  9. Model-specification uncertainty in future forest pest outbreak.

    Science.gov (United States)

    Boulanger, Yan; Gray, David R; Cooke, Barry J; De Grandpré, Louis

    2016-04-01

    Climate change will modify forest pest outbreak characteristics, although there are disagreements regarding the specifics of these changes. A large part of this variability may be attributed to model specifications. As a case study, we developed a consensus model predicting spruce budworm (SBW, Choristoneura fumiferana [Clem.]) outbreak duration using two different predictor data sets and six different correlative methods. The model was used to project outbreak duration and the uncertainty associated with using different data sets and correlative methods (= model-specification uncertainty) for 2011-2040, 2041-2070 and 2071-2100, according to three forcing scenarios (RCP 2.6, RCP 4.5 and RCP 8.5). The consensus model showed very high explanatory power and low bias. The model projected a more pronounced northward shift and decrease in outbreak duration under the RCP 8.5 scenario. However, variation in single-model projections increases with time, making future projections highly uncertain. Notably, the magnitude of the shifts in northward expansion, the overall outbreak duration and the patterns of outbreak duration at the southern edge were highly variable according to the predictor data set and correlative method used. We also demonstrated that variation in forcing scenarios contributed only slightly to the uncertainty of model projections compared with the two sources of model-specification uncertainty. Our approach helped to quantify model-specification uncertainty in future forest pest outbreak characteristics. It may contribute to sounder decision-making by acknowledging the limits of the projections and help to identify areas where model-specification uncertainty is high. As such, we further stress that this uncertainty should be strongly considered when making forest management plans, notably by adopting adaptive management strategies so as to reduce future risks. © 2015 Her Majesty the Queen in Right of Canada. Global Change Biology © 2015.

  10. Seismic risk analysis for General Electric Plutonium Facility, Pleasanton, California. Final report, part II

    International Nuclear Information System (INIS)

    1980-01-01

    This report is the second of a two-part study addressing the seismic risk or hazard of the special nuclear materials (SNM) facility of the General Electric Vallecitos Nuclear Center at Pleasanton, California. The Part I companion to this report, dated July 31, 1978, presented the seismic hazard at the site that resulted from exposure to earthquakes on the Calaveras, Hayward and San Andreas faults and, additionally, from smaller unassociated earthquakes that could not be attributed to these specific faults. However, while this study was in progress, certain additional geologic information became available that could be interpreted in terms of the existence of a nearby fault. Although substantial geologic investigations were subsequently undertaken, the existence of this postulated fault, called the Verona Fault, remained very controversial. The purpose of the Part II study was to assume the existence of such a capable fault and, under this assumption, to examine the loads that the fault could impose on the SNM facility. This report first reviews the geologic setting with a focus on specifying sufficient geologic parameters to characterize the postulated fault. The report next presents the methodology used to calculate the vibratory ground motion hazard. Because of the complexity of the fault geometry, a slightly different methodology is used here compared with the Part I report. This section ends with the results of the calculation applied to the SNM facility. Finally, the report presents the methodology and results of the rupture hazard calculation.

  11. CHILD WELFARE IN CANADA : PART II

    OpenAIRE

    松本, 眞一; Shinichi, Matsumoto; 桃山学院大学社会学部

    2006-01-01

    This part of the study aims to survey the whole field of child protection in Canada. The paper consists of five chapters: (1) Canadian history of child protection, (2) definition of child abuse, (3) current situation of child protection in Canada, (4) outline of child protection and treatment, and (5) a three-way comparison of child protection and prevention in Canada, Australia and England. The first efforts at identifying and combating child abuse occurred in the latter part of the...

  12. Interpretation of the peak areas in gamma-ray spectra that have a large relative uncertainty

    International Nuclear Information System (INIS)

    Korun, M.; Maver Modec, P.; Vodenik, B.

    2012-01-01

    Empirical evidence is provided that the areas of peaks having a relative uncertainty in excess of 30% are overestimated. This systematic influence is of a statistical nature and originates in the way the peak-analyzing routine recognizes small peaks. The influence is not easy to detect, since it is smaller than the peak-area uncertainty. However, it can be revealed in repeated measurements under the same experimental conditions, e.g., in background measurements. To evaluate the systematic influence, background measurements were analyzed with the peak-analyzing procedure described by Korun et al. (2008). The magnitude of the influence depends on the relative uncertainty of the peak area and may amount, under the conditions used in the peak analysis, to a factor of 5 at relative uncertainties exceeding 60%. From the measurements, the probability of type-II errors, as a function of the relative uncertainty of the peak area, was extracted. This probability is near zero below an uncertainty of 30% and rises to 90% at uncertainties exceeding 50%. - Highlights: ► A systematic influence affecting small peak areas in gamma-ray spectra is described. ► The influence originates in the peak-locating procedure, which uses a pre-determined sensitivity. ► The pre-determined sensitivity causes peak areas with large uncertainties to be overestimated. ► The influence depends on the relative uncertainty of the number of counts in the peak. ► Corrections exceeding a factor of 3 are attained at peak-area uncertainties exceeding 60%.
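
    The selection effect described above, where a peak-search routine with a pre-determined sensitivity only reports small peaks that happen to fluctuate upward, can be illustrated with a toy Monte Carlo. All numbers (true peak area, background level, the 2σ decision threshold) are assumed for illustration and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

true_area = 20.0      # true net peak counts (hypothetical)
background = 400.0    # background counts under the peak region (hypothetical)
n_spectra = 100_000   # repeated measurements under identical conditions

# Net-area estimate: gross counts minus the known background expectation
gross = rng.poisson(true_area + background, n_spectra)
net = gross - background
sigma = np.sqrt(gross)            # counting uncertainty of the net area

# A peak-search routine with a preset sensitivity only "finds" peaks
# exceeding, say, 2 sigma -- mimicking a pre-determined sensitivity
found = net > 2.0 * sigma
mean_reported = net[found].mean()

print(mean_reported / true_area)  # systematic overestimation factor > 1
```

Because only the upward-fluctuating realizations pass the sensitivity threshold, the mean reported area of the "found" peaks exceeds the true area, qualitatively reproducing the overestimation of large-relative-uncertainty peaks described in the abstract.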

  13. Chemical kinetic model uncertainty minimization through laminar flame speed measurements

    Science.gov (United States)

    Park, Okjoo; Veloo, Peter S.; Sheen, David A.; Tao, Yujie; Egolfopoulos, Fokion N.; Wang, Hai

    2016-01-01

    Laminar flame speed measurements were carried out for mixtures of air with eight C3-C4 hydrocarbons (propene, propane, 1,3-butadiene, 1-butene, 2-butene, iso-butene, n-butane, and iso-butane) at room temperature and ambient pressure. Along with C1-C2 hydrocarbon data reported in a recent study, the entire dataset was used to demonstrate how laminar flame speed data can be utilized to explore and minimize the uncertainties in a reaction model for foundation fuels. The USC Mech II kinetic model was chosen as a case study. The method of uncertainty minimization using polynomial chaos expansions (MUM-PCE) (D.A. Sheen and H. Wang, Combust. Flame 2011, 158, 2358-2374) was employed to constrain the model uncertainty for laminar flame speed predictions. Results demonstrate that a reaction model constrained only by the laminar flame speed values of methane/air flames notably reduces the uncertainty in the predictions of the laminar flame speeds of C3 and C4 alkanes, because the key chemical pathways of all of these flames are similar to each other. The uncertainty in model predictions for flames of unsaturated C3-C4 hydrocarbons remains significant without fuel-specific laminar flame speeds in the constraining target data set, because the secondary rate-controlling reaction steps differ from those in the saturated alkanes. It is shown that the constraints provided by the laminar flame speeds of the foundation fuels could notably reduce the uncertainties in the predictions of laminar flame speeds of C4 alcohol/air mixtures. Furthermore, it is demonstrated that an accurate prediction of the laminar flame speed of a particular C4 alcohol/air mixture is better achieved through measurements for key molecular intermediates formed during the pyrolysis and oxidation of the parent fuel. PMID:27890938
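
    The constraining step in MUM-PCE is far richer than can be shown here, but its essence can be sketched in a drastically simplified one-parameter, linear-Gaussian analogue: a single flame-speed measurement shrinks the prior uncertainty of a normalized rate parameter through a linear surrogate. All numbers (the surrogate intercept and sensitivity, the measurement and its uncertainty) are hypothetical and not from USC Mech II:

```python
import numpy as np

# Surrogate: S_u(x) ~ S0 + a * x, with x the normalized rate-parameter
# perturbation, prior x ~ N(0, 1) (x = ±1 marks the 2-sigma uncertainty factor)
S0, a = 38.0, 4.0               # cm/s; hypothetical surrogate coefficients
y_meas, sigma_meas = 40.0, 1.0  # hypothetical measured flame speed and error

# Gaussian update of x given the measurement (linear-Gaussian special case)
prior_var = 1.0
post_var = 1.0 / (1.0 / prior_var + a**2 / sigma_meas**2)
post_mean = post_var * (a * (y_meas - S0) / sigma_meas**2)

print(post_mean, np.sqrt(post_var))  # constrained parameter and its shrunken uncertainty
```

The posterior variance is necessarily smaller than the prior variance; in MUM-PCE the same shrinking happens jointly over many rate parameters, with polynomial-chaos surrogates and many flame-speed targets at once.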

  14. Hydrological model uncertainty due to spatial evapotranspiration estimation methods

    Czech Academy of Sciences Publication Activity Database

    Yu, X.; Lamačová, Anna; Duffy, Ch.; Krám, P.; Hruška, Jakub

    2016-01-01

    Vol. 90, part B (2016), pp. 90-101 ISSN 0098-3004 R&D Projects: GA MŠk(CZ) LO1415 Institutional support: RVO:67179843 Keywords: Uncertainty * Evapotranspiration * Forest management * PIHM * Biome-BGC Subject RIV: DA - Hydrology; Limnology OBOR OECD: Hydrology Impact factor: 2.533, year: 2016

  15. A novel dose uncertainty model and its application for dose verification

    International Nuclear Information System (INIS)

    Jin Hosang; Chung Heetaek; Liu Chihray; Palta, Jatinder; Suh, Tae-Suk; Kim, Siyong

    2005-01-01

    Based on a statistical approach, a novel dose uncertainty model was introduced, considering both nonspatial and spatial dose deviations. Non-space-oriented uncertainty is mainly caused by dosimetric uncertainties, and space-oriented dose uncertainty is caused by all spatial displacements. Assuming these two parts are independent, the dose difference between measurement and calculation is a linear combination of the nonspatial and spatial dose uncertainties. Two assumptions were made: (1) the relative standard deviation of the nonspatial dose uncertainty is inversely proportional to the dose standard deviation σ, and (2) the spatial dose uncertainty is proportional to the gradient of the dose. The total dose uncertainty is the quadratic sum of the nonspatial and spatial uncertainties. The uncertainty model provides the tolerance dose bound for comparison between calculation and measurement. In the statistical uncertainty model based on a Gaussian distribution, a confidence level of 3σ theoretically confines 99.74% of measurements within the bound. By setting the confidence limit, the tolerance bound for dose comparison can be made analogous to those of existing dose comparison methods (e.g., a composite distribution analysis, a γ test, a χ evaluation, and a normalized agreement test method). However, the model considers the inherent dose uncertainty characteristics of the test points by taking into account the space-specific history of dose accumulation, while the previous methods apply a single tolerance criterion to all points, although the dose uncertainty at each point differs significantly from the others. Three types of one-dimensional test dose distributions (a single large field, a composite flat field made by two identical beams, and three-beam intensity-modulated fields) were made to verify the robustness of the model. For each test distribution, the dose bound predicted by the uncertainty model was compared with simulated measurements. The simulated
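
    The quadratic-sum structure of the model can be sketched in one dimension: a gradient-driven spatial term and a dose-proportional nonspatial term are combined and scaled by the chosen confidence level. The specific parameter values (2% dosimetric uncertainty, 2 mm displacement) are assumed for illustration and are not the paper's fitted parameters:

```python
import numpy as np

def dose_tolerance_bound(dose, dx, sigma_d_rel=0.02, sigma_r=0.2, n_sigma=3.0):
    """Sketch of a combined dose-uncertainty bound: a nonspatial (dosimetric)
    part and a spatial (gradient-driven) part added in quadrature.
    sigma_d_rel: relative dosimetric uncertainty (assumed value)
    sigma_r:     spatial displacement uncertainty in cm (assumed value)"""
    grad = np.gradient(dose, dx)                   # 1-D dose gradient
    sigma_nonspatial = sigma_d_rel * dose
    sigma_spatial = sigma_r * np.abs(grad)
    sigma_total = np.hypot(sigma_nonspatial, sigma_spatial)  # quadratic sum
    return dose - n_sigma * sigma_total, dose + n_sigma * sigma_total

# 1-D test profile: a single field edge (steep penumbra -> wide bound there)
x = np.linspace(0.0, 10.0, 201)                    # position, cm
dose = 100.0 / (1.0 + np.exp((x - 5.0) / 0.3))     # sigmoid penumbra, %
lo, hi = dose_tolerance_bound(dose, x[1] - x[0])
```

The bound automatically widens in the penumbra, where spatial displacement dominates, and tightens in flat regions, in contrast to a single fixed tolerance applied at every point.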

  16. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    Science.gov (United States)

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
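
    Approach (ii) above, the blood-alcohol case, can be sketched numerically: the lab's deviations from the proficiency-test consensus means supply both a bias and a scatter estimate. The proficiency-test values below are invented for illustration, and the k = 2 expanded-uncertainty convention is a common metrological choice, not one stated in the abstract:

```python
import numpy as np

# Hypothetical proficiency-test history (g/100 mL): the lab's reported
# results versus the participant consensus means for the same samples
lab_results = np.array([0.081, 0.079, 0.102, 0.148, 0.120, 0.090, 0.160])
consensus   = np.array([0.080, 0.080, 0.100, 0.150, 0.119, 0.092, 0.158])

diffs = lab_results - consensus
bias = diffs.mean()            # systematic component
sd = diffs.std(ddof=1)         # random component (sample standard deviation)

# Expanded uncertainty at coverage factor k = 2, combining bias and
# scatter in quadrature (a common convention, assumed here)
U = 2.0 * np.sqrt(bias**2 + sd**2)
print(U)
```

With enough proficiency rounds the estimate becomes empirical and concentration-spanning, which is exactly the appeal the abstract describes; with few rounds, approach (i) (interlaboratory precision taken directly) is the fallback.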

  17. Quantum Action Principle with Generalized Uncertainty Principle

    OpenAIRE

    Gu, Jie

    2013-01-01

    One of the common features of all promising candidates for quantum gravity is the existence of a minimal length scale, which naturally emerges with a generalized uncertainty principle, or equivalently a modified commutation relation. Schwinger's quantum action principle was modified to incorporate this modification and applied to the calculation of the kernel of a free particle, partly recovering the result previously obtained using the path integral.

  18. Radiotherapy for breast cancer: respiratory and set-up uncertainties

    International Nuclear Information System (INIS)

    Saliou, M.G.; Giraud, P.; Simon, L.; Fournier-Bidoz, N.; Fourquet, A.; Dendale, R.; Rosenwald, J.C.; Cosset, J.M.

    2005-01-01

    Adjuvant radiotherapy has been shown to significantly reduce locoregional recurrence, but this advantage is associated with increased cardiovascular and pulmonary morbidity. All uncertainties inherent to conformal radiation therapy must be identified in order to increase the precision of treatment; misestimation of these uncertainties increases the potential risk of geometrical misses with, as a consequence, underdosage of the tumor and/or overdosage of healthy tissues. Geometric uncertainties due to respiratory movements or set-up errors are well known. Two strategies have been proposed to limit their effect: quantification of these uncertainties, which are then taken into account in the final calculation of safety margins, and/or reduction of respiratory and set-up uncertainties by efficient immobilization or gating systems. Measured on portal films with two tangential fields, the CLD (central lung distance), defined as the distance between the deep field edge and the interior chest wall at the central axis, seems to be the best predictor of set-up uncertainties. Using the CLD, estimated mean set-up errors from the literature are 3.8 and 3.2 mm for the systematic and random errors, respectively. These depend partly on the type of immobilization device and could be reduced by the use of portal imaging systems. Furthermore, the breast is mobile during respiration, with motion amplitudes as high as 0.8 to 10 mm in the anteroposterior direction. Respiratory gating techniques, currently under evaluation, have the potential to reduce the effect of these movements. Each radiotherapy department should perform its own assessments and determine the geometric uncertainties with respect to the equipment used and its particular treatment practices. This paper is a review of the main geometric uncertainties in breast treatment, due to respiration and set-up, and solutions proposed to limit their impact. (author)
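
    For illustration, the mean systematic (3.8 mm) and random (3.2 mm) set-up errors quoted above can be folded into a CTV-to-PTV safety margin with the widely used van Herk recipe M = 2.5Σ + 0.7σ. That recipe is a standard choice in the margin literature, not one prescribed by this paper:

```python
# Sketch: combining the quoted set-up errors into a population margin.
# M = 2.5 * Sigma + 0.7 * sigma (van Herk recipe, assumed applicable here)
Sigma = 3.8   # systematic set-up error, mm (review's literature estimate)
sigma = 3.2   # random set-up error, mm

margin_mm = 2.5 * Sigma + 0.7 * sigma
print(margin_mm)   # CTV-to-PTV margin in mm (about 11.7)
```

The asymmetry of the recipe reflects the point made in the abstract: systematic errors shift the whole dose distribution and therefore demand much larger margins than random errors, which only blur it.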

  19. Radiation protection instruments based on tissue equivalent proportional counters: Part II of an international intercomparison

    International Nuclear Information System (INIS)

    Alberts, W.G.; Dietz, E.; Guldbakke, S.; Kluge, H.; Schumacher, H.

    1988-04-01

    This report describes the irradiation conditions and procedures of Part II of an international intercomparison of tissue-equivalent proportional counters used for radiation protection measurements. The irradiations took place in monoenergetic reference neutron fields produced by the research reactor and accelerator facilities of the PTB Braunschweig, in the range from thermal neutrons to 14.8 MeV. In addition, measurements were performed in 60Co and D2O-moderated 252Cf radiation fields. Prototype instruments from seven European groups were investigated. The results of the measurements are summarized and compared with the reference data of the irradiations. (orig.)

  20. The prediction of creep damage in Type 347 weld metal: part II creep fatigue tests

    International Nuclear Information System (INIS)

    Spindler, M.W.

    2005-01-01

    Calculations of creep damage under conditions of strain control are often carried out using either a time fraction approach or a ductility exhaustion approach. In Part I of this paper the rupture strength and creep ductility data for a Type 347 weld metal were fitted to provide the material properties that are used to calculate creep damage. Part II examines whether the time fraction approach or the ductility exhaustion approach gives the better predictions of creep damage in creep-fatigue tests on the same Type 347 weld metal. In addition, a new creep damage model, developed by removing some of the simplifying assumptions made in the ductility exhaustion approach, was used. This new creep damage model is a function of strain rate, stress and temperature, and was derived from creep and constant-strain-rate test data using a reverse modelling technique (see Part I of this paper). It is shown that the new creep damage model gives better predictions of creep damage in the creep-fatigue tests than the time fraction and ductility exhaustion approaches.
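
    The two damage measures compared above can be sketched for a single strain-controlled dwell: the time fraction integrates dt/t_r(σ) over the relaxing stress, while ductility exhaustion integrates the creep strain increment over the strain-rate-dependent ductility. The rupture-time and ductility laws below are hypothetical stand-ins, not the Part I fits for Type 347 weld metal:

```python
import numpy as np

def rupture_time(stress):
    """t_r(sigma) in hours -- hypothetical power-law stand-in."""
    return 1.0e16 * stress**-5.0

def creep_ductility(strain_rate):
    """eps_f(eps_dot) -- hypothetical rate-dependent ductility, clipped
    between lower- and upper-shelf values."""
    return np.clip(0.1 * strain_rate**0.3, 0.002, 0.2)

t = np.linspace(0.0, 10.0, 1001)                 # dwell time, h
stress = 180.0 * np.exp(-t / 20.0)               # relaxing stress, MPa (assumed)
creep_strain = 1e-4 * (1.0 - np.exp(-t / 5.0))   # accumulated creep strain (assumed)
strain_rate = np.gradient(creep_strain, t)       # creep strain rate, 1/h

dt = t[1] - t[0]
D_time = np.sum(dt / rupture_time(stress))                         # time fraction
D_duct = np.sum(strain_rate * dt / creep_ductility(strain_rate))   # ductility exhaustion
print(D_time, D_duct)
```

Summing either measure over all cycles and comparing against the observed creep-fatigue damage is, in essence, the comparison the paper performs; the new model replaces the fixed ductility law with a function of strain rate, stress and temperature.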

  1. A comprehensive review and update on the biologic treatment of adult noninfectious uveitis: part II.

    Science.gov (United States)

    Lee, Kyungmin; Bajwa, Asima; Freitas-Neto, Clovis A; Metzinger, Jamie Lynne; Wentworth, Bailey A; Foster, C Stephen

    2014-11-01

    Treatment of adult, noninfectious uveitis remains a major challenge for ophthalmologists around the world, especially in regard to recalcitrant cases. It is reported to comprise approximately 10% of preventable blindness in the USA. The cause of uveitis can be idiopathic or associated with infectious and systemic disorders. The era of biologic medical therapies provides new options for patients with otherwise treatment-resistant inflammatory eye disease. This two-part review gives a comprehensive overview of the existing medical treatment options for patients with adult, noninfectious uveitis, as well as important advances in the treatment of ocular inflammation. Part I covers classic immunomodulation and the latest information on corticosteroid therapy. In Part II, emerging therapies are discussed, including biologic response modifiers, experimental treatments and ongoing clinical studies for uveitis. The hazard of chronic corticosteroid use in the treatment of adult, noninfectious uveitis is well documented. Corticosteroid-sparing therapies, which offer a very favorable risk-benefit profile when administered properly, should be substituted. Although nothing is currently approved for on-label use in this indication, many therapies, through either translational or novel basic science research, have the potential to fill the currently exposed gaps.

  2. Preliminary Uncertainty Analysis for SMART Digital Core Protection and Monitoring System

    International Nuclear Information System (INIS)

    Koo, Bon Seung; In, Wang Kee; Hwang, Dae Hyun

    2012-01-01

    The Korea Atomic Energy Research Institute (KAERI) developed on-line digital core protection and monitoring systems, called SCOPS and SCOMS, as part of the SMART plant protection and monitoring system. SCOPS simplified the protection system by directly connecting the four RSPT signals to each core protection channel and eliminated the control element assembly calculator (CEAC) hardware. SCOMS adopted the DPCM3D method for synthesizing the core power distribution, instead of the Fourier expansion method used in conventional PWRs. The DPCM3D method produces a synthetic 3-D power distribution by coupling a neutronics code and measured in-core detector signals. An overall uncertainty analysis methodology, used to statistically combine the uncertainty components of the SMART core protection and monitoring systems, was developed. In this paper, preliminary overall uncertainty factors for SCOPS/SCOMS of the SMART initial core were evaluated by applying the newly developed uncertainty analysis method.

  3. Regeneration decisions in forestry under climate change related uncertainties and risks

    DEFF Research Database (Denmark)

    Schou, Erik; Thorsen, Bo Jellesmark; Jacobsen, Jette Bredahl

    2015-01-01

    Future climate development and its effects on forest ecosystems are not easily predicted or described in terms of standard probability concepts. Nevertheless, forest managers continuously make long-term decisions that will be subject to climate change impacts. The manager's assessment of possible... to generate a set of alternative outcomes, investigating effects on decision making of three aspects of uncertainty: (i) the perceived time horizon before there will be certainty on outcome, (ii) the spread of impacts across the set of alternative outcomes, and (iii) the subjective probability (belief) assigned to each outcome. Results show that the later a forest manager expects to obtain certainty about climate change or the more skewed their belief distribution, the more will decisions be based on ex ante assessments, suggesting that if forest managers believe that climate change uncertainty...

  4. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  5. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  6. On the uncertainty inequality as applied to discrete signals

    Directory of Open Access Journals (Sweden)

    Y. V. Venkatesh

    2006-01-01

    Full Text Available Given a continuous-time bandlimited signal, the Shannon sampling theorem provides an interpolation scheme for exactly reconstructing it from its discrete samples. We analyze the relationship between concentration (or compactness) in the temporal/spectral domains of the (i) continuous-time and (ii) discrete-time signals. The former is governed by the Heisenberg uncertainty inequality, which prescribes a lower bound on the product of effective temporal and spectral spreads of the signal. On the other hand, the discrete-time counterpart seems to exhibit some strange properties, and this provides the motivation for the present paper. We consider the following problem: for a bandlimited signal, can the uncertainty inequality be expressed in terms of the samples, using the standard definitions of the temporal and spectral spreads of the signal? In contrast with the results of the literature, we present a new approach to solve this problem. We also present a comparison of the results obtained using the proposed definitions with those available in the literature.
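
    The quantities in question can be explored numerically with the standard second-moment spreads evaluated directly on the samples. For a well-sampled Gaussian, the product of the spreads approaches the continuous-time Heisenberg bound 1/(4π) (with frequency in cycles per sample). This is an illustrative sketch using the standard definitions, not the paper's proposed alternative:

```python
import numpy as np

def spreads(x):
    """Effective temporal and spectral spreads of a discrete signal,
    using standard second-moment (variance) definitions."""
    n = np.arange(len(x))
    p = np.abs(x)**2 / np.sum(np.abs(x)**2)      # temporal "density"
    t0 = np.sum(n * p)
    var_t = np.sum((n - t0)**2 * p)
    X = np.fft.fft(x)
    f = np.fft.fftfreq(len(x))                   # cycles per sample
    q = np.abs(X)**2 / np.sum(np.abs(X)**2)      # spectral "density"
    f0 = np.sum(f * q)
    var_f = np.sum((f - f0)**2 * q)
    return np.sqrt(var_t), np.sqrt(var_f)

# Sampled Gaussian: close to the continuous-time minimum-uncertainty signal
n = np.arange(256)
x = np.exp(-0.5 * ((n - 128) / 16.0)**2)
dt_spread, df_spread = spreads(x)
print(dt_spread * df_spread)   # close to 1 / (4 * pi)
```

Narrowing the Gaussian relative to the sampling grid is an easy way to probe the "strange properties" the abstract alludes to: as the discrete signal approaches a single sample, the discrete spreads stop behaving like their continuous-time counterparts.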

  7. Forensic Uncertainty Quantification of Explosive Dispersal of Particles

    Science.gov (United States)

    Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho

    2017-06-01

    In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued by poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments and discover possible sources of uncertainty that may have been missed in the initial design of experiments or under-reported. In the authors' experience, approaching validation experiments in the manner of a crime scene investigation can yield valuable insights. One examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator, which helps remove possible implicit biases and increases the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.

  8. The uncertainty processing theory of motivation.

    Science.gov (United States)

    Anselme, Patrick

    2010-04-02

    Most theories describe motivation using basic terminology (drive, 'wanting', goal, pleasure, etc.) that fails to inform well about the psychological mechanisms controlling its expression. This leads to a conception of motivation as a mere psychological state 'emerging' from neurophysiological substrates. However, the involvement of motivation in a large number of behavioural parameters (triggering, intensity, duration, and directedness) and cognitive abilities (learning, memory, decision, etc.) suggests that it should be viewed as an information processing system. The uncertainty processing theory (UPT) presented here suggests that motivation is the set of cognitive processes allowing organisms to extract information from the environment by reducing uncertainty about the occurrence of psychologically significant events. This processing of information is shown to naturally result in the highlighting of specific stimuli. The UPT attempts to solve three major problems: (i) how motivations can affect behaviour and cognition so widely, (ii) how motivational specificity for objects and events can result from nonspecific neuropharmacological causal factors (such as mesolimbic dopamine), and (iii) how motivational interactions can be conceived in psychological terms, irrespective of their biological correlates. The UPT is in keeping with the conceptual tradition of the incentive salience hypothesis while trying to overcome the shortcomings inherent to this view. Copyright 2009 Elsevier B.V. All rights reserved.

  9. French RSE-M and RCC-MR code appendices for flaw analysis: Presentation of the fracture parameters calculation-Part II: Cracked plates

    International Nuclear Information System (INIS)

    Marie, S.; Chapuliot, S.; Kayser, Y.; Lacire, M.H.; Drubay, B.; Barthelet, B.; Le Delliou, P.; Rougier, V.; Naudin, C.; Gilles, P.; Triay, M.

    2007-01-01

    French nuclear codes include flaw assessment procedures: the RSE-M Code 'Rules for In-service Inspection of Nuclear Power Plant Components' and the RCC-MR code 'Design and Construction rules for mechanical components of FBR nuclear islands and high temperature applications'. An important effort of development of these analytical methods has been made for the last 10 years in the frame of a collaboration between CEA, EDF and AREVA-NP, and in the frame of R and D actions involving CEA and IRSN. These activities have led to a unification of the common methods of the two codes. The calculation of fracture mechanics parameters, and in particular the stress intensity factor K_I and the J integral, has been widely developed for industrial configurations. All the developments have been integrated in the 2005 edition of RSE-M and in the 2007 edition of RCC-MR. This series of articles is composed of 5 parts: the first part presents an overview of the methods proposed in the RCC-MR and RSE-M codes. Parts II-IV provide compendia for specific components. The geometries are plates (part II), pipes (part III) and elbows (part IV). Finally, part V presents the validation elements of the methods, with details on the process followed for the development and evaluation of the accuracy of the proposed analytical methods. This second article in the series presents all details for the stress intensity factor and J calculations for cracked plates. General data applicable to all defect geometries are first presented, followed by the available defect geometries for which compendia for K_I and σ_ref calculation are provided.
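As an illustration of the kind of closed-form K_I solution such compendia tabulate, the sketch below evaluates the classic Brown-Srawley polynomial for a single-edge through-crack in a finite-width plate under remote tension. This is a textbook handbook formula used purely as an example; it is not taken from the RSE-M or RCC-MR compendia themselves.

```python
from math import sqrt, pi

def K_I_sent(sigma, a, W):
    """Mode-I stress intensity factor for a single-edge through-crack in a
    finite-width plate under remote tension (Brown-Srawley polynomial,
    valid for a/W <= 0.6). Illustrative handbook formula, not the code
    compendium. sigma [MPa], a and W [m]; returns K_I in MPa*sqrt(m)."""
    r = a / W
    F = 1.12 - 0.231 * r + 10.55 * r**2 - 21.72 * r**3 + 30.39 * r**4
    return F * sigma * sqrt(pi * a)

# 100 MPa remote stress, 10 mm crack in a 100 mm wide plate:
print(K_I_sent(100.0, 0.01, 0.1))   # ~21 MPa*sqrt(m)
```

As expected, K_I grows with crack depth, and for a/W → 0 the geometry factor F tends to the edge-crack limit 1.12.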

  10. Experimental programme and analysis, ZENITH II, Core 4

    Energy Technology Data Exchange (ETDEWEB)

    Ingram, G.; Sanders, J. E.; Sherwin, J.

    1974-10-15

    The Phase 3 program of reactor physics experiments on the HTR (or Mk 3 GCR) lattices continued during the first half of 1974 with a study of a series of critical builds in Zenith II aimed at testing predictions of shut-down margins in the local criticality situations arising during power reactor refueling. The paper describes the experimental program and the subsequent theoretical analysis using methods developed in the United Kingdom for calculating low-enriched uranium HTR fuel systems. The importance of improving the accuracy of predictions of shut-down margins arises from the basic requirement that the core in its most reactive condition and with a specified number of absorbers removed from the array must remain sub-critical with a margin adequate to cover the total uncertainty of +/- 1 Nile (that is, 1 % delta-k). The major uncertainty is that in modelling the complex fuel/absorber configuration, and this is the aspect essentially covered in the Zenith II Core 4 studies.

  11. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  12. Compatibility analysis of DUPIC fuel (Part II) - Reactor physics design and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Chang Joon; Choi, Hang Bok; Rhee, Bo Wook; Roh, Gyu Hong; Kim, Do Hun [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    The compatibility analysis of the DUPIC fuel in a CANDU reactor has been assessed. This study includes the fuel composition adjustment, comparison of lattice properties, performance analysis of reactivity devices, determination of regional over-power (ROP) trip setpoint, and uncertainty estimation of core performance parameters. For the DUPIC fuel composition adjustment, three options have been proposed, which can produce uniform neutronic characteristics of the DUPIC fuel. The lattice analysis has shown that the characteristics of the DUPIC fuel is compatible with those of natural uranium fuel. The reactivity devices of the CANDU-6 reactor maintain their functional requirements even for the DUPIC fuel system. The ROP analysis has shown that the trip setpoint is not sacrificed for the DUPIC fuel system owing to the power shape that enhances more thermal margin. The uncertainty analysis of the core performance parameter has shown that the uncertainty associated with the fuel composition variation is reduced appreciably, which is primarily due to the fuel composition adjustment and secondly the on-power refueling feature and spatial control function of the CANDU reactor. The reactor physics calculation has also shown that it is feasible to use spent PWR fuel directly in CANDU reactors without deteriorating the CANDU-6 core physics design requirements. 29 refs., 67 figs., 60 tabs. (Author)

  13. Terminal altitude maximization for Mars entry considering uncertainties

    Science.gov (United States)

    Cui, Pingyuan; Zhao, Zeduan; Yu, Zhengshi; Dai, Juan

    2018-04-01

    Uncertainties present in the Mars atmospheric entry process may cause state deviations from the nominal designed values, which will lead to unexpected performance degradation if the trajectory is designed merely based on the deterministic dynamic model. In this paper, a linear covariance based entry trajectory optimization method is proposed considering the uncertainties present in the initial states and parameters. By extending the elements of the state covariance matrix as augmented states, the statistical behavior of the trajectory is captured to reformulate the performance metrics and path constraints. The optimization problem is solved by the GPOPS-II toolbox in the MATLAB environment. Monte Carlo simulations are also conducted to demonstrate the capability of the proposed method. Primary trade-offs between the nominal deployment altitude and its dispersion can be observed by modulating the weights on the dispersion penalty, and a compromised result referring to maximizing the 3σ lower bound of the terminal altitude is achieved. The resulting path constraints also show better satisfaction in a disturbed environment compared with the nominal situation.
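The covariance-propagation idea described above can be sketched in a few lines: propagate the state covariance through the linearised dynamics, then penalise the terminal dispersion via a 3σ lower bound. All matrices and numbers below are illustrative placeholders, not the paper's entry dynamics.

```python
import numpy as np

# Hypothetical linearised dynamics (illustrative values only):
# state x = [altitude, velocity]; A is the per-step state-transition Jacobian.
A = np.array([[1.0, -0.1],
              [0.0,  0.98]])
P = np.diag([100.0, 4.0])     # initial covariance: 10 m, 2 m/s (1-sigma)

for _ in range(50):           # propagate covariance along the trajectory
    P = A @ P @ A.T

h_nom = 10_000.0              # nominal terminal altitude, m (illustrative)
sigma_h = np.sqrt(P[0, 0])    # 1-sigma terminal altitude dispersion
J = h_nom - 3.0 * sigma_h     # the 3-sigma lower bound used as objective
print(sigma_h, J)
```

Maximising J rather than h_nom alone trades nominal altitude against its dispersion, which is the compromise the paper reports.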

  14. Managing project risks and uncertainties

    Directory of Open Access Journals (Sweden)

    Mike Mentis

    2015-01-01

    Full Text Available This article considers threats to a project slipping on budget, schedule and fit-for-purpose. Threat is used here as the collective for risks (quantifiable bad things that can happen) and uncertainties (poorly or not quantifiable bad possible events). Based on experience with projects in developing countries this review considers that (a) project slippage is due to uncertainties rather than risks, (b) while eventuation of some bad things is beyond control, managed execution and oversight are still the primary means to keeping within budget, on time and fit-for-purpose, (c) improving project delivery is less about bigger and more complex and more about coordinated focus, effectiveness and developing thought-out heuristics, and (d) projects take longer and cost more partly because threat identification is inaccurate, the scope of identified threats is too narrow, and the threat assessment product is not integrated into overall project decision-making and execution. Almost by definition, what is poorly known is likely to cause problems. Yet it is not just the unquantifiability and intangibility of uncertainties causing project slippage, but that they are insufficiently taken into account in project planning and execution that cause budget and time overruns. Improving project performance requires purpose-driven and managed deployment of scarce seasoned professionals. This can be aided with independent oversight by deeply experienced panelists who contribute technical insights and can potentially show that diligence is seen to be done.

  15. Digital logic circuit design with ALTERA MAX+PLUS II

    International Nuclear Information System (INIS)

    Lee, Seung Ho; Park, Yong Su; Park, Gun Jong; Lee, Ju Heon

    2006-09-01

    This book is composed of five parts. The first part has introduction of ALTERA MAX+PLUS II and graphic editor, text editor, compiler, waveform editor simulator and timing analyzer of it. The second part is about direction of digital logic circuit design with training kit. The third part has grammar and practice of VHDL in ALTERA MAX+PLUS II including example and history of VHDL. The fourth part shows the design example of digital logic circuit by VHDL of ALTERA MAX+PLUS II which lists designs of adder and subtractor, code converter, counter, state machine and LCD module. The last part explains design example of digital logic circuit by graphic editor in ALTERA MAX+PLUS II.

  16. Uncertainty associated with the gravimetric measurement of particulate matter concentration in ambient air.

    Science.gov (United States)

    Lacey, Ronald E; Faulkner, William Brock

    2015-07-01

    As particulate matter (PM) concentrations approach regulatory limits, knowledge of the measurement uncertainty is essential in determining the sample size and the probability of type II errors in hypothesis testing. This is an important factor in determining if ambient PM concentrations exceed regulatory limits. The technique described in this paper can be applied to other measurement systems and is especially useful where there are no methods available to generate these values empirically.
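The link between measurement uncertainty, sample size and type II error can be illustrated with a one-sided z-test power calculation. This is a normal-approximation sketch; the function name and default error rates are ours, not the paper's.

```python
from math import ceil
from statistics import NormalDist

def samples_needed(sigma, delta, alpha=0.05, beta=0.20):
    """Number of samples needed so that a mean exceedance of `delta` above
    the regulatory limit is detected with type I error alpha and type II
    error beta (one-sided z-test, normal approximation; a sketch, not the
    paper's full uncertainty budget)."""
    z = NormalDist().inv_cdf(1 - alpha) + NormalDist().inv_cdf(1 - beta)
    return ceil((z * sigma / delta) ** 2)

# e.g. measurement std 5 ug/m3, exceedance to detect 3 ug/m3:
print(samples_needed(5.0, 3.0))   # -> 18
```

Halving the detectable exceedance roughly quadruples the required sample size, which is why realistic uncertainty estimates matter near the limit.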

  17. Transposing an active fault database into a fault-based seismic hazard assessment for nuclear facilities - Part 2: Impact of fault parameter uncertainties on a site-specific PSHA exercise in the Upper Rhine Graben, eastern France

    Science.gov (United States)

    Chartier, Thomas; Scotti, Oona; Clément, Christophe; Jomard, Hervé; Baize, Stéphane

    2017-09-01

    We perform a fault-based probabilistic seismic hazard assessment (PSHA) exercise in the Upper Rhine Graben to quantify the relative influence of fault parameters on the hazard at the Fessenheim nuclear power plant site. Specifically, we show that the potentially active faults described in the companion paper (Jomard et al., 2017, hereafter Part 1) are the dominant factor in hazard estimates at the low annual probability of exceedance relevant for the safety assessment of nuclear installations. Geological information documenting the activity of the faults in this region, however, remains sparse, controversial and affected by a high degree of uncertainty. A logic tree approach is thus implemented to explore the epistemic uncertainty and quantify its impact on the seismic hazard estimates. Disaggregation of the peak ground acceleration (PGA) hazard at a 10 000-year return period shows that the Rhine River fault is the main seismic source controlling the hazard level at the site. Sensitivity tests show that the uncertainty on the slip rate of the Rhine River fault is the dominant factor controlling the variability of the seismic hazard level, greater than the epistemic uncertainty due to ground motion prediction equations (GMPEs). Uncertainty on slip rate estimates from 0.04 to 0.1 mm yr-1 results in a 40 to 50 % increase in hazard levels at the 10 000-year target return period. Reducing epistemic uncertainty in future fault-based PSHA studies at this site will thus require (1) performing in-depth field studies to better characterize the seismic potential of the Rhine River fault; (2) complementing GMPEs with more physics-based modelling approaches to better account for the near-field effects of ground motion and (3) improving the modelling of the background seismicity. Indeed, in this exercise, we assume that the background seismicity can only host earthquakes up to M 6.0; faults that could host larger (M > 6.0) earthquakes have been recently identified at depth within the Upper Rhine Graben (see Part 1) but are not accounted for.

  18. Condition trees as a mechanism for communicating the meaning of uncertainties

    Science.gov (United States)

    Beven, Keith

    2015-04-01

    Uncertainty communication for environmental problems is fraught with difficulty for good epistemic reasons. The fact that most sources of uncertainty are subject to, and often dominated by, epistemic uncertainties means that the unthinking use of probability theory might actually be misleading and lead to false inference (even in some cases where the assumptions of a probabilistic error model might seem to be reasonably valid). This therefore creates problems in communicating the meaning of probabilistic uncertainties of model predictions to potential users (there are many examples in hydrology, hydraulics, climate change and other domains). It is suggested that one way of being more explicit about the meaning of uncertainties is to associate each type of application with a condition tree of assumptions that need to be made in producing an estimate of uncertainty. The condition tree then provides a basis for discussion and communication of assumptions about uncertainties with users. Agreement of assumptions (albeit generally at some institutional level) will provide some buy-in on the part of users, and a basis for commissioning of future studies. Even in some relatively well-defined problems, such as mapping flood risk, such a condition tree can be rather extensive, but by making each step in the tree explicit then an audit trail is established for future reference. This can act to provide focus in the exercise of agreeing more realistic assumptions.

  19. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    ... more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point of view. This includes: (i) parameter estimation using ... The comparison of model prediction uncertainties with the reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column ... the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce the most critical uncertainties ...
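The uncertainty of a property estimated from a regressed group-contribution model is commonly obtained by linear propagation of the parameter covariance through the model. A minimal sketch, with invented group counts, contributions and covariance purely for illustration (not the record's actual models or data):

```python
import numpy as np

# Hypothetical linear GC model: property = sum(n_i * C_i).
n = np.array([2.0, 1.0, 1.0])        # group occurrence counts (illustrative)
C = np.array([0.5, 0.3, 1.2])        # regressed group contributions
covC = np.diag([0.01, 0.02, 0.05])   # parameter covariance from regression

prop = n @ C                          # point prediction
var = n @ covC @ n                    # linear propagation: J Cov J^T, J = n
print(prop, np.sqrt(var))             # prediction and its standard uncertainty
```

The resulting standard uncertainty is the quantity that would then be carried into the design of unit operations to assess risk/safety factors.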

  20. Uncertainties and demonstration of compliance with numerical risk standards

    International Nuclear Information System (INIS)

    Preyssl, C.; Cullingford, M.C.

    1987-01-01

    When dealing with numerical results of a probabilistic risk analysis performed for a complex system, such as a nuclear power plant, one major objective may be to deal with the problem of compliance or non-compliance with a prefixed risk standard. The uncertainties in the risk results associated with the consequences and their probabilities of occurrence may be considered by representing the risk as a risk band. Studying the area and distance between the upper and lower bound of the risk band provides consistent information on the uncertainties in terms of risk, not by means of scalars only but also by real functions. Criteria can be defined for determining compliance with a numerical risk standard, and the 'weighting functional' method, representing a possible tool for testing compliance of risk results, is introduced. By shifting the upper confidence bound due to redefinition, part of the risk band may exceed the standard without changing the underlying results. Using the concept described it is possible to determine the amount of risk, i.e. uncertainty, exceeding the standard. The mathematical treatment of uncertainties therefore allows probabilistic risk assessment results to be compared. A realistic example illustrates the method. (author)

  1. Linear systems with unstructured multiplicative uncertainty: Modeling and robust stability analysis.

    Directory of Open Access Journals (Sweden)

    Radek Matušů

    Full Text Available This article deals with continuous-time Linear Time-Invariant (LTI) Single-Input Single-Output (SISO) systems affected by unstructured multiplicative uncertainty. More specifically, its aim is to present an approach to the construction of uncertain models based on the appropriate selection of a nominal system and a weight function, and to apply the fundamentals of robust stability investigation for the considered sort of systems. The initial theoretical parts are followed by three extensive illustrative examples in which first order time-delay, second order and third order plants with parametric uncertainty are modeled as systems with unstructured multiplicative uncertainty; subsequently, the robust stability of selected feedback loops containing the constructed models and chosen controllers is analyzed and the obtained results are discussed.
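The robust stability test used for this class of systems is the standard small-gain condition for unstructured multiplicative uncertainty: the closed loop is robustly stable if and only if |W2(jω)T(jω)| < 1 for all ω, where T is the nominal complementary sensitivity and W2 the uncertainty weight. A numerical sketch with an illustrative first-order plant, proportional controller and weight (not the article's examples):

```python
import numpy as np

w = np.logspace(-2, 3, 2000)   # frequency grid [rad/s]
s = 1j * w

G = 1.0 / (s + 1.0)            # nominal first-order plant (illustrative)
K = 5.0                        # proportional controller
L = G * K
T = L / (1.0 + L)              # complementary sensitivity
W2 = 0.2 * (s + 1.0) / (0.1 * s + 1.0)   # multiplicative uncertainty weight

peak = np.max(np.abs(W2 * T))  # sup over the grid of |W2(jw) T(jw)|
print(peak, peak < 1.0)        # robustly stable if the peak is below 1
```

Here the peak stays well below 1, so this nominal loop tolerates the modeled multiplicative uncertainty with margin.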

  2. Compósitos de borracha natural ou policloropreno e celulose II: influência do tamanho de partícula Natural rubber or chloroprene rubber and cellulose II composites: influence of particle size

    Directory of Open Access Journals (Sweden)

    Bruno de A. Napolitano

    2004-01-01

    Full Text Available The aim of this work was to develop light composites with properties of technological interest by using elastomers of different polarities. This was achieved by employing cellulose II, in powder form, as a filler in natural rubber (NR) and chloroprene rubber (CR). Cellulose II was obtained by coagulation of a cellulose xanthate solution in acid medium, under constant stirring and at room temperature, which represents, to our knowledge, a new way of obtaining this type of filler. Composites of NR or CR with 10 phr of cellulose II were prepared, with the filler particle size as the variable. The mechanical properties and microscopic aspects of the different composites were evaluated and compared with those of the unfilled compounds. The best results were obtained for the CR composite, influenced by the polarity of the elastomeric matrix and by the filler particle size, a consequence of the milling conditions used.

  3. A Nondominated Genetic Algorithm Procedure for Multiobjective Discrete Network Design under Demand Uncertainty

    Directory of Open Access Journals (Sweden)

    Bian Changzhi

    2015-01-01

    Full Text Available This paper addresses the multiobjective discrete network design problem under demand uncertainty. The OD travel demands are supposed to be random variables with a given probability distribution. The problem is formulated as a bilevel stochastic optimization model where the decision maker's objective is to minimize the construction cost, the expectation, and the standard deviation of total travel time simultaneously, and the user's route choice is described using a user equilibrium model on the improved network under all scenarios of uncertain demand. The proposed model generates globally near-optimal Pareto solutions for network configurations based on Monte Carlo simulation and the nondominated sorting genetic algorithm II (NSGA-II). Numerical experiments implemented on the Nguyen-Dupuis test network show that trade-offs among construction cost and the expectation and standard deviation of total travel time under uncertainty are obvious. Investment in transportation facilities is an efficient method to improve network performance and reduce risk under demand uncertainty, but it has an obvious marginal decreasing effect.
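The two stochastic objectives (expectation and standard deviation of total travel time) are estimated by Monte Carlo over demand scenarios. The sketch below replaces the user-equilibrium assignment with a single BPR-style link purely to show the sampling structure; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def total_travel_time(demand, capacity=1000.0, t0=10.0):
    """BPR-style travel time aggregated over one link (an illustrative
    stand-in for the equilibrium assignment solved in the paper)."""
    return demand * t0 * (1.0 + 0.15 * (demand / capacity) ** 4)

# OD demand as a random variable (normal, illustrative parameters)
demand = rng.normal(800.0, 100.0, size=10_000)
ttt = total_travel_time(demand)

mean, std = ttt.mean(), ttt.std()
print(mean, std)   # the two stochastic objectives alongside construction cost
```

In the full model these two statistics, together with construction cost, form the three objectives ranked by NSGA-II's nondominated sorting.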

  4. Supporting Qualified Database for Uncertainty Evaluation

    International Nuclear Information System (INIS)

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; Lisovyy, O.; D'Auria, F.

    2013-01-01

    Uncertainty evaluation constitutes a key feature of BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging from one side suitable experimental data and on the other side qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can be used also in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentations, the data acquisition system, the evaluation of pressure losses, the physical properties of the material and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4.
The EH (Engineering Handbook) of the input nodalization

  5. Supporting qualified database for uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; D'Auria, F. [Nuclear Research Group of San Piero A Grado, Univ. of Pisa, Via Livornese 1291, 56122 Pisa (Italy)

    2012-07-01

    Uncertainty evaluation constitutes a key feature of BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging from one side suitable experimental data and on the other side qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can be used also in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentations, the data acquisition system, the evaluation of pressure losses, the physical properties of the material and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The EH (Engineering

  6. PDF4LHC recommendations for LHC Run II

    NARCIS (Netherlands)

    Butterworth, Jon; Carrazza, Stefano; Cooper-Sarkar, Amanda; Roeck, Albert de; Feltesse, Joel; Forte, Stefano; Gao, Jun; Glazov, Sasha; Huston, Joey; Kassabov, Zahari; McNulty, Ronan; Morsch, Andreas; Nadolsky, Pavel; Radescu, Voica; Rojo, Juan; Thorne, Robert S.

    2015-01-01

    We provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+$\alpha_s$ uncertainties suitable for applications at the LHC Run II. We review developments since the previous PDF4LHC recommendation, and discuss and compare the new

  7. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    Science.gov (United States)

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and the performance of urban drainage systems on both rainfall event and yearly time scales is usually based on online measurements of flow rates and on samples of influent/effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques in order to (i) calibrate sensors and analytical methods, (ii) validate raw data, (iii) evaluate measurement uncertainties, and (iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
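The law of propagation of uncertainty behind figures such as "25-35% for TSS loads" can be reproduced for a load L = C x V with independent inputs; this first-order sketch is ours, not the paper's full uncertainty budget.

```python
from math import sqrt

def rel_uncertainty_load(u_conc_rel, u_vol_rel):
    """Relative standard uncertainty of a pollutant load L = C * V,
    assuming independent inputs (first-order law of propagation of
    uncertainty; a sketch, not the paper's complete budget)."""
    return sqrt(u_conc_rel**2 + u_vol_rel**2)

# Using orders of magnitude quoted in the abstract: ~8 % on volumes and
# ~30 % on TSS concentrations:
print(rel_uncertainty_load(0.30, 0.08))   # ~0.31, i.e. ~31 % on the load
```

The concentration term dominates, which is consistent with loads and concentrations sharing the same 25-35% bracket in the abstract.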

  8. An introductory guide to uncertainty analysis in environmental and health risk assessment

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Hammonds, J.S.

    1992-10-01

    To compensate for the potential for overly conservative estimates of risk using standard US Environmental Protection Agency methods, an uncertainty analysis should be performed as an integral part of each risk assessment. Uncertainty analyses allow one to obtain quantitative results in the form of confidence intervals that will aid in decision making and will provide guidance for the acquisition of additional data. To perform an uncertainty analysis, one must frequently rely on subjective judgment in the absence of data to estimate the range and a probability distribution describing the extent of uncertainty about a true but unknown value for each parameter of interest. This information is formulated from professional judgment based on an extensive review of literature, analysis of the data, and interviews with experts. Various analytical and numerical techniques are available to allow statistical propagation of the uncertainty in the model parameters to a statement of uncertainty in the risk to a potentially exposed individual. Although analytical methods may be straightforward for relatively simple models, they rapidly become complicated for more involved risk assessments. Because of the tedious efforts required to mathematically derive analytical approaches to propagate uncertainty in complicated risk assessments, numerical methods such as Monte Carlo simulation should be employed. The primary objective of this report is to provide an introductory guide for performing uncertainty analysis in risk assessments being performed for Superfund sites
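The Monte Carlo propagation advocated above reduces to sampling each input from its (often subjectively judged) distribution and reading confidence bounds off the output sample. A minimal sketch with a hypothetical multiplicative exposure model; parameter names and distributions are illustrative, not EPA defaults.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical dose model: dose = C * IR * EF / BW (illustrative names;
# each distribution encodes judgment about the corresponding input).
C  = rng.lognormal(mean=0.0, sigma=0.5, size=N)   # concentration
IR = rng.normal(2.0, 0.2, size=N)                 # intake rate
EF = rng.uniform(0.5, 1.0, size=N)                # exposure frequency
BW = rng.normal(70.0, 10.0, size=N)               # body weight

dose = C * IR * EF / BW
lo, hi = np.percentile(dose, [2.5, 97.5])
print(lo, hi)    # 95 % subjective confidence interval on the dose
```

The resulting interval is the kind of quantitative statement of uncertainty the report recommends in place of a single conservative point estimate.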

  9. EL español andino. II parte

    Directory of Open Access Journals (Sweden)

    Rubén Arboleda Toro

    2002-01-01

    Full Text Available The first part of this study on Andean Spanish was published in issue 13 of this journal (November 2000). We now present a second part covering historical and geographical aspects of Andean Nariño and Putumayo, the region of Colombia where this variety is spoken, together with a general description of its linguistic situation. We expect the description of the dialectal features of Andean Spanish, the core of the work, and the presentation of the methodology and corpus to be the subject of a further publication, on which we are currently working. We nevertheless include a broader inventory of features than the one presented in the first part; for now, however, it is just that, an illustrative inventory, not the analysis we are engaged in within the framework of language contact, linguistic change, and the relation between the norm and the possibilities of the system. To put this second part in context we include, by way of introduction, a summary of the first.

  10. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    Science.gov (United States)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
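
    The variance decomposition described above can be illustrated with a simple two-factor ANOVA on synthetic data; the additive effect model and effect sizes below are assumptions for the sketch, not the study's actual simulations:

```python
import numpy as np

rng = np.random.default_rng(0)
n_forcing, n_params = 10, 8   # rainfall ensemble members x behavioural parameter sets

# Synthetic streamflow output: additive forcing and parameter effects plus
# interaction/noise (purely illustrative magnitudes)
forcing_effect = rng.normal(0, 1.0, size=(n_forcing, 1))
param_effect = rng.normal(0, 0.5, size=(1, n_params))
noise = rng.normal(0, 0.2, size=(n_forcing, n_params))
sim = 5.0 + forcing_effect + param_effect + noise

grand = sim.mean()
ss_total = ((sim - grand) ** 2).sum()
ss_forcing = n_params * ((sim.mean(axis=1) - grand) ** 2).sum()
ss_params = n_forcing * ((sim.mean(axis=0) - grand) ** 2).sum()
ss_inter = ss_total - ss_forcing - ss_params   # interaction + residual

for name, ss in [("forcing", ss_forcing), ("parameters", ss_params), ("interaction", ss_inter)]:
    print(f"{name:12s} {100 * ss / ss_total:5.1f}% of variance")
```

    Summing the three components recovers the total variance, which is what lets the study attribute shares of the overall uncertainty to each source.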

  11. Application Of Global Sensitivity Analysis And Uncertainty Quantification In Dynamic Modelling Of Micropollutants In Stormwater Runoff

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    …of uncertainty in a conceptual lumped dynamic stormwater runoff quality model that is used in a study catchment to estimate (i) copper loads, (ii) compliance with dissolved Cu concentration limits on stormwater discharge and (iii) the fraction of Cu loads potentially intercepted by a planned treatment facility…

  12. Optical photon transport in powdered-phosphor scintillators. Part II. Calculation of single-scattering transport parameters

    Energy Technology Data Exchange (ETDEWEB)

    Poludniowski, Gavin G. [Joint Department of Physics, Division of Radiotherapy and Imaging, Institute of Cancer Research and Royal Marsden NHS Foundation Trust, Downs Road, Sutton, Surrey SM2 5PT, United Kingdom and Centre for Vision Speech and Signal Processing (CVSSP), Faculty of Engineering and Physical Sciences, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom); Evans, Philip M. [Centre for Vision Speech and Signal Processing (CVSSP), Faculty of Engineering and Physical Sciences, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom)

    2013-04-15

    Purpose: Monte Carlo methods based on the Boltzmann transport equation (BTE) have previously been used to model light transport in powdered-phosphor scintillator screens. Physically motivated guesses or, alternatively, the complexities of Mie theory have been used by some authors to provide the necessary inputs of transport parameters. The purpose of Part II of this work is to: (i) validate predictions of the modulation transfer function (MTF) using the BTE and calculated values of transport parameters, against experimental data published for two Gd2O2S:Tb screens; (ii) investigate the impact of size distribution and emission spectrum on Mie predictions of transport parameters; (iii) suggest simpler and novel geometrical optics-based models for these parameters and compare them to the predictions of Mie theory. A computer code package called phsphr is made available that allows the MTF predictions for the screens modeled to be reproduced and novel screens to be simulated. Methods: The transport parameters of interest are the scattering efficiency (Q_sct), absorption efficiency (Q_abs), and the scatter anisotropy (g). Calculations of these parameters are made using the analytic method of Mie theory, for spherical grains of radii 0.1-5.0 μm. The sensitivity of the transport parameters to emission wavelength is investigated using an emission spectrum representative of that of Gd2O2S:Tb. The impact of a grain-size distribution in the screen on the parameters is investigated using a Gaussian size distribution (σ = 1%, 5%, or 10% of mean radius). Two simple and novel alternative models to Mie theory are suggested: a geometrical optics and diffraction model (GODM) and an extension of this (GODM+). Comparisons to measured MTF are made for two commercial screens: Lanex Fast Back and Lanex Fast Front (Eastman Kodak Company, Inc.). Results: The Mie theory predictions of transport parameters were shown to be highly sensitive to both grain size…
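
    As a rough illustration of the geometrical/diffraction-style alternatives to full Mie theory, the sketch below uses van de Hulst's anomalous diffraction approximation for the extinction efficiency. This is a stand-in chosen for brevity, not the paper's GODM or GODM+ model, and the relative refractive index used is illustrative:

```python
import math

def q_ext_ada(radius_um: float, wavelength_um: float, m: float) -> float:
    """Extinction efficiency via van de Hulst's anomalous diffraction
    approximation (best for m near 1 and large size parameter); a simple
    stand-in for full Mie theory, not the paper's GODM model."""
    x = 2 * math.pi * radius_um / wavelength_um   # size parameter
    rho = 2 * x * (m - 1)                          # phase-shift parameter
    return 2 - (4 / rho) * math.sin(rho) + (4 / rho**2) * (1 - math.cos(rho))

# Grain radii spanning the paper's 0.1-5.0 um range, near a green emission line
for r in (0.5, 1.0, 5.0):
    print(f"r = {r:3.1f} um  Q_ext ~ {q_ext_ada(r, 0.545, 1.1):.3f}")
```

    For large grains the approximation tends to the geometrical-optics limit Q_ext → 2, which is one reason simple models can compete with Mie theory at phosphor grain sizes well above the wavelength.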

  13. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part II: Application to Partial Differential Equations

    Directory of Open Access Journals (Sweden)

    Roger P. Pawlowski

    2012-01-01

    Full Text Available A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.

  14. Doppler reactivity uncertainties and their effect upon a hypothetical LOF accident

    International Nuclear Information System (INIS)

    Malloy, D.J.

    1976-01-01

    The statistical uncertainties and the major methodological errors which contribute to the Doppler feedback uncertainty were reviewed and investigated. Improved estimates for the magnitudes of each type of uncertainty were established. The generally applied reactivity feedback methodology has been extended by explicitly treating the coupling effect which exists between the various feedback components. The improved methodology was specifically applied to the coupling of Doppler and sodium void reactivities. In addition, the description of the temperature dependence of the Doppler feedback has been improved by the use of a two-constant formula on a global and regional basis. Feedback and coupling coefficients are presented as a first comparison of the improved and the currently applied methods. Further, the energy release which results from hypothetical disassembly accidents was simulated with a special response surface in the parametric safety evaluation code PARSEC. The impact of the improved feedback methodology and of Doppler coefficient uncertainties was illustrated by the usual parametric relationship between available work-energy and the Doppler coefficient. The work-energy was calculated with the VENUS-II disassembly code and was represented as a response surface in PARSEC. Additionally, the probability distribution for available work-energy, which results from the statistical uncertainty of the Doppler coefficient, was calculated for the current and the improved feedback methodology. The improved feedback description yielded about a 16 percent higher average value for the work-energy. A substantially larger increase is found on the high-yield end of the spectrum: the probability for work-energy above 500 MJ was increased by about a factor of ten

  15. Methodology for the assessment of measuring uncertainties of articulated arm coordinate measuring machines

    International Nuclear Information System (INIS)

    Romdhani, Fekria; Hennebelle, François; Ge, Min; Juillion, Patrick; Fontaine, Jean François; Coquet, Richard

    2014-01-01

    Articulated Arm Coordinate Measuring Machines (AACMMs) have gradually evolved and are increasingly used in the mechanical industry. At present, measurement uncertainties relating to the use of these devices are not yet well quantified. The work carried out consists of determining the measurement uncertainties of a mechanical part measured by an AACMM. The studies aiming to develop a model of measurement uncertainty are based on the Monte Carlo method developed in Supplement 1 of the Guide to the Expression of Uncertainty in Measurement [1], but also on identifying and characterizing the main sources of uncertainty. A multi-level Monte Carlo approach has been developed that characterizes the possible evolution of the AACMM during the measurement and, at a second level, quantifies the uncertainty of the considered measurand. The first Monte Carlo level is the most complex and is thus divided into three sub-levels, namely characterization of the positioning error of a point, estimation of calibration errors and evaluation of fluctuations of the ‘localization point’. The global method is presented and the results of the first sub-level are developed in particular detail. The main sources of uncertainty, including AACMM deformations, are exposed. (paper)
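
    A single level of such a Monte Carlo evaluation can be sketched for a simplified planar arm; the geometry, error magnitudes and error model below are hypothetical illustrations, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Nominal planar arm: three link lengths (mm) and joint angles (rad) -- made up
lengths = np.array([400.0, 350.0, 150.0])
angles = np.array([0.3, -0.5, 0.8])

# Assumed error sources: encoder noise (10 urad) and link-length calibration
# error (5 um); both purely illustrative magnitudes
ang = angles + rng.normal(0, 10e-6, size=(n, 3))
lng = lengths + rng.normal(0, 5e-3, size=(n, 3))

cum = np.cumsum(ang, axis=1)              # absolute orientation of each link
x = (lng * np.cos(cum)).sum(axis=1)       # probe-tip coordinates, mm
y = (lng * np.sin(cum)).sum(axis=1)

# Standard uncertainty of the probed point position, reported in micrometres
print(f"u(x) = {x.std(ddof=1) * 1e3:.1f} um, u(y) = {y.std(ddof=1) * 1e3:.1f} um")
```

    In the paper's multi-level scheme, a first level like this would feed the dispersion of point positions into a second level that propagates it to the measurand (e.g., a diameter or distance).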

  16. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is the incentive to replace the conservative requirements for code application by a ‘best estimate’ concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced ‘best estimate’ thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limits of the uncertainty ranges.
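
    Sampling-based methods of this kind commonly size their number of code runs with Wilks' order-statistics formula; a minimal sketch, assuming the standard first-order one- and two-sided formulae, is:

```python
def wilks_runs(beta: float, gamma: float, two_sided: bool = False) -> int:
    """Smallest number of code runs N such that the sample extremes bound a
    fraction beta of the output population with confidence gamma (Wilks'
    formula, first order)."""
    n = 1
    while True:
        if two_sided:
            conf = 1 - beta**n - n * (1 - beta) * beta**(n - 1)
        else:
            conf = 1 - beta**n
        if conf >= gamma:
            return n
        n += 1

print(wilks_runs(0.95, 0.95))                  # one-sided 95/95 -> 59 runs
print(wilks_runs(0.95, 0.95, two_sided=True))  # two-sided 95/95 -> 93 runs
```

    The familiar 59-run and 93-run figures for 95%/95% statements follow directly, independent of how many uncertain input parameters are sampled.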

  17. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    Science.gov (United States)

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search, and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  18. Defining Allowable Physical Property Variations for High Accurate Measurements on Polymer Parts

    DEFF Research Database (Denmark)

    Mohammadi, Ali; Sonne, Mads Rostgaard; Madruga, Daniel González

    2015-01-01

    Measurement conditions and material properties have a significant impact on the dimensions of a part, especially for polymer parts. Temperature variation causes part deformations that increase the uncertainty of the measurement process. Current industrial tolerances of a few micrometres demand highly accurate measurements in non-controlled ambient conditions. Most polymer parts are manufactured by injection moulding and their inspection is carried out after stabilization, around 200 hours. The overall goal of this work is to reach a measurement uncertainty of ±5 μm on polymer products, which…

  19. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
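
    Conditional majorization builds on ordinary majorization of probability vectors, which is easy to test numerically. The helper below is my own illustration of the underlying partial order, not code from the paper:

```python
import numpy as np

def majorizes(x, y, tol: float = 1e-12) -> bool:
    """True if probability vector x majorizes y, i.e. the partial sums of x
    sorted in descending order dominate those of y (totals being equal)."""
    x, y = np.sort(x)[::-1], np.sort(y)[::-1]
    if abs(x.sum() - y.sum()) > tol:
        return False
    return bool(np.all(np.cumsum(x) >= np.cumsum(y) - tol))

uniform = [0.25, 0.25, 0.25, 0.25]
peaked = [0.7, 0.1, 0.1, 0.1]
print(majorizes(peaked, uniform))  # a peaked (less uncertain) distribution
print(majorizes(uniform, peaked))  # majorizes the uniform one, not vice versa
```

    Monotones in the paper's sense are exactly the functions that respect this order; the uniform distribution, being majorized by every other distribution, represents maximal uncertainty.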

  20. The students' ability in the mathematical literacy for uncertainty problems on the PISA adaptation test

    Science.gov (United States)

    Julie, Hongki; Sanjaya, Febi; Anggoro, Ant. Yudhi

    2017-08-01

    One of the purposes of this study was to describe the solution profiles of junior high school students on a PISA adaptation test. The procedure followed by the researchers was to (1) adapt the PISA test, (2) validate the adapted PISA test, (3) have junior high school students take the adapted PISA test, and (4) build the students' solution profiles. PISA mathematics problems can be classified into four areas, namely quantity, space and shape, change and relationship, and uncertainty. The results presented in this paper concern the uncertainty problems. The adapted PISA test contained fifteen questions. The subjects in this study were 18 students from 11 junior high schools in Yogyakarta, Central Java, and Banten. The research was qualitative. For the first uncertainty problem in the adapted test, 66.67% of students reached level 3. For the second uncertainty problem, 44.44% of students achieved level 4, and 33.33% of students reached level 3. For the third uncertainty problem, 38.89% of students achieved level 5, 11.11% of students reached level 4, and 5.56% of students achieved level 3. For part a of the fourth uncertainty problem, 72.22% of students reached level 4, and for part b of the fourth uncertainty problem, 83.33% of students achieved level 4.

  1. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    Science.gov (United States)

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
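
    The contrast between sampling-based and interval approaches surveyed above can be made concrete on a toy exposure model; the model and all ranges below are made-up illustrations, not values from the review:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical DBP exposure model: dose = C * IR / BW (illustrative only)
c_lo, c_hi = 0.02, 0.08      # trihalomethane concentration, mg/L
ir_lo, ir_hi = 1.0, 3.0      # water intake, L/day
bw_lo, bw_hi = 50.0, 90.0    # body weight, kg

# Interval analysis: propagate the hard bounds directly (worst/best case)
dose_lo = c_lo * ir_lo / bw_hi
dose_hi = c_hi * ir_hi / bw_lo

# Monte Carlo with uniform distributions over the same ranges
n = 100_000
dose = (rng.uniform(c_lo, c_hi, n) * rng.uniform(ir_lo, ir_hi, n)
        / rng.uniform(bw_lo, bw_hi, n))
p05, p95 = np.percentile(dose, [5, 95])

print(f"interval bounds: [{dose_lo:.2e}, {dose_hi:.2e}] mg/kg-day")
print(f"MC 90% interval: [{p05:.2e}, {p95:.2e}] mg/kg-day")
```

    The probabilistic interval is necessarily narrower than the hard interval-analysis bounds, which illustrates the trade-off the review discusses: interval methods need no distributional assumptions but can be very conservative.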

  2. Uncertainty modeling in vibration, control and fuzzy analysis of structural systems

    CERN Document Server

    Halder, Achintya; Ayyub, Bilal M

    1997-01-01

    This book gives an overview of the current state of uncertainty modeling in vibration, control, and fuzzy analysis of structural and mechanical systems. It is a coherent compendium written by leading experts and offers the reader a sampling of exciting research areas in several fast-growing branches in this field. Uncertainty modeling and analysis are becoming an integral part of system definition and modeling in many fields. The book consists of ten chapters that report the work of researchers, scientists and engineers on theoretical developments and diversified applications in engineering systems.

  3. THE LIFETIME AND POWERS OF FR IIs IN GALAXY CLUSTERS

    International Nuclear Information System (INIS)

    Antognini, Joe; Bird, Jonathan; Martini, Paul

    2012-01-01

    We have identified and studied a sample of 151 FR IIs found in brightest cluster galaxies (BCGs) in the MaxBCG cluster catalog with data from FIRST and NVSS. We have compared the radio luminosities and projected lengths of these FR IIs to the projected length distribution of a range of mock catalogs generated by an FR II model and estimate the FR II lifetime to be 1.9 × 10^8 yr. The uncertainty in the lifetime calculation is a factor of two, primarily due to uncertainties in the intracluster medium (ICM) density and the FR II axial ratio. We furthermore measure the jet power distribution of FR IIs in BCGs and find that it is well described by a log-normal distribution with a median power of 1.1 × 10^37 W and a coefficient of variation of 2.2. These jet powers are nearly linearly related to the observed luminosities, and this relation is steeper than many other estimates, although it is dependent on the jet model. We investigate correlations between FR II and cluster properties and find that galaxy luminosity is correlated with jet power. This implies that jet power is also correlated with black hole mass, as the stellar luminosity of a BCG should be a good proxy for its spheroid mass and therefore the black hole mass. Jet power, however, is not correlated with cluster richness, nor is FR II lifetime strongly correlated with any cluster properties. We calculate the enthalpy of the lobes to examine the impact of the FR IIs on the ICM and find that heating due to adiabatic expansion is too small to offset radiative cooling by a factor of at least six. In contrast, the jet power is approximately an order of magnitude larger than required to counteract cooling. We conclude that if feedback from FR IIs offsets cooling of the ICM, then heating must be primarily due to another mechanism associated with FR II expansion.
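
    The quoted median and coefficient of variation fully determine a log-normal distribution; assuming the usual parameterization, the log-space scale and the implied mean jet power can be recovered as:

```python
import math

# Summary statistics quoted in the abstract
median = 1.1e37   # W
cv = 2.2          # coefficient of variation (std / mean)

mu = math.log(median)                    # log-space location: median = exp(mu)
sigma = math.sqrt(math.log(1 + cv**2))   # log-space scale from CV = sqrt(e^{sigma^2} - 1)
mean = median * math.exp(sigma**2 / 2)   # implied arithmetic mean jet power

print(f"sigma = {sigma:.3f}, implied mean jet power = {mean:.2e} W")
```

    Because the distribution is strongly skewed (CV > 2), the mean power is more than twice the median, which matters when comparing total jet heating against ICM cooling.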

  4. THE LIFETIME AND POWERS OF FR IIs IN GALAXY CLUSTERS

    Energy Technology Data Exchange (ETDEWEB)

    Antognini, Joe; Bird, Jonathan; Martini, Paul, E-mail: antognini@astronomy.ohio-state.edu, E-mail: bird@astronomy.ohio-state.edu, E-mail: martini@astronomy.ohio-state.edu [Department of Astronomy, Ohio State University, 140 W 18th Avenue, Columbus, OH 43210 (United States)

    2012-09-10

    We have identified and studied a sample of 151 FR IIs found in brightest cluster galaxies (BCGs) in the MaxBCG cluster catalog with data from FIRST and NVSS. We have compared the radio luminosities and projected lengths of these FR IIs to the projected length distribution of a range of mock catalogs generated by an FR II model and estimate the FR II lifetime to be 1.9 × 10^8 yr. The uncertainty in the lifetime calculation is a factor of two, primarily due to uncertainties in the intracluster medium (ICM) density and the FR II axial ratio. We furthermore measure the jet power distribution of FR IIs in BCGs and find that it is well described by a log-normal distribution with a median power of 1.1 × 10^37 W and a coefficient of variation of 2.2. These jet powers are nearly linearly related to the observed luminosities, and this relation is steeper than many other estimates, although it is dependent on the jet model. We investigate correlations between FR II and cluster properties and find that galaxy luminosity is correlated with jet power. This implies that jet power is also correlated with black hole mass, as the stellar luminosity of a BCG should be a good proxy for its spheroid mass and therefore the black hole mass. Jet power, however, is not correlated with cluster richness, nor is FR II lifetime strongly correlated with any cluster properties. We calculate the enthalpy of the lobes to examine the impact of the FR IIs on the ICM and find that heating due to adiabatic expansion is too small to offset radiative cooling by a factor of at least six. In contrast, the jet power is approximately an order of magnitude larger than required to counteract cooling. We conclude that if feedback from FR IIs offsets cooling of the ICM, then heating must be primarily due to another mechanism associated with FR II expansion.

  5. The Search for Another Earth–Part II

    Indian Academy of Sciences (India)

    Permanent link: https://www.ias.ac.in/article/fulltext/reso/021/10/0899-0910. Keywords. Exoplanets, earth, super-earth, diamond planet, neptune, habitability, extra-terrestrial life. Abstract. In the first part, we discussed the various methods for the detection of planets outside the solar system, known as the exoplanets. In this part ...

  6. A model of mechanical contacts in hearing aids for uncertainty analysis

    DEFF Research Database (Denmark)

    Creixell Mediante, Ester; Brunskog, Jonas; Jensen, Jakob Søndergaard

    2015-01-01

    Modelling the contact between assembled parts is a key point in the design of complex structures. Uncertainties in the joint parameters arise as a result of randomness in physical properties such as contact surface, load distribution or geometric details. This is a challenge of concern in the hearing aid field, where the small lightweight structures present vibration modes at frequencies within the hearing range. To approach this issue, a model of contacts based on lumped elements is suggested. The joint parameters are the stiffness of a series of spring elements placed along the contact…

  7. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    Full Text Available A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  8. Statistical analysis of uncertainties of gamma-peak identification and area calculation in particulate air-filter environment radionuclide measurements using the results of a Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) organized intercomparison, Part I: Assessment of reliability and uncertainties of isotope detection and energy precision using artificial spiked test spectra, Part II: Assessment of the true type I error rate and the quality of peak area estimators in relation to type II errors using large numbers of natural spectra

    International Nuclear Information System (INIS)

    Zhang, W.; Zaehringer, M.; Ungar, K.; Hoffman, I.

    2008-01-01

    In this paper, the uncertainties of gamma-ray small peak analysis have been examined. As the intensity of a gamma-ray peak approaches its detection decision limit, derived parameters such as centroid channel energy, peak area, peak area uncertainty, baseline determination, and peak significance are statistically sensitive. The intercomparison exercise organized by the CTBTO provided an excellent opportunity for this to be studied. Near background levels, the false-positive and false-negative peak identification frequencies in artificial test spectra have been compared to statistically predictable limiting values. In addition, naturally occurring radon progeny were used to compare observed variance against nominal uncertainties. The results infer that the applied fit algorithms do not always represent the best estimator. Understanding the statistically predicted peak-finding limit is important for data evaluation and analysis assessment. Furthermore, these results are useful to optimize analytical procedures to achieve the best results
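
    Decision thresholds for small-peak detection of the kind examined here are commonly set with Currie's limits; the sketch below uses the standard formulae for a paired background measurement, as an illustration rather than the CTBTO pipeline's actual algorithm:

```python
import math

def currie_limits(background_counts: float, k: float = 1.645):
    """Currie's critical level L_C and a priori detection limit L_D (counts)
    for a paired background measurement at ~95% confidence (k = 1.645)."""
    l_c = k * math.sqrt(2 * background_counts)   # decide "peak present" above this
    l_d = k**2 + 2 * l_c                          # net signal detectable with ~95% power
    return l_c, l_d

l_c, l_d = currie_limits(100.0)   # e.g. 100 background counts under the peak region
print(f"L_C = {l_c:.1f} counts, L_D = {l_d:.1f} counts")
```

    L_C controls the false-positive (type I) rate when deciding whether a peak exists, while L_D bounds the false-negative (type II) rate, the two error types the intercomparison assessed.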

  9. Uncertainty, probability and information-gaps

    International Nuclear Information System (INIS)

    Ben-Haim, Yakov

    2004-01-01

    This paper discusses two main ideas. First, we focus on info-gap uncertainty, as distinct from probability. Info-gap theory is especially suited for modelling and managing uncertainty in system models: we invest all our knowledge in formulating the best possible model; this leaves the modeller with very faulty and fragmentary information about the variation of reality around that optimal model. Second, we examine the interdependence between uncertainty modelling and decision-making. Good uncertainty modelling requires contact with the end-use, namely, with the decision-making application of the uncertainty model. The most important avenue of uncertainty-propagation is from initial data- and model-uncertainties into uncertainty in the decision-domain. Two questions arise. Is the decision robust to the initial uncertainties? Is the decision prone to opportune windfall success? We apply info-gap robustness and opportunity functions to the analysis of representation and propagation of uncertainty in several of the Sandia Challenge Problems
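
    An info-gap robustness function can be made concrete with a one-parameter example; the capacity/demand setting below is my own toy illustration of the idea, not one of the Sandia Challenge Problems:

```python
def robustness(capacity: float, estimate: float) -> float:
    """Info-gap robustness of the decision 'provide this capacity' when the
    true demand u is only known to lie within a fractional horizon h of the
    best estimate: |u - u_est| <= h * u_est.  The robustness is the largest
    h for which the requirement u <= capacity still holds in the worst case."""
    if capacity < estimate:
        return 0.0            # fails even at the best-estimate demand
    return (capacity - estimate) / estimate

print(robustness(130.0, 100.0))  # the decision survives up to 30% demand surprise
```

    A decision with larger robustness tolerates a larger info-gap around the optimal model, which is exactly the trade-off the paper analyzes (alongside the dual opportunity function for windfall outcomes).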

  10. A Bayesian foundation for individual learning under uncertainty

    Directory of Open Access Journals (Sweden)

    Christoph eMathys

    2011-05-01

    Full Text Available Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next higher level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory.

  11. A bayesian foundation for individual learning under uncertainty.

    Science.gov (United States)

    Mathys, Christoph; Daunizeau, Jean; Friston, Karl J; Stephan, Klaas E

    2011-01-01

    Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next highest level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean-field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory.
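
    The precision-weighted prediction-error updates that such hierarchical frameworks generalize can be illustrated with a single-level Gaussian special case (a Kalman-filter-like update of my own construction, not the authors' full hierarchical filter):

```python
# Single-level precision-weighted learning: the posterior mean moves toward
# each observation by a learning rate set by the relative precisions,
#   mu <- mu + (pi_obs / (pi_prior + pi_obs)) * (obs - mu)

def update(mu: float, pi_prior: float, obs: float, pi_obs: float):
    pi_post = pi_prior + pi_obs        # precisions add under Gaussian updating
    lr = pi_obs / pi_post              # precision-weighted learning rate
    return mu + lr * (obs - mu), pi_post

mu, pi = 0.0, 1.0                      # vague prior belief
for obs in [1.0, 1.2, 0.8, 1.1]:       # a short, made-up observation sequence
    mu, pi = update(mu, pi, obs, pi_obs=4.0)
print(round(mu, 3), pi)
```

    As precision accumulates across trials the learning rate shrinks, which is the Bayesian counterpart of the decaying learning rates used in RL models.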

  12. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    Science.gov (United States)

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to accounting for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations, such as when a source of uncertainty is difficult to parameterize, when resources are limited for an ideal exploration of uncertainty, or when evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
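    The parameter-uncertainty part of such an analysis is commonly handled as a probabilistic sensitivity analysis: draw the uncertain parameters from their distributions, re-run the model, and summarize the resulting output distribution. A minimal sketch with a hypothetical toy model and purely illustrative distributions (not taken from the guide itself):

```python
# Probabilistic sensitivity analysis (parameter uncertainty) sketch.
# The model, distributions, and willingness-to-pay value are illustrative.
import random

random.seed(1)

def incremental_net_benefit(eff_gain, cost_incr, wtp=20_000):
    # Net monetary benefit of the new strategy vs. the comparator.
    return wtp * eff_gain - cost_incr

samples = []
for _ in range(10_000):
    eff_gain = random.gauss(0.05, 0.02)    # QALYs gained (uncertain)
    cost_incr = random.gauss(600.0, 150.0) # extra cost (uncertain)
    samples.append(incremental_net_benefit(eff_gain, cost_incr))

# Probability that the strategy is cost-effective at the chosen threshold.
prob_cost_effective = sum(s > 0 for s in samples) / len(samples)
```

Methodological and structural uncertainty are then typically explored on top of this, e.g. by repeating the loop under alternative model structures or analytic assumptions.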

  13. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are based either on a Doppler or on a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved, or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potential regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
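    The scaling behaviour stated above can be summarized compactly (the notation is assumed here, not taken from the paper):

```latex
% Fundamental (Heisenberg) limit of the velocity measurement uncertainty:
% it grows with the absolute velocity to the power 3/2 and falls with the
% square root of the scattered light power P_s.
\sigma_v \;\propto\; \frac{|v|^{3/2}}{\sqrt{P_s}}
```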

  14. Reproduce and die! Why aging? Part II

    NARCIS (Netherlands)

    Schuiling, GA

    Whilst part I of this diptych on aging discussed the question of why aging exists at all, this part deals with the question of which mechanisms underlie aging and, ultimately, dying. It appears that aging is not just an active process as such - although all kinds of internal (e.g., oxygen-free

  15. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part II: Data Sources from Specific Library Applications

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    This is the second part of a two-part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.

  16. Some concepts of model uncertainty for performance assessments of nuclear waste repositories

    International Nuclear Information System (INIS)

    Eisenberg, N.A.; Sagar, B.; Wittmeyer, G.W.

    1994-01-01

    Models of the performance of nuclear waste repositories will be central to making regulatory decisions regarding the safety of such facilities. The conceptual model of repository performance is represented by mathematical relationships, which are usually implemented as one or more computer codes. A geologic system may allow many conceptual models that are consistent with the observations. These conceptual models may or may not have the same mathematical representation. Experience in modeling the performance of a waste repository (which is, in part, a geologic system) shows that this non-uniqueness of conceptual models is a significant source of model uncertainty. At the same time, each conceptual model has its own set of parameters and, usually, it is not possible to completely separate model uncertainty from parameter uncertainty for the repository system. Issues related to the origin of model uncertainty, its relation to parameter uncertainty, and its incorporation in safety assessments are discussed from a broad regulatory perspective. An extended example in which these issues are explored numerically is also provided.

  17. Developing an Online Framework for Publication of Uncertainty Information in Hydrological Modeling

    Science.gov (United States)

    Etienne, E.; Piasecki, M.

    2012-12-01

    Inaccuracies in data collection and parameter estimation, and imperfections in model structure, imply that the predictions of hydrological models are uncertain. Finding a way to communicate the uncertainty information in a model output is important in decision-making. This work aims to publish uncertainty information (computed by a project partner at Penn State) associated with hydrological predictions on catchments. To this end we have developed a DB schema (derived from the CUAHSI ODM design) which is focused on storing uncertainty information and its associated metadata. The technologies used to build the system are: OGC's Sensor Observation Service (SOS) for publication, the UncertML markup language (also developed by the OGC) to describe uncertainty information, and the Interoperability and Automated Mapping (INTAMAP) Web Processing Service (WPS), which handles part of the statistics computations. We have developed a service (based on Drupal) that provides users with the capability to exploit all the functionality of the system. Users will be able to request and visualize uncertainty data, and also publish their own data in the system.

  18. The management of uncertainties in radiological data

    International Nuclear Information System (INIS)

    Funtowicz, S.O.

    1989-01-01

    A prototype framework for representing uncertainties in radiological data is introduced. Through this framework, inherent variability in the quality of radiological data can be managed and communicated efficiently, systematically and consistently. Codings derived from the framework are applicable to radiological data irrespective of the source from which the data are obtained, and irrespective of the context in which they may be used. The coding, in effect, can itself become part of a radiological data base. (author)

  19. Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow

    Science.gov (United States)

    Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca

    2017-11-01

    The performance characterization of complex engineering systems often relies on accurate, but computationally intensive, numerical simulations. It is also well recognized that, in order to obtain a reliable numerical prediction, the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite great improvement in recent years, even the more advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between low-fidelity and high-fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.

  20. Complexometric determination, Part II: Complexometric determination of Cu2+-ions

    Directory of Open Access Journals (Sweden)

    Rajković Miloš B.

    2002-01-01

    A copper-selective electrode of the coated-wire type, based on sulphidized copper wire, was applied successfully for determining Cu(II) ions by complexometric titration with the disodium salt of EDTA (complexone III). Through the formation of internal complex compounds with the Cu(II) ion, the copper concentration in the solution decreases, and this is followed by a change of potential of the indicator system Cu-DWISE (or Cu-EDWISE)/SCE. At the end point of the titration, when all the Cu(II) ions have been consumed in forming the complex with EDTA, a steep rise of potential occurs, enabling us, through the first or second derivative, to determine the quantity of copper present in the solution. The copper-selective electrode responded well in titrations with EDTA as the complexing agent, showing no "fatigue" over a great number of repeated measurements. Errors occurring during the quantitative measurements were largely a characteristic of the overall procedure, which involves a constant error because subjectivity cannot be completely excluded; the reproducibility of the results confirmed this. The disodium salt of EDTA proved a very efficient titrant in all titrations and at various concentrations of Cu(II) ions in the solution, with a somewhat weaker response at lower concentrations.
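    The derivative-based end-point detection described above can be sketched numerically (the function name and the synthetic titration data below are illustrative, not from the paper):

```python
# Locating a potentiometric titration end point from (volume, potential)
# data via the first derivative dE/dV: the interval with the steepest
# potential rise marks the end point.

def endpoint_by_first_derivative(volumes, potentials):
    """Return the midpoint volume of the interval with the largest dE/dV."""
    best_i, best_slope = 0, float("-inf")
    for i in range(len(volumes) - 1):
        slope = (potentials[i + 1] - potentials[i]) / (volumes[i + 1] - volumes[i])
        if slope > best_slope:
            best_i, best_slope = i, slope
    return (volumes[best_i] + volumes[best_i + 1]) / 2

# Synthetic titration curve: steep potential rise near 10.0 mL.
V = [9.0, 9.5, 9.8, 9.9, 10.0, 10.1, 10.2, 10.5]   # titrant volume, mL
E = [120, 125, 132, 140, 230, 310, 318, 322]       # electrode potential, mV

endpoint = endpoint_by_first_derivative(V, E)      # → 9.95 (mL)
```

A second-derivative variant would instead look for the zero crossing of d²E/dV², which falls in the same interval.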

  1. A Variation on Uncertainty Principle and Logarithmic Uncertainty Principle for Continuous Quaternion Wavelet Transforms

    Directory of Open Access Journals (Sweden)

    Mawardi Bahri

    2017-01-01

    The continuous quaternion wavelet transform (CQWT) is a generalization of the classical continuous wavelet transform within the context of quaternion algebra. First, we show that the directional quaternion Fourier transform (QFT) uncertainty principle can be obtained from the component-wise QFT uncertainty principle. Based on this method, the directional QFT uncertainty principle in its polar-coordinate representation is easily derived. We derive a variation on the uncertainty principle related to the QFT. We then show that the CQWT of a quaternion function can be written in terms of the QFT, and obtain a variation on the uncertainty principle related to the CQWT. Finally, we apply the extended uncertainty principles and properties of the CQWT to establish logarithmic uncertainty principles related to the generalized transform.

  2. Pre-WIPP in-situ experiments in salt. Part I. Executive summary. Part II. Program description

    Energy Technology Data Exchange (ETDEWEB)

    Sattler, A.R.; Hunter, T.O.

    1979-08-01

    This document presents plans for in-situ experiments at a specific location in southeastern New Mexico. The schedule and facility design were based on features of a representative local potash mine and on contract negotiations with the mine owners. Subsequent WIPP program uncertainties have required a delay in the implementation of the activities discussed here; however, the relative schedules for the various activities are appropriate for future planning. The document represents a matrix of in-situ activities to address relevant technical issues prior to the availability of a bedded salt repository.

  3. Pre-WIPP in-situ experiments in salt. Part I. Executive summary. Part II. Program description

    International Nuclear Information System (INIS)

    Sattler, A.R.; Hunter, T.O.

    1979-08-01

    This document presents plans for in-situ experiments at a specific location in southeastern New Mexico. The schedule and facility design were based on features of a representative local potash mine and on contract negotiations with the mine owners. Subsequent WIPP program uncertainties have required a delay in the implementation of the activities discussed here; however, the relative schedules for the various activities are appropriate for future planning. The document represents a matrix of in-situ activities to address relevant technical issues prior to the availability of a bedded salt repository.

  4. On the relationship between aerosol model uncertainty and radiative forcing uncertainty.

    Science.gov (United States)

    Lee, Lindsay A; Reddington, Carly L; Carslaw, Kenneth S

    2016-05-24

    The largest uncertainty in the historical radiative forcing of climate is caused by the interaction of aerosols with clouds. Historical forcing is not a directly measurable quantity, so reliable assessments depend on the development of global models of aerosols and clouds that are well constrained by observations. However, there has been no systematic assessment of how reduction in the uncertainty of global aerosol models will feed through to the uncertainty in the predicted forcing. We use a global model perturbed parameter ensemble to show that tight observational constraint of aerosol concentrations in the model has a relatively small effect on the aerosol-related uncertainty in the calculated forcing between preindustrial and present-day periods. One factor is the low sensitivity of present-day aerosol to natural emissions that determine the preindustrial aerosol state. However, the major cause of the weak constraint is that the full uncertainty space of the model generates a large number of model variants that are equally acceptable compared to present-day aerosol observations. The narrow range of aerosol concentrations in the observationally constrained model gives the impression of low aerosol model uncertainty. However, these multiple "equifinal" models predict a wide range of forcings. To make progress, we need to develop a much deeper understanding of model uncertainty and ways to use observations to constrain it. Equifinality in the aerosol model means that tuning of a small number of model processes to achieve model-observation agreement could give a misleading impression of model robustness.

  5. Resolving astrophysical uncertainties in dark matter direct detection

    CERN Document Server

    Frandsen, Mads T; McCabe, Christopher; Sarkar, Subir; Schmidt-Hoberg, Kai

    2012-01-01

    We study the impact of the assumed velocity distribution of galactic dark matter particles on the interpretation of results from nuclear recoil detectors. By converting experimental data to variables that make the astrophysical unknowns explicit, different experiments can be compared without implicit assumptions concerning the dark matter halo. We extend this framework to include the annual modulation signal, as well as multiple target elements. Recent results from DAMA, CoGeNT and CRESST-II can be brought into agreement if the velocity distribution is very anisotropic and thus allows a large modulation fraction. However, constraints from CDMS and XENON cannot be evaded by appealing to such astrophysical uncertainties alone.

  6. Decision-making under great uncertainty

    International Nuclear Information System (INIS)

    Hansson, S.O.

    1992-01-01

    Five types of decision-uncertainty are distinguished: uncertainty of consequences, of values, of demarcation, of reliance, and of co-ordination. Strategies are proposed for each type of uncertainty. The general conclusion is that it is meaningful for decision theory to treat cases with greater uncertainty than the textbook case of 'decision-making under uncertainty'. (au)

  7. Alignment measurements uncertainties for large assemblies using probabilistic analysis techniques

    CERN Document Server

    AUTHOR|(CDS)2090816; Almond, Heather

    Big science and ambitious industrial projects continually push technical requirements beyond the grasp of conventional engineering techniques. Examples are the ultra-high precision requirements in the fields of celestial telescopes, particle accelerators and the aerospace industry. Such extreme requirements are limited largely by the capability of the metrology used, namely its uncertainty in relation to the alignment tolerance required. The current work was initiated as part of a Marie Curie European research project held at CERN, Geneva, aiming to answer those challenges as related to future accelerators requiring alignment of 2 m large assemblies to tolerances in the 10 µm range. The thesis has found several gaps in current knowledge limiting such capability. Among those was the lack of application of state-of-the-art uncertainty propagation methods in alignment measurement metrology. Another major limiting factor found was the lack of uncertainty statements in the thermal errors compensatio...

  8. Sensitivity and uncertainty studies of the CRAC2 computer code

    International Nuclear Information System (INIS)

    Kocher, D.C.; Ward, R.C.; Killough, G.G.; Dunning, D.E. Jr.; Hicks, B.B.; Hosker, R.P. Jr.; Ku, J.Y.; Rao, K.S.

    1985-05-01

    This report presents a study of the sensitivity of early fatalities, early injuries, latent cancer fatalities, and economic costs for hypothetical nuclear reactor accidents as predicted by the CRAC2 computer code (CRAC = Calculation of Reactor Accident Consequences) to uncertainties in selected models and parameters used in the code. The sources of uncertainty that were investigated in the CRAC2 sensitivity studies include (1) the model for plume rise, (2) the model for wet deposition, (3) the procedure for meteorological bin-sampling involving the selection of weather sequences that contain rain, (4) the dose conversion factors for inhalation as they are affected by uncertainties in the physical and chemical form of the released radionuclides, (5) the weathering half-time for external ground-surface exposure, and (6) the transfer coefficients for estimating exposures via terrestrial foodchain pathways. The sensitivity studies were performed for selected radionuclide releases, hourly meteorological data, land-use data, a fixed non-uniform population distribution, a single evacuation model, and various release heights and sensible heat rates. Two important general conclusions from the sensitivity and uncertainty studies are as follows: (1) The large effects on predicted early fatalities and early injuries that were observed in some of the sensitivity studies apparently are due in part to the presence of thresholds in the dose-response models. Thus, the observed sensitivities depend in part on the magnitude of the radionuclide releases. (2) Some of the effects on predicted early fatalities and early injuries that were observed in the sensitivity studies were comparable to effects that were due only to the selection of different sets of weather sequences in bin-sampling runs. 47 figs., 50 tabs

  9. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  10. Integrating uncertainty propagation in GNSS radio occultation retrieval: from excess phase to atmospheric bending angle profiles

    Science.gov (United States)

    Schwarz, Jakob; Kirchengast, Gottfried; Schwaerz, Marc

    2018-05-01

    with the other parts of the rOPS processing chain this part is thus ready to provide integrated uncertainty propagation through the whole RO retrieval chain for the benefit of climate monitoring and other applications.

  11. Application of a Novel Dose-Uncertainty Model for Dose-Uncertainty Analysis in Prostate Intensity-Modulated Radiotherapy

    International Nuclear Information System (INIS)

    Jin Hosang; Palta, Jatinder R.; Kim, You-Hyun; Kim, Siyong

    2010-01-01

    Purpose: To analyze dose uncertainty using a previously published dose-uncertainty model, and to assess potential dosimetric risks existing in prostate intensity-modulated radiotherapy (IMRT). Methods and Materials: The dose-uncertainty model provides a three-dimensional (3D) dose-uncertainty distribution at a given confidence level. For 8 retrospectively selected patients, dose-uncertainty maps were constructed using the dose-uncertainty model at the 95% confidence level. In addition to uncertainties inherent to the radiation treatment planning system, four scenarios of spatial errors were considered: machine only (S1), S1 + intrafraction, S1 + interfraction, and S1 + both intrafraction and interfraction errors. To evaluate the potential risks of the IMRT plans, three dose-uncertainty-based plan evaluation tools were introduced: the confidence-weighted dose-volume histogram, the confidence-weighted dose distribution, and the dose-uncertainty-volume histogram. Results: Dose uncertainty caused by interfraction setup error was more significant than that caused by intrafraction motion error. The maximum dose uncertainty (95% confidence) of the clinical target volume (CTV) was smaller than 5% of the prescribed dose in all but two cases (13.9% and 10.2%). The dose uncertainty for 95% of the CTV volume ranged from 1.3% to 2.9% of the prescribed dose. Conclusions: The dose uncertainty in prostate IMRT could be evaluated using the dose-uncertainty model. Prostate IMRT plans satisfying the same plan objectives could generate significantly different dose uncertainties because of a complex interplay of many uncertainty sources. The uncertainty-based plan evaluation contributes to generating reliable and error-resistant treatment plans.

  12. A Conversation with William A. Fowler Part II

    Science.gov (United States)

    Greenberg, John

    2005-06-01

    Physicist William A. Fowler initiated an experimental program in nuclear astrophysics after World War II. He recalls here the Steady State versus Big Bang controversy and his celebrated collaboration with Fred Hoyle and Geoffrey and Margaret Burbidge on nucleosynthesis in stars. He also comments on the shift away from nuclear physics in universities to large accelerators and national laboratories.

  13. Ruminations On NDA Measurement Uncertainty Compared TO DA Uncertainty

    International Nuclear Information System (INIS)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-01-01

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and of how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice, and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  14. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and of how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice, and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  15. Application of probabilistic modelling for the uncertainty evaluation of alignment measurements of large accelerator magnets assemblies

    Science.gov (United States)

    Doytchinov, I.; Tonnellier, X.; Shore, P.; Nicquevert, B.; Modena, M.; Mainaud Durand, H.

    2018-05-01

    Micrometric assembly and alignment requirements for future particle accelerators, and especially large assemblies, create the need for accurate uncertainty budgeting of alignment measurements. Measurements and uncertainties have to be accurately stated and traceable, to international standards, for metre-long sized assemblies, in the range of tens of µm. Indeed, these hundreds of assemblies will be produced and measured by several suppliers around the world, and will have to be integrated into a single machine. As part of the PACMAN project at CERN, we proposed and studied a practical application of probabilistic modelling of task-specific alignment uncertainty by applying a simulation by constraints calibration method. Using this method, we calibrated our measurement model using available data from ISO standardised tests (10360 series) for the metrology equipment. We combined this model with reference measurements and analysis of the measured data to quantify the actual specific uncertainty of each alignment measurement procedure. Our methodology was successfully validated against a calibrated and traceable 3D artefact as part of an international inter-laboratory study. The validated models were used to study the expected alignment uncertainty and important sensitivity factors in measuring the shortest and longest of the compact linear collider study assemblies, 0.54 m and 2.1 m respectively. In both cases, the laboratory alignment uncertainty was within the targeted uncertainty budget of 12 µm (68% confidence level). It was found that the remaining uncertainty budget for any additional alignment error compensations, such as the thermal drift error due to variation in machine operation heat load conditions, must be within 8.9 µm and 9.8 µm (68% confidence level) respectively.

  16. Uncertainty Quantification for Complex RF-structures Using the State-space Concatenation Approach

    CERN Document Server

    Heller, Johann; Schmidt, Christian; Van Rienen, Ursula

    2015-01-01

    as well as to employ robust optimizations, a so-called uncertainty quantification (UQ) is applied. For large and complex structures such computations are heavily demanding and cannot be carried out using standard brute-force approaches. In this paper, we propose a combination of established techniques to perform UQ for long and complex structures, where the uncertainty is located only in parts of the structure. As exemplary structure, we investigate the third-harmonic cavity, which is being used at the FLASH accelerator at DESY, assuming an uncertain...

  17. Working fluid selection for organic Rankine cycles - Impact of uncertainty of fluid properties

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Andreasen, Jesper Graa; Liu, Wei

    2016-01-01

    This study presents a generic methodology to select working fluids for ORC (organic Rankine cycles), taking into account the property uncertainties of the working fluids. A Monte Carlo procedure is described as a tool to propagate the influence of the input uncertainty of the fluid parameters onto the ORC model output, and provides the 95%-confidence interval of the net power output with respect to the fluid property uncertainties. The methodology has been applied to a molecular design problem for an ORC using a low-temperature heat source and consisted of the following four parts: 1) formulation of process models and constraints; 2) selection of property models, i.e. the Peng-Robinson equation of state; 3) screening of 1965 possible working fluid candidates, including identification of optimal process parameters based on Monte Carlo sampling; 4) propagating the uncertainty of fluid parameters to the ORC net power output...
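    The Monte Carlo propagation step described above can be sketched as follows (the stand-in power model, parameter values, and spreads below are purely illustrative assumptions, not the study's process model or property data):

```python
# Monte Carlo propagation of fluid-property uncertainty to a model output:
# perturb the uncertain properties, re-evaluate the model, and take the
# 2.5%/97.5% sample quantiles as a 95% confidence interval.
import random

random.seed(0)

def net_power(t_crit, p_crit):
    """Hypothetical stand-in for the ORC process model (kW)."""
    return 100.0 * (t_crit / 400.0) * (30.0 / p_crit)

outputs = []
for _ in range(5_000):
    t_crit = random.gauss(400.0, 8.0)   # critical temperature [K], ±2% (assumed)
    p_crit = random.gauss(30.0, 1.5)    # critical pressure [bar], ±5% (assumed)
    outputs.append(net_power(t_crit, p_crit))

outputs.sort()
ci_low = outputs[int(0.025 * len(outputs))]
ci_high = outputs[int(0.975 * len(outputs))]
```

Fluid candidates whose confidence intervals overlap can then be treated as statistically indistinguishable in the ranking, which is the point of carrying the property uncertainty through to the net power output.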

  18. Interview-Based Qualitative Research in Emergency Care Part II: Data Collection, Analysis and Results Reporting

    Science.gov (United States)

    Ranney, Megan L.; Meisel, Zachary; Choo, Esther K.; Garro, Aris; Sasson, Comilla; Morrow, Kathleen

    2015-01-01

    Qualitative methods are increasingly being used in emergency care research. Rigorous qualitative methods can play a critical role in advancing the emergency care research agenda by allowing investigators to generate hypotheses, gain an in-depth understanding of health problems or specific populations, create expert consensus, and develop new intervention and dissemination strategies. In Part I of this two-article series, we provided an introduction to general principles of applied qualitative health research and examples of its common use in emergency care research, describing study designs and data collection methods most relevant to our field (observation, individual interviews, and focus groups). Here in Part II of this series, we outline the specific steps necessary to conduct a valid and reliable qualitative research project, with a focus on interview-based studies. These elements include building the research team, preparing data collection guides, defining and obtaining an adequate sample, collecting and organizing qualitative data, and coding and analyzing the data. We also discuss potential ethical considerations unique to qualitative research as it relates to emergency care research. PMID:26284572

  20. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    Energy Technology Data Exchange (ETDEWEB)

    Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States); Skifton, Richard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Stoots, Carl [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Eung Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Conder, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-12-01

    Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high-quality multidimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, along with a computer code that automates that analysis. The main objective of this study is to develop a well-established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. In this study, the uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Then, each uncertainty source is mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility, with sample test results.
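
    The displacement-from-correlation-peak step described in this abstract can be illustrated with a minimal 1-D sketch. Real PIV correlates 2-D interrogation windows and adds sub-pixel peak fitting, neither of which is shown here; the signals are invented toy data:

```python
def cross_correlate(a, b):
    """Full discrete cross-correlation of two equal-length 1-D signals.

    A lag d means the pattern in `b` sits d samples to the right of where
    it sits in `a`.
    """
    n = len(a)
    lags = list(range(-(n - 1), n))
    values = []
    for d in lags:
        s = 0.0
        for i in range(n):
            j = i + d
            if 0 <= j < n:
                s += a[i] * b[j]
        values.append(s)
    return lags, values

def estimate_displacement(a, b):
    """Displacement = lag of the largest correlation peak, as in PIV."""
    lags, values = cross_correlate(a, b)
    return lags[max(range(len(values)), key=values.__getitem__)]

# Toy "interrogation windows": the particle pattern moves 3 samples right
# between the two exposures.
frame1 = [0, 0, 1, 3, 1, 0, 0, 0, 0, 0]
frame2 = [0, 0, 0, 0, 0, 1, 3, 1, 0, 0]
disp = estimate_displacement(frame1, frame2)  # -> 3
```

    The same peak-location logic, applied per interrogation window in two dimensions and divided by the inter-frame time, yields the velocity field whose uncertainty the study sets out to quantify.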