WorldWideScience

Sample records for internally driven uncertainty

  1. Agriculture-driven deforestation in the tropics from 1990-2015: emissions, trends and uncertainties

    Science.gov (United States)

    Carter, Sarah; Herold, Martin; Avitabile, Valerio; de Bruin, Sytze; De Sy, Veronique; Kooistra, Lammert; Rufino, Mariana C.

    2018-01-01

Limited data exist on emissions from agriculture-driven deforestation, and the available data are typically uncertain. In this paper, we provide comparable estimates of emissions from both all deforestation and agriculture-driven deforestation, with uncertainties, for 91 countries across the tropics between 1990 and 2015. Uncertainties associated with the input datasets (activity data and emission factors) were used to combine the datasets, so that the most certain datasets contribute the most. This method utilizes all the input data while minimizing the uncertainty of the emissions estimate. The uncertainty of the input datasets was influenced by the quality of the data, the sample size (for sample-based datasets), and the extent to which the timeframe of the data matches the period of interest. The area of deforestation and the agriculture-driver factor (the extent to which agriculture drives deforestation) were the most uncertain components of the emissions estimates, so improvements in the uncertainties of these estimates will provide the greatest reductions in the uncertainties of the emissions estimates. Over the period of the study, Latin America had the highest proportion of deforestation driven by agriculture (78%), and Africa had the lowest (62%). Latin America had the highest emissions from agriculture-driven deforestation, and these peaked at 974 ± 148 Mt CO2 yr-1 in 2000-2005. Africa saw a continuous increase in emissions between 1990 and 2015 (from 154 ± 21 to 412 ± 75 Mt CO2 yr-1), so mitigation initiatives could be prioritized there. Uncertainties for emissions from agriculture-driven deforestation average ± 62.4% over 1990-2015, and were highest in Asia and lowest in Latin America. Uncertainty information is crucial for transparency when reporting, and gives credibility to related mitigation initiatives. We demonstrate that uncertainty data can also be useful when combining multiple open datasets, so we recommend new data
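    The abstract describes weighting the input datasets so that the most certain contribute the most to the combined emissions estimate. The paper does not spell out its exact formula, so the sketch below uses standard inverse-variance weighting as an illustrative assumption; the function name and the numbers are made up.

    ```python
    import numpy as np

    def inverse_variance_combine(estimates, uncertainties):
        """Combine independent estimates of the same quantity, weighting each
        by the inverse of its variance so the most certain inputs dominate.

        estimates     : emission estimates (e.g. Mt CO2 yr-1)
        uncertainties : 1-sigma uncertainty of each estimate
        Returns the combined estimate and its 1-sigma uncertainty.
        """
        estimates = np.asarray(estimates, dtype=float)
        variances = np.asarray(uncertainties, dtype=float) ** 2
        weights = 1.0 / variances
        combined = np.sum(weights * estimates) / np.sum(weights)
        combined_sigma = np.sqrt(1.0 / np.sum(weights))
        return combined, combined_sigma

    # Illustrative (made-up) emission estimates for one country and period
    est, sigma = inverse_variance_combine([950.0, 1010.0, 890.0], [150.0, 90.0, 200.0])
    print(f"combined: {est:.0f} +/- {sigma:.0f} Mt CO2 yr-1")
    ```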

  2. Coping With Uncertainty in International Business

    OpenAIRE

    Briance Mascarenhas

    1982-01-01

International business, as compared with domestic business, is usually characterized by increased uncertainty. A study of 10 multinational companies uncovered several methods of coping with uncertainty. This paper focuses on two methods which may not be apparent: control and flexibility. A framework of analysis suggesting appropriate methods for coping with uncertainty is also developed. © 1982 JIBS. Journal of International Business Studies (1982) 13, 87–98

  3. A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty

    Science.gov (United States)

    Friedel, Michael J.

    2011-01-01

This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean-squared and unit errors for the evolution of the fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published three-variable multiple linear regression equation, linking basin area with slopes greater than or equal to 30 percent, burn severity characterized as area burned moderate plus high, and total storm rainfall, the data-driven approach discovers many nonlinear and several dimensionally consistent equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and by subsequently applying a gradient solver to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.

  4. Uncertainty assessment for accelerator-driven systems

    International Nuclear Information System (INIS)

    Finck, P. J.; Gomes, I.; Micklich, B.; Palmiotti, G.

    1999-01-01

The concept of a subcritical system driven by an external source of neutrons provided by an accelerator, the Accelerator Driven System (ADS), has recently been revived and is becoming more popular in the world technical community, with active programs in Europe, Russia, Japan, and the U.S. A general consensus has been reached on adopting a fast-spectrum, liquid-metal-cooled configuration for the subcritical component. Lead-bismuth eutectic, sodium, and gas are all being considered as coolants; each has advantages and disadvantages. The major expected advantage is that subcriticality avoids reactivity-induced transients. The potentially large subcriticality margin should also allow the introduction of very significant quantities of waste products (minor actinides and fission products) which negatively impact the safety characteristics of standard cores. In the U.S. these arguments are the basis for the development of Accelerator Transmutation of Waste (ATW), which has significant potential for reducing nuclear waste levels. Up to now, neutronic calculations have not attached uncertainties to the values of the main nuclear integral parameters that characterize the system. Many of these parameters (e.g., the degree of subcriticality) are crucial to demonstrate the validity and feasibility of this concept. In this paper we consider uncertainties related to nuclear data only. The present knowledge of the cross sections of many isotopes that are not usually utilized in existing reactors (such as Bi, Pb-207, Pb-208, and also minor actinides and fission products) suggests that uncertainties in the integral parameters will be significantly larger than for conventional reactor systems, and this raises concerns about the neutronic performance of those systems.
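    The paper propagates nuclear-data uncertainties to the integral parameters of the subcritical system. A common first-order way to do this is the "sandwich rule", var(R) ≈ SᵀCS, with sensitivity coefficients S and a cross-section covariance matrix C; the sketch below illustrates that rule with invented numbers and is not the authors' actual sensitivity or covariance data.

    ```python
    import numpy as np

    # First-order propagation of nuclear-data uncertainty to an integral
    # parameter R (e.g. the multiplication factor of a subcritical core):
    #   var(R) ~= S^T C S
    # S holds relative sensitivities of R to each cross section and C is
    # the relative covariance matrix of those data. Numbers are illustrative.

    S = np.array([0.8, -0.3, 0.15])          # sensitivities to three cross sections
    C = np.array([[0.0025, 0.0005, 0.0],     # relative covariance matrix
                  [0.0005, 0.0100, 0.0],
                  [0.0,    0.0,    0.0400]])

    rel_var = S @ C @ S
    print(f"relative uncertainty on integral parameter: {np.sqrt(rel_var):.2%}")
    ```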

  5. USACM Thematic Workshop On Uncertainty Quantification And Data-Driven Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, James R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    The USACM Thematic Workshop on Uncertainty Quantification and Data-Driven Modeling was held on March 23-24, 2017, in Austin, TX. The organizers of the technical program were James R. Stewart of Sandia National Laboratories and Krishna Garikipati of University of Michigan. The administrative organizer was Ruth Hengst, who serves as Program Coordinator for the USACM. The organization of this workshop was coordinated through the USACM Technical Thrust Area on Uncertainty Quantification and Probabilistic Analysis. The workshop website (http://uqpm2017.usacm.org) includes the presentation agenda as well as links to several of the presentation slides (permission to access the presentations was granted by each of those speakers, respectively). Herein, this final report contains the complete workshop program that includes the presentation agenda, the presentation abstracts, and the list of posters.

  6. Fifth International Conference on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Janszky, J. (Editor); Kim, Y. S. (Editor); Man'ko, V. I. (Editor)

    1998-01-01

The Fifth International Conference on Squeezed States and Uncertainty Relations was held at Balatonfured, Hungary, on 27-31 May 1997. This series was initiated in 1991 at the College Park Campus of the University of Maryland as the Workshop on Squeezed States and Uncertainty Relations. The scientific purpose of this series was to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including quantum optics and foundations of quantum mechanics. Quantum optics will continue playing the pivotal role in the future, but future meetings will include all branches of physics where squeeze transformations are basic. As the meeting attracted more participants and started covering more diversified subjects, the fourth meeting was called an international conference. The Fourth International Conference on Squeezed States and Uncertainty Relations, held in 1995, was hosted by Shanxi University in Taiyuan, China. The fifth meeting of this series, held at Balatonfured, Hungary, was also supported by IUPAP. The Sixth International Conference will be hosted by the University of Naples in 1999; the meeting will take place in Ravello, near Naples.

  7. Essentialist beliefs, sexual identity uncertainty, internalized homonegativity and psychological wellbeing in gay men.

    Science.gov (United States)

    Morandini, James S; Blaszczynski, Alexander; Ross, Michael W; Costa, Daniel S J; Dar-Nimrod, Ilan

    2015-07-01

The present study examined essentialist beliefs about sexual orientation and their implications for sexual identity uncertainty, internalized homonegativity and psychological wellbeing in a sample of gay men. A combination of targeted sampling and snowball strategies was used to recruit 639 gay-identifying men for a cross-sectional online survey. Participants completed a questionnaire assessing sexual orientation beliefs, sexual identity uncertainty, internalized homonegativity, and psychological wellbeing outcomes. Structural equation modeling was used to test whether essentialist beliefs were associated with psychological wellbeing indirectly via their effect on sexual identity uncertainty and internalized homonegativity. A unique pattern of direct and indirect effects was observed in which facets of essentialism predicted sexual identity uncertainty, internalized homonegativity and psychological wellbeing. Of note, viewing sexual orientation as immutable/biologically based and as existing in discrete categories was associated with less sexual identity uncertainty. On the other hand, these beliefs had divergent relationships with internalized homonegativity, with immutability/biological beliefs associated with lower, and discreteness beliefs associated with greater, internalized homonegativity. Of interest, although sexual identity uncertainty was associated with poorer psychological wellbeing via its contribution to internalized homophobia, there was no direct relationship between identity uncertainty and psychological wellbeing. Findings indicate that essentializing sexual orientation has mixed implications for sexual identity uncertainty, internalized homonegativity and wellbeing in gay men. Those undertaking educational and clinical interventions with gay men should be aware of the benefits and caveats of essentialist theories of homosexuality for this population. (c) 2015 APA, all rights reserved.

  8. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  9. Data-Driven Model Uncertainty Estimation in Hydrologic Data Assimilation

    Science.gov (United States)

    Pathiraja, S.; Moradkhani, H.; Marshall, L.; Sharma, A.; Geenens, G.

    2018-02-01

    The increasing availability of earth observations necessitates mathematical methods to optimally combine such data with hydrologic models. Several algorithms exist for such purposes, under the umbrella of data assimilation (DA). However, DA methods are often applied in a suboptimal fashion for complex real-world problems, due largely to several practical implementation issues. One such issue is error characterization, which is known to be critical for a successful assimilation. Mischaracterized errors lead to suboptimal forecasts, and in the worst case, to degraded estimates even compared to the no assimilation case. Model uncertainty characterization has received little attention relative to other aspects of DA science. Traditional methods rely on subjective, ad hoc tuning factors or parametric distribution assumptions that may not always be applicable. We propose a novel data-driven approach (named SDMU) to model uncertainty characterization for DA studies where (1) the system states are partially observed and (2) minimal prior knowledge of the model error processes is available, except that the errors display state dependence. It includes an approach for estimating the uncertainty in hidden model states, with the end goal of improving predictions of observed variables. The SDMU is therefore suited to DA studies where the observed variables are of primary interest. Its efficacy is demonstrated through a synthetic case study with low-dimensional chaotic dynamics and a real hydrologic experiment for one-day-ahead streamflow forecasting. In both experiments, the proposed method leads to substantial improvements in the hidden states and observed system outputs over a standard method involving perturbation with Gaussian noise.

  10. Uncertainties in carbon residence time and NPP-driven carbon uptake in terrestrial ecosystems of the conterminous USA: a Bayesian approach

    Directory of Open Access Journals (Sweden)

    Xuhui Zhou

    2012-10-01

Carbon (C) residence time is one of the key factors that determine the capacity of ecosystem C storage. However, its uncertainties have not been well quantified, especially at regional scales. Assessing uncertainties of C residence time is thus crucial for an improved understanding of terrestrial C sequestration. In this study, Bayesian inversion and the Markov chain Monte Carlo (MCMC) technique were applied to a regional terrestrial ecosystem (TECO-R) model to quantify C residence times and net primary productivity (NPP)-driven ecosystem C uptake and to assess their uncertainties in the conterminous USA. The uncertainty was represented by the coefficient of variation (CV). Thirteen spatially distributed data sets of C pools and fluxes were used to constrain the TECO-R model for each biome (eight biomes in total). Our results showed that estimated ecosystem C residence times ranged from 16.6±1.8 yr (cropland) to 85.9±15.3 yr (evergreen needleleaf forest), with an average of 56.8±8.8 yr in the conterminous USA. The ecosystem C residence times and their CV were spatially heterogeneous and varied with vegetation types and climate conditions. Large uncertainties appeared in the southern and eastern USA. Driven by NPP changes from 1982 to 1998, terrestrial ecosystems in the conterminous USA would absorb 0.20±0.06 Pg C yr−1. Their spatial pattern was closely related to the greenness map in summer, with larger uptake in the central and southeast regions. The lack of data, or timescale mismatches between the available data and the estimated parameters, leads to uncertainties in the estimated C residence times, which together with initial NPP result in uncertainties in the estimated NPP-driven C uptake. The Bayesian approach with MCMC inversion provides an effective tool to estimate spatially distributed C residence times and assess their uncertainties in the conterminous USA.
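    A toy version of the Bayesian MCMC inversion described above can be sketched for a single residence time τ, assuming a one-pool steady state C ≈ NPP × τ and Gaussian observation error; the data values, prior bounds, and one-pool model are illustrative assumptions, not the TECO-R configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy observations: ecosystem C storage (kg C m-2) and NPP (kg C m-2 yr-1)
    c_obs, c_sigma = 12.0, 2.0
    npp = 0.35

    def log_posterior(tau):
        # Uniform prior on residence time tau in [1, 200] yr,
        # Gaussian likelihood for the steady-state prediction C = NPP * tau.
        if not 1.0 < tau < 200.0:
            return -np.inf
        return -0.5 * ((npp * tau - c_obs) / c_sigma) ** 2

    # Metropolis random-walk sampler
    tau, samples = 50.0, []
    lp = log_posterior(tau)
    for _ in range(20000):
        prop = tau + rng.normal(0.0, 5.0)
        lp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            tau, lp = prop, lp_prop
        samples.append(tau)

    post = np.array(samples[5000:])   # discard burn-in
    print(f"tau = {post.mean():.1f} +/- {post.std():.1f} yr (posterior mean, sd)")
    ```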

  11. Investment and uncertainty in the international oil and gas industry

    International Nuclear Information System (INIS)

    Mohn, Klaus; Misund, Baard

    2009-01-01

The standard theory of irreversible investments and real options suggests a negative relation between investment and uncertainty. Richer models with compound option structures allow for a positive relationship. This paper presents a micro-econometric study of corporate investment and uncertainty in a period of market turbulence and restructuring in the international oil and gas industry. Based on data for 115 companies over the period 1992-2005, we estimate four different specifications of the q model of investment, with robust results for the uncertainty variables. The estimated models suggest that macroeconomic uncertainty creates a bottleneck for oil and gas investment and production, whereas industry-specific uncertainty has a stimulating effect. (author)

  12. Uncertainty Driven Action (UDA) model: A foundation for unifying perspectives on design activity

    DEFF Research Database (Denmark)

    Cash, Philip; Kreye, Melanie

    2017-01-01

This paper proposes the Uncertainty Driven Action (UDA) model, which unifies the fragmented literature on design activity. The UDA model conceptualises design activity as a process consisting of three core actions: information action, knowledge-sharing action, and representation action, which are linked via uncertainty perception. The foundations of the UDA model in the design literature are elaborated in terms of the three core actions and their links to designer cognition and behaviour, utilising definitions and concepts from Activity Theory. The practical relevance and theoretical contributions of the UDA model are discussed. This paper contributes to the design literature by offering a comprehensive formalisation of the design activity of individual designers, which connects cognition and action, to provide a foundation for understanding previously disparate descriptions of design activity.

  13. Stakeholder-driven multi-attribute analysis for energy project selection under uncertainty

    International Nuclear Information System (INIS)

    Read, Laura; Madani, Kaveh; Mokhtari, Soroush; Hanks, Catherine

    2017-01-01

In practice, selecting an energy project for development requires balancing criteria and competing stakeholder priorities to identify the best alternative. Energy source selection can be modeled as a multi-criteria decision-making problem to provide quantitative support to reconcile technical, economic, environmental, social, and political factors with respect to the stakeholders' interests. Decision making among these complex interactions should also account for the uncertainty present in the input data. In response, this work develops a stochastic decision analysis framework to evaluate alternatives by involving stakeholders to identify both quantitative and qualitative selection criteria and performance metrics which carry uncertainties. The developed framework is illustrated using a case study from Fairbanks, Alaska, where decision makers and residents must decide on a new source of energy for heating and electricity. We approach this problem with a five-step methodology: (1) engaging experts (role players) to develop criteria of project performance; (2) collecting a range of quantitative and qualitative input information to determine the performance of each proposed solution according to the selected criteria; (3) performing a Monte Carlo analysis to capture uncertainties given in the inputs; (4) applying multi-criteria decision-making, social choice (voting), and fallback bargaining methods to account for three different levels of cooperation among the stakeholders; and (5) computing an aggregate performance index (API) score for each alternative based on its performance across criteria and cooperation levels. API scores communicate relative performance between alternatives. In this way, our methodology maps uncertainty from the input data to reflect risk in the decision and incorporates varying degrees of cooperation into the analysis to identify an optimal and practical alternative. - Highlights: • We develop an applicable stakeholder-driven framework for
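    Steps (3)-(5) of the methodology can be illustrated numerically: sample uncertain criterion scores by Monte Carlo, apply a decision rule per realization, and aggregate into a performance index. The weighted-sum rule, the criterion weights, and all scores below are illustrative assumptions rather than the Fairbanks case-study data, and the simple "share of first-place rankings" index stands in for the paper's API.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Criterion scores for three energy alternatives, given as (mean, sd)
    # over four criteria (cost, emissions, reliability, social acceptance).
    means = np.array([[0.6, 0.7, 0.8, 0.5],
                      [0.8, 0.4, 0.6, 0.7],
                      [0.5, 0.9, 0.5, 0.6]])
    sds = 0.1 * np.ones_like(means)
    weights = np.array([0.4, 0.2, 0.25, 0.15])   # stakeholder criterion weights

    n_mc, wins = 5000, np.zeros(means.shape[0])
    for _ in range(n_mc):
        scores = rng.normal(means, sds)           # one Monte Carlo realization
        weighted = scores @ weights               # weighted-sum decision rule
        wins[np.argmax(weighted)] += 1

    api = wins / n_mc   # share of realizations in which each alternative ranks first
    for i, a in enumerate(api):
        print(f"alternative {i}: aggregate performance index {a:.2f}")
    ```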

  14. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    Science.gov (United States)

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models that explicitly account for the measurement uncertainty of the international standards along with the uncertainty attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements, and discusses multi-laboratory data reduction while accounting for inevitable correlations between the laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
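    Multi-point normalization amounts to fitting a calibration line between measured and assigned delta values of the reference standards while respecting uncertainty on both axes. A minimal errors-in-variables sketch using orthogonal distance regression (scipy.odr) is shown below; the delta values and uncertainties are invented, and a plain straight-line model is assumed rather than the authors' full regression framework.

    ```python
    import numpy as np
    from scipy import odr

    # Measured delta values of reference standards (x) vs their assigned
    # values (y), each with a standard uncertainty; values are illustrative.
    x  = np.array([-35.2, -10.1, 0.4])     # measured deltas (per mil)
    sx = np.array([0.06, 0.05, 0.05])
    y  = np.array([-34.99, -9.95, 0.48])   # assigned certified deltas (per mil)
    sy = np.array([0.04, 0.03, 0.03])

    linear = odr.Model(lambda beta, x: beta[0] * x + beta[1])
    data = odr.RealData(x, y, sx=sx, sy=sy)
    fit = odr.ODR(data, linear, beta0=[1.0, 0.0]).run()

    slope, intercept = fit.beta
    u_slope, u_intercept = fit.sd_beta
    print(f"slope = {slope:.4f} +/- {u_slope:.4f}")
    print(f"intercept = {intercept:.3f} +/- {u_intercept:.3f} per mil")

    # Normalize a sample measurement with the fitted calibration line
    delta_sample_measured = -20.3
    print("normalized:", slope * delta_sample_measured + intercept)
    ```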

  15. Scenario and modelling uncertainty in global mean temperature change derived from emission driven Global Climate Models

    Science.gov (United States)

    Booth, B. B. B.; Bernie, D.; McNeall, D.; Hawkins, E.; Caesar, J.; Boulton, C.; Friedlingstein, P.; Sexton, D.

    2012-09-01

We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission driven rather than concentration driven perturbed parameter ensemble of a Global Climate Model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concentration driven simulations (with 10-90th percentile ranges of 1.7 K for the aggressive mitigation scenario up to 3.9 K for the high-end business as usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 degrees (RCP8.5) and even under aggressive mitigation (RCP2.6) temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission driven experiments, they do not change existing expectations (based on previous concentration driven experiments) on the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. Both in the case of SRES A1B and the Representative Concentration Pathways (RCPs), the concentration pathways used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration driven experiments to under-sample strong feedback responses in concentration driven projections. Our ensemble of emission driven simulations spans the global temperature response of other multi-model frameworks except at the low end, where combinations of low climate sensitivity and low carbon cycle feedbacks lead to responses outside our ensemble range. The ensemble simulates a number of high end responses which lie above the CMIP5 carbon

  16. Scenario and modelling uncertainty in global mean temperature change derived from emission-driven global climate models

    Science.gov (United States)

    Booth, B. B. B.; Bernie, D.; McNeall, D.; Hawkins, E.; Caesar, J.; Boulton, C.; Friedlingstein, P.; Sexton, D. M. H.

    2013-04-01

We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed parameter ensemble of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concentration-driven simulations (with 10-90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business as usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5) and even under aggressive mitigation (RCP2.6) temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) on the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. Both in the case of SRES A1B and the Representative Concentration Pathways (RCPs), the concentration scenarios used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end. Combinations of low climate sensitivity and low carbon cycle feedbacks lead a number of CMIP5 responses to lie below our ensemble range. The ensemble simulates a number of high-end responses which lie

  17. Scenario and modelling uncertainty in global mean temperature change derived from emission-driven global climate models

    Directory of Open Access Journals (Sweden)

    B. B. B. Booth

    2013-04-01

We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed parameter ensemble of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concentration-driven simulations (with 10–90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business as usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5) and even under aggressive mitigation (RCP2.6) temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) on the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. Both in the case of SRES A1B and the Representative Concentration Pathways (RCPs), the concentration scenarios used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end. Combinations of low climate sensitivity and low carbon cycle feedbacks lead a number of CMIP5 responses to lie below our ensemble range. The ensemble simulates a number of high

  18. Uncertainty budget in internal monostandard NAA for small and large size samples analysis

    International Nuclear Information System (INIS)

    Dasari, K.B.; Acharya, R.

    2014-01-01

Total uncertainty budget evaluation on the determined concentration value is important under a quality assurance programme. Concentration calculation in NAA is carried out by the relative method or by the k0-based internal monostandard NAA (IM-NAA) method. The IM-NAA method has been used for the analysis of small and large samples of clay potteries. An attempt was made to identify the uncertainty components in IM-NAA, and the uncertainty budget for La in both small and large size samples has been evaluated and compared. (author)

  19. Qualification and application of nuclear reactor accident analysis code with the capability of internal assessment of uncertainty

    International Nuclear Information System (INIS)

    Borges, Ronaldo Celem

    2001-10-01

This thesis presents an independent qualification of the CIAU code ('Code with the capability of Internal Assessment of Uncertainty'), which is part of the process of internal uncertainty evaluation with a thermal-hydraulic system code on a realistic basis. This is done by combining the uncertainty methodology UMAE ('Uncertainty Methodology based on Accuracy Extrapolation') with the RELAP5/Mod3.2 code. This allows associating uncertainty band estimates with the results obtained by the realistic calculation of the code, meeting licensing requirements of safety analysis. The independent qualification is supported by simulations with RELAP5/Mod3.2 related to accident condition tests of the LOBI experimental facility and to an event which occurred in the Angra 1 nuclear power plant, by comparison with measured results and by establishing uncertainty bands on the calculated time trends of safety parameters. These bands have indeed enveloped the measured trends. Results from this independent qualification of CIAU have made it possible to ascertain the adequate application of a systematic realistic code procedure to analyse accidents with uncertainties incorporated in the results, although there is an evident need to extend the uncertainty database. It has been verified that use of the code with this internal assessment of uncertainty is feasible in the design and licensing stages of an NPP. (author)

  20. International survey for good practices in forecasting uncertainty assessment and communication

    Science.gov (United States)

    Berthet, Lionel; Piotte, Olivier

    2014-05-01

Achieving technically sound flood forecasts is a crucial objective for forecasters, but forecasts remain of poor use if the users do not understand their significance properly and do not use them properly in decision making. One usual way to make the limitations of the forecasts explicit is to communicate some information about their uncertainty. Uncertainty assessment and communication to stakeholders are thus important issues for operational flood forecasting services (FFS) but remain open fields for research. The French FFS plans to publish graphical streamflow and level forecasts along with uncertainty assessments in the near future on its website (available to the general public). In order to choose the technical options best adapted to its operational context, it carried out a survey among more than 15 fellow institutions. Most of these provide forecasts and warnings to civil protection officers, while some work mostly for hydroelectricity suppliers. A questionnaire was prepared in order to standardize the analysis of the practices of the surveyed institutions. The survey was conducted by gathering information from technical reports or from the scientific literature, as well as through interviews by phone, email discussions, or meetings. The questionnaire helped in the exploration of practices in uncertainty assessment, evaluation and communication. Attention was paid to the particular context within which every institution works in the analysis drawn from the raw results. Results show that most services interviewed assess their forecast uncertainty. However, practices can differ significantly from one country to another. Popular techniques are ensemble approaches, which allow several uncertainty sources to be taken into account. Statistical analysis of past forecasts (such as quantile regression) is also commonly used. Contrary to what was expected, only few services emphasize the role of the forecaster (subjective assessment). Similar contrasts can be observed in uncertainty

  1. Management of internal communication in times of uncertainty

    International Nuclear Information System (INIS)

    Fernandez de la Gala, F.

    2014-01-01

Garona has had strong media coverage since 2009. The plant continuity process has been the subject of great controversy, which has generated increased uncertainty for workers and their families and affected motivation. Although internal communication has sought to manage its effects on the structure of the company, the rate at which outside information spreads has made this a complex mission. The regulatory body has been interested in its potential impact on safety culture, making a significant difference compared to other industrial sectors. (Author)

  2. International conference on sub-critical accelerator driven systems. Proceedings

    International Nuclear Information System (INIS)

    Litovkina, L.P.; Titarenko, Yu.E.

    1999-01-01

The International Meeting on Sub-Critical Accelerator Driven Systems was organized by the State Scientific Center - Institute for Theoretical and Experimental Physics with the participation of the Atomic Ministry of the Russian Federation. The Meeting objective was to analyze recent achievements and tendencies in the development of accelerator-driven systems. The Meeting program covers a broad range of problems, including the conceptual design of accelerator-driven systems (ADS); analysis of the ADS role in the nuclear fuel cycle; the accuracy of modeling the main parameters of ADS; and the conceptual design of high-current accelerators. Moreover, the results of recent experimental and theoretical studies on nuclear data accumulation to support ADS technologies are presented. About 70 scientists from the main scientific centers of Russia, as well as scientists from the USA, France, Belgium, India, and Yugoslavia, attended the meeting and presented 44 works

  3. The Second International Workshop on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Kim, Y. S.; Manko, V. I.

    1993-01-01

    This conference publication contains the proceedings of the Second International Workshop on Squeezed States and Uncertainty Relations held in Moscow, Russia, on 25-29 May 1992. The purpose of this workshop was to study possible applications of squeezed states of light. The Workshop brought together many active researchers in squeezed states of light and those who may find the concept of squeezed states useful in their research, particularly in understanding the uncertainty relations. It was found at this workshop that the squeezed state has a much broader implication than the two-photon coherent states in quantum optics, since the squeeze transformation is one of the most fundamental transformations in physics.

  4. International cooperation behind the veil of uncertainty. The case of transboundary pollution

    International Nuclear Information System (INIS)

    Helm, C.

    1998-01-01

The complexities of international environmental problems are only poorly understood. Hence, decision makers have to negotiate about abatement measures even though they do not know the 'true' model of the ecological system and have only a rough idea about the costs and benefits of their actions. It is analysed to what extent this kind of 'model uncertainty' - where players not only have incomplete information about the payoff functions of the other players, but also about their own payoff function - affects the prospects for international cooperation. Using a simple game-theoretic model, it is shown how countries can use the veil of uncertainty to hide their distributional interests. The arguments are based on a deviation from the common prior assumption, which seems particularly questionable in a setting comprising various countries with different cultural and scientific backgrounds. Finally, the model proves useful for quantitatively and qualitatively illustrating the role of model uncertainty in the negotiations on the first Sulphur Protocol, signed to combat transboundary acidification. 26 refs

  5. Internally driven inertial waves in geodynamo simulations

    Science.gov (United States)

    Ranjan, A.; Davidson, P. A.; Christensen, U. R.; Wicht, J.

    2018-05-01

    Inertial waves are oscillations in a rotating fluid, such as the Earth's outer core, which result from the restoring action of the Coriolis force. In an earlier work, it was argued by Davidson that inertial waves launched near the equatorial regions could be important for the α2 dynamo mechanism, as they can maintain a helicity distribution which is negative (positive) in the north (south). Here, we identify such internally driven inertial waves, triggered by buoyant anomalies in the equatorial regions in a strongly forced geodynamo simulation. Using the time derivative of vertical velocity, ∂uz/∂t, as a diagnostic for traveling wave fronts, we find that the horizontal movement in the buoyancy field near the equator is well correlated with a corresponding movement of the fluid far from the equator. Moreover, the azimuthally averaged spectrum of ∂uz/∂t lies in the inertial wave frequency range. We also test the dispersion properties of the waves by computing the spectral energy as a function of frequency, ϖ, and the dispersion angle, θ. Our results suggest that the columnar flow in the rotation-dominated core, which is an important ingredient for the maintenance of a dipolar magnetic field, is maintained despite the chaotic evolution of the buoyancy field on a fast timescale by internally driven inertial waves.

  6. Determination of internal series resistance of PV devices: repeatability and uncertainty

    International Nuclear Information System (INIS)

    Trentadue, Germana; Pavanello, Diego; Salis, Elena; Field, Mike; Müllejans, Harald

    2016-01-01

The calibration of photovoltaic devices requires the measurement of their current–voltage characteristics at standard test conditions (STC). As the latter can only be reached approximately, a curve translation is necessary, requiring among others the internal series resistance of the photovoltaic device as an input parameter. Therefore accurate and reliable determination of the series resistance is important in measurement and test laboratories. This work follows standard IEC 60891 ed 2 (2009) for the determination of the internal series resistance and investigates the repeatability and uncertainty of the result in three aspects for a number of typical photovoltaic technologies. Firstly, the effect of varying device temperature on the determined series resistance is determined experimentally and compared to a theoretical derivation, showing agreement. It is found that the series resistance can be determined with an uncertainty of better than 5% if the device temperature is stable within ±0.1 °C, whereas the temperature range of ±2 °C allowed by the standard leads to much larger variations. Secondly, the repeatability of the series resistance determination with respect to noise in the current–voltage measurement is examined, yielding typical values of ±5%. Thirdly, the determination of the series resistance using three different experimental set-ups (solar simulators) shows agreement on the level of ±5% for crystalline silicon photovoltaic devices and deviations up to 15% for thin-film devices. It is concluded that the internal series resistance of photovoltaic devices can be determined with an uncertainty of better than 10%. The influence of this uncertainty in series resistance on the electrical performance parameters of photovoltaic devices was estimated and showed a contribution of 0.05% for open-circuit voltage and 0.1% for maximum power. Furthermore it is concluded that the range of device temperatures allowed during determination of series

  7. A probabilistic approach to quantify the uncertainties in internal dose assessment using response surface and neural network

    International Nuclear Information System (INIS)

    Baek, M.; Lee, S.K.; Lee, U.C.; Kang, C.S.

    1996-01-01

A probabilistic approach is formulated to assess the internal radiation exposure following the intake of radioisotopes. This probabilistic approach consists of four steps: (1) screening, (2) quantification of uncertainties, (3) propagation of uncertainties, and (4) analysis of output. The approach has been applied to Pu-induced internal dose assessment, and a multi-compartment dosimetric model is used for internal transport. In this approach, surrogate models of the original system are constructed using response surfaces and neural networks, and the results of these surrogate models are compared with those of the original model. Each surrogate model approximates the original model well. Uncertainty and sensitivity analyses of the model parameters are performed in this process. Dominant contributors to each organ are identified, and the results show that this approach could serve as a good tool for assessing internal radiation exposure.
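    The four-step approach (screening, quantification, propagation, output analysis) can be sketched with a cheap response-surface surrogate fitted to a stand-in dose model and then sampled by Monte Carlo; the one-equation "dose model", the lognormal parameter distributions, and the log-log polynomial surrogate below are illustrative assumptions, not the paper's multi-compartment model.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def dose_model(intake, transfer_rate):
        # Stand-in for an expensive multi-compartment internal dose code:
        # committed dose proportional to intake and retention ~ 1/transfer_rate.
        return 2.5e-5 * intake / transfer_rate

    # Steps 1-2: screened parameters and their assumed uncertainty distributions
    intake = rng.lognormal(mean=np.log(100.0), sigma=0.3, size=200)   # Bq
    k = rng.lognormal(mean=np.log(0.05), sigma=0.2, size=200)         # d-1

    # Fit a simple log-log polynomial response surface to the model runs
    X = np.column_stack([np.ones_like(intake), np.log(intake), np.log(k)])
    coef, *_ = np.linalg.lstsq(X, np.log(dose_model(intake, k)), rcond=None)

    # Step 3: propagate uncertainties through the cheap surrogate
    intake_mc = rng.lognormal(np.log(100.0), 0.3, 100000)
    k_mc = rng.lognormal(np.log(0.05), 0.2, 100000)
    dose_mc = np.exp(coef[0] + coef[1] * np.log(intake_mc) + coef[2] * np.log(k_mc))

    # Step 4: analyse the output distribution
    print(f"median dose {np.median(dose_mc):.2e} Sv, "
          f"95th percentile {np.percentile(dose_mc, 95):.2e} Sv")
    ```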

  8. Uncertainties in Future Regional Sea Level Trends: How to Deal with the Internal Climate Variability?

    Science.gov (United States)

    Becker, M.; Karpytchev, M.; Hu, A.; Deser, C.; Lennartz-Sassinek, S.

    2017-12-01

Today, climate models (CMs) are the main tools for forecasting sea level rise (SLR) at global and regional scales. CM forecasts are accompanied by inherent uncertainties. Understanding and reducing these uncertainties is becoming a matter of increasing urgency in order to provide robust estimates of SLR impact on coastal societies, which need sustainable choices of climate adaptation strategy. These CM uncertainties are linked to structural model formulation, initial conditions, emission scenario and internal variability. The internal variability is due to complex non-linear interactions within the Earth climate system and can induce diverse quasi-periodic oscillatory modes and long-term persistence. To quantify the effects of internal variability, most studies use multi-model ensembles or sea level projections from a single model run with perturbed initial conditions. However, large ensembles are not generally available, or are too small, and are computationally expensive. In this study, we use a power-law scaling of sea level fluctuations, as observed in many other geophysical signals and natural systems, which can be used to characterize the internal climate variability. From this specific statistical framework, we (1) use the pre-industrial control run of the National Center for Atmospheric Research Community Climate System Model (NCAR-CCSM) to test the robustness of the power-law scaling hypothesis; (2) employ the power-law statistics as a tool for assessing the spread of regional sea level projections due to the internal climate variability in 21st century NCAR-CCSM projections; (3) compare the uncertainties in predicted sea level changes obtained from NCAR-CCSM multi-member ensemble simulations with estimates derived for power-law processes; and (4) explore the sensitivity of spatial patterns of the internal variability and its effects on regional sea level projections.
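    A minimal sketch of estimating a power-law scaling exponent from a (synthetic) control-run sea level series is given below, using the decay of the standard deviation of block means with block size; the synthetic series and this simple aggregated-variance estimator are illustrative and not necessarily the authors' exact scaling analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic stand-in for a pre-industrial control-run sea level series (yearly)
    series = np.cumsum(rng.normal(size=4096)) * 0.01 + rng.normal(size=4096)

    def scaling_exponent(x, block_sizes):
        """Estimate a power-law scaling exponent from how the standard deviation
        of block means decays with block size (sd ~ m**(H-1) for a Hurst-like H)."""
        sds = []
        for m in block_sizes:
            n_blocks = len(x) // m
            blocks = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
            sds.append(blocks.std())
        slope, _ = np.polyfit(np.log(block_sizes), np.log(sds), 1)
        return slope + 1.0   # H = 1 + slope of log(sd) vs log(m)

    H = scaling_exponent(series, block_sizes=[4, 8, 16, 32, 64, 128])
    print(f"estimated scaling (Hurst-like) exponent: {H:.2f}")
    ```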

  9. Uncertainty in Twenty-First-Century CMIP5 Sea Level Projections

    Science.gov (United States)

    Little, Christopher M.; Horton, Radley M.; Kopp, Robert E.; Oppenheimer, Michael; Yip, Stan

    2015-01-01

    The representative concentration pathway (RCP) simulations included in phase 5 of the Coupled Model Intercomparison Project (CMIP5) quantify the response of the climate system to different natural and anthropogenic forcing scenarios. These simulations differ because of 1) forcing, 2) the representation of the climate system in atmosphere-ocean general circulation models (AOGCMs), and 3) the presence of unforced (internal) variability. Global and local sea level rise projections derived from these simulations, and the emergence of distinct responses to the four RCPs depend on the relative magnitude of these sources of uncertainty at different lead times. Here, the uncertainty in CMIP5 projections of sea level is partitioned at global and local scales, using a 164-member ensemble of twenty-first-century simulations. Local projections at New York City (NYSL) are highlighted. The partition between model uncertainty, scenario uncertainty, and internal variability in global mean sea level (GMSL) is qualitatively consistent with that of surface air temperature, with model uncertainty dominant for most of the twenty-first century. Locally, model uncertainty is dominant through 2100, with maxima in the North Atlantic and the Arctic Ocean. The model spread is driven largely by 4 of the 16 AOGCMs in the ensemble; these models exhibit outlying behavior in all RCPs and in both GMSL and NYSL. The magnitude of internal variability varies widely by location and across models, leading to differences of several decades in the local emergence of RCPs. The AOGCM spread, and its sensitivity to model exclusion and/or weighting, has important implications for sea level assessments, especially if a local risk management approach is utilized.
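    The partition of projection uncertainty into scenario, model, and internal-variability components can be sketched on a synthetic ensemble array in the spirit of the decomposition described above; the array shapes, signal magnitudes, and the simplified variance bookkeeping below are assumptions for illustration, not the CMIP5 data.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic projections: sea level anomaly indexed [scenario, model, realization, year]
    n_scen, n_model, n_real, n_year = 4, 16, 3, 90
    years = np.arange(n_year)
    forced = 0.002 * np.arange(1, n_scen + 1)[:, None, None, None] * years   # scenario signal
    model_bias = rng.normal(0, 0.05, (1, n_model, 1, 1)) * years / n_year    # model spread
    noise = rng.normal(0, 0.02, (n_scen, n_model, n_real, n_year))           # internal variability
    proj = forced + model_bias + noise

    # Partition variance at each lead time
    internal = proj.var(axis=2).mean(axis=(0, 1))             # spread across realizations
    model_unc = proj.mean(axis=2).var(axis=1).mean(axis=0)    # spread across models
    scenario = proj.mean(axis=(1, 2)).var(axis=0)             # spread across scenarios
    total = internal + model_unc + scenario

    for name, v in [("internal", internal), ("model", model_unc), ("scenario", scenario)]:
        print(f"{name:>8s} fraction at year 80: {v[80] / total[80]:.2f}")
    ```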

  10. SPRT Calibration Uncertainties and Internal Quality Control at a Commercial SPRT Calibration Facility

    Science.gov (United States)

    Wiandt, T. J.

    2008-06-01

    The Hart Scientific Division of the Fluke Corporation operates two accredited standard platinum resistance thermometer (SPRT) calibration facilities, one at the Hart Scientific factory in Utah, USA, and the other at a service facility in Norwich, UK. The US facility is accredited through National Voluntary Laboratory Accreditation Program (NVLAP), and the UK facility is accredited through UKAS. Both provide SPRT calibrations using similar equipment and procedures, and at similar levels of uncertainty. These uncertainties are among the lowest available commercially. To achieve and maintain low uncertainties, it is required that the calibration procedures be thorough and optimized. However, to minimize customer downtime, it is also important that the instruments be calibrated in a timely manner and returned to the customer. Consequently, subjecting the instrument to repeated calibrations or extensive repeated measurements is not a viable approach. Additionally, these laboratories provide SPRT calibration services involving a wide variety of SPRT designs. These designs behave differently, yet predictably, when subjected to calibration measurements. To this end, an evaluation strategy involving both statistical process control and internal consistency measures is utilized to provide confidence in both the instrument calibration and the calibration process. This article describes the calibration facilities, procedure, uncertainty analysis, and internal quality assurance measures employed in the calibration of SPRTs. Data will be reviewed and generalities will be presented. Finally, challenges and considerations for future improvements will be discussed.

  11. A review of the uncertainties in internal radiation dose assessment for inhaled thorium

    International Nuclear Information System (INIS)

    Hewson, G.S.

    1989-01-01

    Present assessments of internal radiation dose to designated radiation workers in the mineral sands industry, calculated using ICRP 26/30 methodology and data, indicate that some workers approach and exceed statutory radiation dose limits. Such exposures are indicative of the need for a critical assessment of work and operational procedures and also of metabolic and dosimetric models used to estimate internal dose. This paper reviews past occupational exposure experience with inhaled thorium compounds, examines uncertainties in the underlying radiation protection models, and indicates the effect of alternative assumptions on the calculation of committed effective dose equivalent. The extremely low recommended inhalation limits for thorium in air do not appear to be well supported by studies on the health status of former thorium refinery workers who were exposed to thorium well in excess of presently accepted limits. The effect of cautious model assumptions is shown to result in internal dose assessments that could be up to an order of magnitude too high. It is concluded that the effect of such uncertainty constrains the usefulness of internal dose estimates as a reliable indicator of actual health risk. 26 refs., 5 figs., 3 tabs

  12. Fourth International Conference on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Peng, Kunchi (Editor); Kim, Y. S. (Editor); Manko, V. I. (Editor)

    1996-01-01

The Fourth International Conference on Squeezed States and Uncertainty Relations was held at Shanxi University, Taiyuan, Shanxi, China, on June 5-9, 1995. This conference was jointly organized by Shanxi University, the University of Maryland (U.S.A.), and the Lebedev Physical Institute (Russia). The first meeting of this series was called the Workshop on Squeezed States and Uncertainty Relations, and was held in 1991 at College Park, Maryland. The second and third meetings in this series were hosted in 1992 by the Lebedev Institute in Moscow, and in 1993 by the University of Maryland Baltimore County, respectively. The scientific purpose of this series was initially to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including, of course, quantum optics and foundations of quantum mechanics. Quantum optics will continue playing the pivotal role in the future, but future meetings will include all branches of physics where squeeze transformations are a basic transformation. This transition took place at the fourth meeting of this series held at Shanxi University in 1995. The fifth meeting in this series will be held in Budapest (Hungary) in 1997, and the principal organizer will be Jozsef Janszky of the Laboratory of Crystal Physics, P.O. Box 132, H-1052 Budapest, Hungary.

  13. Awe, uncertainty, and agency detection.

    Science.gov (United States)

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events.

  14. Narrative of certitude for uncertainty normalisation regarding biotechnology in international organisations

    OpenAIRE

Heath, Robert; Proutheau, Stéphanie

    2012-01-01

Narrative theory has gained prominence especially as a companion to the social construction of reality. In matters of regulation and normalization, narratives socially and culturally construct relevant contingencies, uncertainties, values, and decisions. Here, decision dynamics pit risk generators, bearers, bearers' advocates, arbiters, researchers and informers as advocates and counter-advocates (Palmlund, 2009). The decision-relevant narrative components (actors, themes, sc...

  15. Quantification of uncertainty in photon source spot size inference during laser-driven radiography experiments at TRIDENT

    Energy Technology Data Exchange (ETDEWEB)

    Tobias, Benjamin John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Palaniyappan, Sasikumar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gautier, Donald Cort [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mendez, Jacob [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Burris-Mog, Trevor John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Huang, Chengkun K. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Favalli, Andrea [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hunter, James F. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Espy, Michelle E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Schmidt, Derek William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Nelson, Ronald Owen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sefkow, Adam [Univ. of Rochester, NY (United States); Shimada, Tsutomu [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Johnson, Randall Philip [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fernandez, Juan Carlos [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-24

    Images of the R2DTO resolution target were obtained during laser-driven-radiography experiments performed at the TRIDENT laser facility, and analysis of these images using the Bayesian Inference Engine (BIE) determines a most probable full-width half maximum (FWHM) spot size of 78 μm. However, significant uncertainty prevails due to variation in the measured detector blur. Propagating this uncertainty in detector blur through the forward model results in an interval of probabilistic ambiguity spanning approximately 35-195 μm when the laser energy impinges on a thick (1 mm) tantalum target. In other phases of the experiment, laser energy is deposited on a thin (~100 nm) aluminum target placed 250 μm ahead of the tantalum converter. When the energetic electron beam is generated in this manner, upstream from the bremsstrahlung converter, the inferred spot size shifts to a range of much larger values, approximately 270-600 μm FWHM. This report discusses methods applied to obtain these intervals as well as concepts necessary for interpreting the result within a context of probabilistic quantitative inference.

  16. Reliability of a new biokinetic model of zirconium in internal dosimetry: part I, parameter uncertainty analysis.

    Science.gov (United States)

    Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph

    2011-12-01

The reliability of biokinetic models is essential in internal dose assessments and radiation risk analysis for the public, occupational workers, and patients exposed to radionuclides. In this paper, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. The paper is divided into two parts. In the first part of the study, published here, the uncertainty sources of the parameters of the zirconium (Zr) biokinetic model developed by the International Commission on Radiological Protection (ICRP) were identified and analyzed. Furthermore, the uncertainty of the biokinetic experimental measurements performed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU) for developing a new biokinetic model of Zr was analyzed according to the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. The confidence intervals and distributions of the model parameters of the ICRP and HMGU Zr biokinetic models were evaluated. As a result of the biokinetic model computations, the mean, standard uncertainty, and confidence interval of the model predictions calculated from the model parameter uncertainty are presented and compared to the plasma clearance and urinary excretion measured after intravenous administration. It was shown that for the most important compartment, the plasma, the uncertainty evaluated for the HMGU model was much smaller than that for the ICRP model; the same was observed for other organs and tissues. The uncertainty of the integral of the radioactivity of Zr up to 50 y calculated by the HMGU model after ingestion by adult members of the public was shown to be smaller by a factor of two than that of the ICRP model. It was also shown that the distribution type of the model parameters strongly influences the model prediction, and the correlation of the model input parameters affects the model prediction to a

  17. International conference on Facets of Uncertainties and Applications

    CERN Document Server

    Skowron, Andrzej; Maiti, Manoranjan; Kar, Samarjit

    2015-01-01

    Since the emergence of the formal concept of probability theory in the seventeenth century, uncertainty had been perceived solely in terms of probability theory. However, this apparently unique link between uncertainty and probability theory came under scrutiny a few decades ago. Uncertainties are nowadays accepted to be of various kinds. Uncertainty in general can refer to different senses: not certainly known, questionable, problematic, vague, not definite or determined, ambiguous, liable to change, or not reliable. In Indian languages, particularly Sanskrit-based languages, there are still other, higher levels of uncertainty. It has been shown that several mathematical frameworks, such as the theory of fuzzy sets, the theory of rough sets, evidence theory, possibility theory, the theory of complex systems and complex networks, the theory of fuzzy measures, and uncertainty theory, can also successfully model uncertainty.

  18. Response of ENSO amplitude to global warming in CESM large ensemble: uncertainty due to internal variability

    Science.gov (United States)

    Zheng, Xiao-Tong; Hui, Chang; Yeh, Sang-Wook

    2018-06-01

    El Niño-Southern Oscillation (ENSO) is the dominant mode of variability in the coupled ocean-atmospheric system. Future projections of ENSO change under global warming are highly uncertain among models. In this study, the effect of internal variability on ENSO amplitude change in future climate projections is investigated based on a 40-member ensemble from the Community Earth System Model Large Ensemble (CESM-LE) project. A large uncertainty is identified among ensemble members due to internal variability. The inter-member diversity is associated with a zonal dipole pattern of sea surface temperature (SST) change in the mean along the equator, which is similar to the second empirical orthogonal function (EOF) mode of tropical Pacific decadal variability (TPDV) in the unforced control simulation. The uncertainty in CESM-LE is comparable in magnitude to that among models of the Coupled Model Intercomparison Project phase 5 (CMIP5), suggesting the contribution of internal variability to the intermodel uncertainty in ENSO amplitude change. However, the causal relationships between changes in ENSO amplitude and the mean state differ between the CESM-LE and the CMIP5 ensemble. The CESM-LE results indicate that a large ensemble of 15 members is needed to separate the relative contributions to ENSO amplitude change over the twenty-first century between forced response and internal variability.

  19. Uncertainty

    International Nuclear Information System (INIS)

    Silva, T.A. da

    1988-01-01

    A comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and by the International Committee for Weights and Measures (CIPM) is presented for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt

  20. Development of an expert system for the taking into account of uncertainties in the monitoring of internal contaminations

    International Nuclear Information System (INIS)

    Davesne, E.; Blanchardon, E.; Casanova, P.; Chojnacki, E.; Paquet, F.

    2010-01-01

    Internal contaminations may result from occupational exposure; they can be monitored by anthropo-radiometric and radio-toxicological measurements, which are interpreted in terms of incorporated activity and effective dose by means of biokinetic and dosimetric models. In spite of existing standards, uncertainties may remain in the dosimetric interpretation of radio-toxicological measurements. The authors report the development of software (the OPSCI code) that takes into account the uncertainties related to worker internal dosimetry, the calculation of the minimum detectable dose related to an exposure, and the development of a data monitoring programme

  1. Optimization of internal contamination monitoring programmes by studying uncertainties linked to dosimetric assessment

    International Nuclear Information System (INIS)

    Davesne, Estelle

    2010-01-01

    To optimise the protection of workers against ionising radiation, the International Commission on Radiological Protection recommends the use of dose constraints and limits. To verify the compliance of the means of protection with these values when a risk of internal contamination exists, monitoring programmes consisting of periodic bioassay measurements are implemented. However, uncertainty in the dose evaluation arises from the variability of the activity measurement and from incomplete knowledge of the exposure conditions. This uncertainty was taken into account by means of classical, Bayesian and possibilistic statistics. The developed methodology was applied to the evaluation of potential exposure during nuclear fuel preparation and mining, and to the analysis of the monitoring programme of workers purifying plutonium at the AREVA NC La Hague reprocessing plant. From the measurement decision threshold, the minimum detectable dose (MDD) of the programme at a given confidence level can be calculated with the OPSCI software. This is shown to be a useful support for the optimisation of monitoring programmes when seeking a compromise between their sensitivity and their cost. (author)
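
    The idea behind the MDD can be sketched in a few lines (this is not the OPSCI implementation): given the decision threshold of a bioassay measurement, a model value for the fraction of the intake present in the sample at the measurement time, and a dose coefficient, the minimum detectable intake and dose follow directly. All numbers below are illustrative.

        # Minimal sketch of a minimum-detectable-dose estimate (not the OPSCI code).
        # Illustrative numbers only: decision threshold of a urine measurement,
        # predicted excretion fraction m(t), and effective dose coefficient e.
        decision_threshold_bq = 5.0e-3     # Bq per 24-h urine sample the lab can decide on
        excretion_fraction    = 2.0e-4     # fraction of intake in the sample at day t (model value)
        dose_per_intake_sv    = 1.0e-5     # Sv per Bq intake (dose coefficient, illustrative)

        min_detectable_intake = decision_threshold_bq / excretion_fraction   # Bq
        min_detectable_dose   = min_detectable_intake * dose_per_intake_sv   # Sv

        print(f"minimum detectable intake: {min_detectable_intake:.1f} Bq")
        print(f"minimum detectable dose:   {min_detectable_dose*1e3:.2f} mSv")

    In the full treatment, the excretion fraction and the exposure conditions carry their own uncertainties, so the MDD becomes a distribution evaluated at a chosen confidence level rather than a single number.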

  2. Deriving proper measurement uncertainty from Internal Quality Control data: An impossible mission?

    Science.gov (United States)

    Ceriotti, Ferruccio

    2018-03-30

    Measurement uncertainty (MU) is a "non-negative parameter characterizing the dispersion of the quantity values being attributed to a measurand, based on the information used". In the clinical laboratory the most convenient way to calculate MU is the "top down" approach based on Internal Quality Control data. As indicated in the definition, MU depends on the information used for its calculation, so different estimates of MU can be obtained. The most problematic aspect is how to deal with bias. In fact, bias is difficult to detect and quantify; it should be corrected for, with only the uncertainty of this correction included in the MU. Several approaches to calculating MU from Internal Quality Control data are presented. The minimum requirement is to use only the intermediate precision data, provided they include 6 months of results obtained with a commutable quality control material at a concentration close to the clinical decision limit. This minimal approach is convenient for those measurands that are mainly used for monitoring, or for which a reference measurement system does not exist and a reference for calculating the bias is therefore lacking. Other formulas are presented and commented on, including the uncertainty of the calibrator value, the bias from a commutable certified reference material or from a material specifically prepared for trueness verification, and the bias derived from External Quality Assessment schemes or from the historical mean of the laboratory. MU is an important parameter, but a single, agreed-upon way to calculate it in a clinical laboratory is not yet available. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
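
    A minimal sketch of the "minimum requirement" estimate described above follows: the measurement uncertainty is taken as the intermediate precision of at least six months of results for one commutable control material near the decision limit, expanded with a coverage factor of 2. The control values are invented.

        # Minimal sketch of the "minimum requirement" estimate: measurement uncertainty
        # taken as the intermediate precision of >= 6 months of IQC results for one
        # commutable control near the clinical decision limit. QC values are invented.
        import numpy as np

        qc = np.array([101.2, 99.8, 100.5, 102.1, 98.9, 100.7, 101.5, 99.3,
                       100.1, 101.9, 99.6, 100.4])   # pooled IQC results, arbitrary units

        u_rw = qc.std(ddof=1)          # within-laboratory reproducibility (intermediate precision)
        U    = 2.0 * u_rw              # expanded uncertainty, coverage factor k = 2
        cv   = 100.0 * u_rw / qc.mean()

        print(f"u(Rw) = {u_rw:.2f}, U(k=2) = {U:.2f}  (CV = {cv:.1f} %)")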

  3. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    International Nuclear Information System (INIS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-01-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has
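
    The assimilation step at the core of the framework can be illustrated with a generic stochastic ensemble Kalman update. The sketch below is not the authors' Reynolds-stress parameterization; the state vector, observation operator and error levels are invented.

        # Minimal sketch of one stochastic ensemble Kalman analysis step, the generic
        # building block of the iterative method described above. The "state" here is
        # an arbitrary parameter vector, not an actual Reynolds-stress field.
        import numpy as np

        rng = np.random.default_rng(2)
        n_state, n_obs, n_ens = 10, 3, 50

        X = rng.normal(0.0, 1.0, (n_state, n_ens))     # prior ensemble (columns = members)
        H = np.zeros((n_obs, n_state))
        H[0, 1] = H[1, 4] = H[2, 7] = 1.0              # sparse observation operator
        R = 0.05 * np.eye(n_obs)                       # observation-error covariance
        y = np.array([1.0, -0.5, 0.3])                 # observations

        Xm = X.mean(axis=1, keepdims=True)
        A  = X - Xm                                    # ensemble anomalies
        P  = A @ A.T / (n_ens - 1)                     # sample covariance
        K  = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain

        Y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
        X_post = X + K @ (Y_pert - H @ X)              # perturbed-observation update

        print("prior mean :", np.round(Xm.ravel()[:5], 2))
        print("post  mean :", np.round(X_post.mean(axis=1)[:5], 2))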

  4. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach

  5. Uncertainty in Indian Ocean Dipole response to global warming: the role of internal variability

    Science.gov (United States)

    Hui, Chang; Zheng, Xiao-Tong

    2018-01-01

    The Indian Ocean Dipole (IOD) is one of the leading modes of interannual sea surface temperature (SST) variability in the tropical Indian Ocean (TIO). The response of IOD to global warming is quite uncertain in climate model projections. In this study, the uncertainty in IOD change under global warming, especially that resulting from internal variability, is investigated based on the Community Earth System Model Large Ensemble (CESM-LE). For the IOD amplitude change, the inter-member uncertainty in CESM-LE is about 50% of the intermodel uncertainty in the Coupled Model Intercomparison Project phase 5 (CMIP5) multimodel ensemble, indicating the important role of internal variability in future IOD projections. In CESM-LE, both the ensemble mean and spread in mean SST warming show a zonal positive IOD-like (pIOD-like) pattern in the TIO. This pIOD-like mean warming regulates ocean-atmospheric feedbacks of the interannual IOD mode, and weakens the skewness of the interannual variability. However, as the changes in oceanic and atmospheric feedbacks counteract each other, the inter-member variability in IOD amplitude change is not correlated with that of the mean state change. Instead, the ensemble spread in IOD amplitude change is correlated with that in ENSO amplitude change in CESM-LE, reflecting the close inter-basin relationship between the tropical Pacific and Indian Ocean in this model.

  6. Internal dose assessments: Uncertainty studies and update of ideas guidelines and databases within CONRAD project

    International Nuclear Information System (INIS)

    Marsh, J. W.; Castellani, C. M.; Hurtgen, C.; Lopez, M. A.; Andrasi, A.; Bailey, M. R.; Birchall, A.; Blanchardon, E.; Desai, A. D.; Dorrian, M. D.; Doerfel, H.; Koukouliou, V.; Luciani, A.; Malatova, I.; Molokanov, A.; Puncher, M.; Vrba, T.

    2008-01-01

    The work of Task Group 5.1 (uncertainty studies and revision of IDEAS guidelines) and Task Group 5.5 (update of IDEAS databases) of the CONRAD project is described. Scattering factor (SF) values (i.e. measurement uncertainties) have been calculated for different radionuclides and types of monitoring data using real data contained in the IDEAS Internal Contamination Database. Based upon this work and other published values, default SF values are suggested. Uncertainty studies have been carried out using both a Bayesian and a frequentist (classical) approach. The IDEAS guidelines have been revised in areas relating to the evaluation of an effective AMAD; guidance is given on evaluating wound cases with the NCRP wound model, and suggestions are made on the number and type of measurements required for dose assessment. (authors)

  7. On the rejection of internal and external disturbances in a wind energy conversion system with direct-driven PMSG.

    Science.gov (United States)

    Li, Shengquan; Zhang, Kezhao; Li, Juan; Liu, Chao

    2016-03-01

    This paper deals with the critical issue in a wind energy conversion system (WECS) based on a direct-driven permanent magnet synchronous generator (PMSG): the rejection of lumped disturbance, including the system uncertainties in the internal dynamics and unknown external forces. To simultaneously track the motor speed in real time and capture the maximum power, a maximum power point tracking strategy is proposed based on active disturbance rejection control (ADRC) theory. In real application, system inertia, drive torque and some other parameters change in a wide range with the variations of disturbances and wind speeds, which substantially degrade the performance of WECS. The ADRC design must incorporate the available model information into an extended state observer (ESO) to compensate the lumped disturbance efficiently. Based on this principle, a model-compensation ADRC is proposed in this paper. Simulation study is conducted to evaluate the performance of the proposed control strategy. It is shown that the effect of lumped disturbance is compensated in a more effective way compared with the traditional ADRC approach. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
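
    A minimal ADRC sketch on a first-order toy plant is given below: a linear extended state observer (ESO) estimates the lumped disturbance (unknown internal dynamics plus an external input) and the control law cancels it. The plant, bandwidths and disturbance are illustrative and do not represent the PMSG/WECS model of the paper.

        # Minimal ADRC sketch on a first-order toy plant dy/dt = f(y, t) + b0*u, where
        # f lumps unknown internal dynamics and an external disturbance. A linear
        # extended state observer (ESO) estimates y and f; the control cancels f.
        # Plant, gains and disturbance are illustrative, not the PMSG model above.
        import numpy as np

        dt, T, b0 = 1e-3, 3.0, 2.0
        wo, wc = 40.0, 8.0                      # observer / controller bandwidths
        b1, b2 = 2.0 * wo, wo**2                # ESO gains (double pole at -wo)
        kp = wc                                 # proportional gain on the ESO state

        y, z1, z2 = 0.0, 0.0, 0.0               # plant output, ESO estimates of y and f
        r = 1.0                                 # set-point
        log = []
        for k in range(int(T / dt)):
            t = k * dt
            u0 = kp * (r - z1)
            u = (u0 - z2) / b0                  # disturbance-cancelling control law
            # ESO update (Euler)
            e = y - z1
            z1 += dt * (z2 + b0 * u + b1 * e)
            z2 += dt * (b2 * e)
            # "true" plant with unknown dynamics and a step disturbance at t = 1.5 s
            f = -1.5 * y + (4.0 if t > 1.5 else 0.0)
            y += dt * (f + b0 * u)
            log.append((t, y, z2, f))

        t_, y_, z2_, f_ = np.array(log).T
        print(f"final output {y_[-1]:.3f}, disturbance estimate error {abs(z2_[-1]-f_[-1]):.3f}")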

  8. Decision making under internal uncertainty: the case of multiple-choice tests with different scoring rules.

    Science.gov (United States)

    Bereby-Meyer, Yoella; Meyer, Joachim; Budescu, David V

    2003-02-01

    This paper assesses framing effects on decision making with internal uncertainty, i.e., partial knowledge, by focusing on examinees' behavior in multiple-choice (MC) tests with different scoring rules. In two experiments participants answered a general-knowledge MC test that consisted of 34 solvable and 6 unsolvable items. Experiment 1 studied two scoring rules involving Positive (only gains) and Negative (only losses) scores. Although answering all items was the dominating strategy for both rules, the results revealed a greater tendency to answer under the Negative scoring rule. These results are in line with the predictions derived from Prospect Theory (PT) [Econometrica 47 (1979) 263]. The second experiment studied two scoring rules, which allowed respondents to exhibit partial knowledge. Under the Inclusion-scoring rule the respondents mark all answers that could be correct, and under the Exclusion-scoring rule they exclude all answers that might be incorrect. As predicted by PT, respondents took more risks under the Inclusion rule than under the Exclusion rule. The results illustrate that the basic process that underlies choice behavior under internal uncertainty and especially the effect of framing is similar to the process of choice under external uncertainty and can be described quite accurately by PT. Copyright 2002 Elsevier Science B.V.

  9. Benchmarking observational uncertainties for hydrology (Invited)

    Science.gov (United States)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, in order to understand their information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), error characteristics measured (e.g. standard error, confidence bounds) and error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale, while flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular when doing comparative/regionalisation and multi-objective analysis. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data-users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has

  10. International Target Values 2010 for Measurement Uncertainties in Safeguarding Nuclear Materials

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, M.; Penkin, M.; Norman, C.; Balsley, S. [IAEA, Vienna (Austria)]; and others

    2012-12-15

    This issue of the International Target Values (ITVs) represents the sixth revision, following the first release of such tables issued in 1979 by the ESARDA/WGDA. The ITVs are uncertainties to be considered in judging the reliability of analytical techniques applied to industrial nuclear and fissile material, which are subject to safeguards verification. The tabulated values represent estimates of the 'state of the practice' which should be achievable under routine measurement conditions. The most recent standard conventions in representing uncertainty have been considered, while maintaining a format that allows comparison with the previous releases of the ITVs. The present report explains why target values are needed, how the concept evolved and how they relate to the operator's and inspector's measurement systems. The ITVs-2010 are intended to be used by plant operators and safeguards organizations, as a reference of the quality of measurements achievable in nuclear material accountancy, and for planning purposes. The report suggests that the use of ITVs can be beneficial for statistical inferences regarding the significance of operator-inspector differences whenever valid performance values are not available.

  11. Uncertainty analysis of a low flow model for the Rhine River

    NARCIS (Netherlands)

    Demirel, M.C.; Booij, Martijn J.

    2011-01-01

    It is widely recognized that hydrological models are subject to parameter uncertainty. However, little attention has been paid so far to the uncertainty in parameters of the data-driven models like weights in neural networks. This study aims at applying a structured uncertainty analysis to a

  12. International target values 2010 for achievable measurement uncertainties in nuclear material accountancy

    Energy Technology Data Exchange (ETDEWEB)

    Dias, Fabio C., E-mail: fabio@ird.gov.b [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil); Almeida, Silvio G. de; Renha Junior, Geraldo, E-mail: silvio@abacc.org.b, E-mail: grenha@abacc.org.b [Agencia Brasileiro-Argentina de Contabilidade e Controle de Materiais Nucleares (ABACC), Rio de Janeiro, RJ (Brazil)

    2011-07-01

    The International Target Values (ITVs) are reasonable uncertainty estimates that can be used in judging the reliability of measurement techniques applied to industrial nuclear and fissile materials subject to accountancy and/or safeguards verification. In the absence of relevant experimental estimates, ITVs can also be used to select measurement techniques and calculate sample population during the planning phase of verification activities. It is important to note that ITVs represent estimates of the 'state-of-the-practice', which should be achievable under routine measurement conditions affecting both facility operators and safeguards inspectors, not only in the field, but also in laboratory. Tabulated values cover measurement methods used for the determination of volume or mass of the nuclear material, for its elemental and isotopic assays, and for its sampling. The 2010 edition represents the sixth revision of the International Target Values (ITVs), issued by the International Atomic Energy Agency (IAEA) as a Safeguards Technical Report (STR-368). The first version was released as 'Target Values' in 1979 by the Working Group on Techniques and Standards for Destructive Analysis (WGDA) of the European Safeguards Research and Development Association (ESARDA) and focused on destructive analytical methods. In the latest 2010 revision, international standards in estimating and expressing uncertainties have been considered while maintaining a format that allows comparison with the previous editions of the ITVs. Those standards have been usually applied in QC/QA programmes, as well as qualification of methods, techniques and instruments. Representatives of the Brazilian Nuclear Energy Commission (CNEN) and the Brazilian-Argentine Agency for Accounting and Control of Nuclear Materials (ABACC) participated in previous Consultants Group Meetings since the one convened to establish the first list of ITVs released in 1993 and in subsequent revisions

  13. International target values 2010 for achievable measurement uncertainties in nuclear material accountancy

    International Nuclear Information System (INIS)

    Dias, Fabio C.; Almeida, Silvio G. de; Renha Junior, Geraldo

    2011-01-01

    The International Target Values (ITVs) are reasonable uncertainty estimates that can be used in judging the reliability of measurement techniques applied to industrial nuclear and fissile materials subject to accountancy and/or safeguards verification. In the absence of relevant experimental estimates, ITVs can also be used to select measurement techniques and calculate sample population during the planning phase of verification activities. It is important to note that ITVs represent estimates of the 'state-of-the-practice', which should be achievable under routine measurement conditions affecting both facility operators and safeguards inspectors, not only in the field, but also in laboratory. Tabulated values cover measurement methods used for the determination of volume or mass of the nuclear material, for its elemental and isotopic assays, and for its sampling. The 2010 edition represents the sixth revision of the International Target Values (ITVs), issued by the International Atomic Energy Agency (IAEA) as a Safeguards Technical Report (STR-368). The first version was released as 'Target Values' in 1979 by the Working Group on Techniques and Standards for Destructive Analysis (WGDA) of the European Safeguards Research and Development Association (ESARDA) and focused on destructive analytical methods. In the latest 2010 revision, international standards in estimating and expressing uncertainties have been considered while maintaining a format that allows comparison with the previous editions of the ITVs. Those standards have been usually applied in QC/QA programmes, as well as qualification of methods, techniques and instruments. Representatives of the Brazilian Nuclear Energy Commission (CNEN) and the Brazilian-Argentine Agency for Accounting and Control of Nuclear Materials (ABACC) participated in previous Consultants Group Meetings since the one convened to establish the first list of ITVs released in 1993 and in subsequent revisions, including the latest one

  14. Atmospheric CO2 inversions on the mesoscale using data-driven prior uncertainties: quantification of the European terrestrial CO2 fluxes

    Science.gov (United States)

    Kountouris, Panagiotis; Gerbig, Christoph; Rödenbeck, Christian; Karstens, Ute; Koch, Thomas F.; Heimann, Martin

    2018-03-01

    Optimized biogenic carbon fluxes for Europe were estimated from high-resolution regional-scale inversions, utilizing atmospheric CO2 measurements at 16 stations for the year 2007. Additional sensitivity tests with different data-driven error structures were performed. As the atmospheric network is rather sparse and consequently contains large spatial gaps, we use a priori biospheric fluxes to further constrain the inversions. The biospheric fluxes were simulated by the Vegetation Photosynthesis and Respiration Model (VPRM) at a resolution of 0.1° and optimized against eddy covariance data. Overall we estimate an a priori uncertainty of 0.54 GtC yr-1 related to the poor spatial representation between the biospheric model and the ecosystem sites. The sink estimated from the atmospheric inversions for the area of Europe (as represented in the model domain) ranges between 0.23 and 0.38 GtC yr-1 (0.39 and 0.71 GtC yr-1 up-scaled to geographical Europe). This is within the range of posterior flux uncertainty estimates of previous studies using ground-based observations.
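
    The linear-Gaussian (Bayesian least-squares) update underlying such flux inversions can be sketched compactly. The toy example below uses an invented footprint matrix, prior covariance and observation error, and is not the actual regional inversion system used in the study.

        # Minimal sketch of a linear-Gaussian flux inversion: posterior fluxes from a
        # prior (with data-driven prior covariance B), observations y, observation
        # error R and a transport/footprint matrix H. Toy dimensions and values only.
        import numpy as np

        rng = np.random.default_rng(3)
        n_flux, n_obs = 20, 8

        x_prior = np.full(n_flux, -0.02)                  # prior flux per region (GtC/yr, illustrative)
        dist = np.abs(np.subtract.outer(np.arange(n_flux), np.arange(n_flux)))
        B = 0.01**2 * np.exp(-dist / 3.0)                 # spatially correlated prior errors
        H = rng.random((n_obs, n_flux)) * 0.5             # toy footprints (obs sensitivity to fluxes)
        R = (0.002**2) * np.eye(n_obs)                    # observation-error covariance
        y = H @ (x_prior - 0.01) + rng.multivariate_normal(np.zeros(n_obs), R)  # synthetic obs

        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)      # gain matrix
        x_post = x_prior + K @ (y - H @ x_prior)
        P_post = (np.eye(n_flux) - K @ H) @ B             # posterior covariance

        print(f"prior total sink : {x_prior.sum():+.3f}")
        print(f"posterior total  : {x_post.sum():+.3f} +/- {np.sqrt(P_post.sum()):.3f}")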

  15. Lattice Boltzmann equation calculation of internal, pressure-driven turbulent flow

    International Nuclear Information System (INIS)

    Hammond, L A; Halliday, I; Care, C M; Stevens, A

    2002-01-01

    We describe a mixing-length extension of the lattice Boltzmann approach to the simulation of an incompressible liquid in turbulent flow. The method uses a simple, adaptable closure algorithm to bound the lattice Boltzmann fluid incorporating a law-of-the-wall. The test application, an internal, pressure-driven, smooth duct flow, recovers correct velocity profiles for Reynolds numbers up to 1.25 x 10^5. In addition, the Reynolds number dependence of the friction factor in the smooth-wall branch of the Moody chart is correctly recovered. The method promises a straightforward extension to other curves of the Moody chart and to cylindrical pipe flow
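
    For reference, the smooth-wall branch of the Moody chart against which such simulations are checked can be evaluated from the Prandtl-von Karman relation 1/sqrt(f) = 2 log10(Re sqrt(f)) - 0.8, with the Blasius fit as a quick cross-check. The sketch below is the textbook correlation, not the lattice Boltzmann code.

        # Minimal sketch of the smooth-pipe branch of the Moody chart: the implicit
        # Prandtl-von Karman relation solved by fixed-point iteration, with the
        # Blasius fit as a cross-check.
        import numpy as np

        def smooth_pipe_friction(re, n_iter=50):
            f = 0.02                                   # initial guess for Darcy friction factor
            for _ in range(n_iter):
                f = (2.0 * np.log10(re * np.sqrt(f)) - 0.8) ** -2
            return f

        for re in [5e3, 2e4, 1.25e5]:
            f_pk = smooth_pipe_friction(re)
            f_blasius = 0.316 * re ** -0.25            # Blasius, valid up to Re ~ 1e5
            print(f"Re = {re:8.0f}:  f(Prandtl) = {f_pk:.4f}   f(Blasius) = {f_blasius:.4f}")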

  16. Proceedings of the international symposium on future of accelerator-driven system

    International Nuclear Information System (INIS)

    Sugawara, Takanori

    2012-11-01

    The International Symposium on “Future of Accelerator-Driven System” was held on 29 February 2012 at Gakushi-Kaikan, Tokyo, Japan, hosted by the Nuclear Science and Engineering Directorate of JAEA (Japan Atomic Energy Agency) and the J-PARC (Japan Proton Accelerator Research Complex) Center. The objectives of the symposium were to acquaint participants with the current status and future plans for research and development of ADS worldwide and to discuss international collaboration on ADS and P&T (Partitioning and Transmutation) technology. About 100 scientists from Belgium, China, France, India, Italy, Japan, Korea and Mongolia participated in the symposium. In the morning session, current R&D activities on ADS in Japan were reported. In the afternoon session, current R&D activities were reported from China, Korea, India, Belgium and the EU. A panel discussion on international collaboration for ADS took place in the final session. Two keynote speakers presented their outlooks on the topics, which were then discussed by seven panelists and the audience. (author)

  17. Decay heat uncertainty quantification of MYRRHA

    OpenAIRE

    Fiorito Luca; Buss Oliver; Hoefer Axel; Stankovskiy Alexey; Eynde Gert Van den

    2017-01-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay hea...

  18. Sixth International Conference on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Kim, Y. S. (Editor); Solimento, S. (Editor)

    2000-01-01

    These proceedings contain contributions from about 200 participants to the 6th International Conference on Squeezed States and Uncertainty Relations (ICSSUR'99) held in Naples May 24-29, 1999, and organized jointly by the University of Naples "Federico II," the University of Maryland at College Park, and the Lebedev Institute, Moscow. This was the sixth of a series of very successful meetings started in 1990 at the College Park Campus of the University of Maryland. The other meetings in the series were held in Moscow (1992), Baltimore (1993), Taiyuan P.R.C. (1995) and Balatonfuered, Hungary (1997). The present one was held at the campus Monte Sant'Angelo of the University "Federico II" of Naples. The meeting sought to provide a forum for updating and reviewing a wide range of quantum optics disciplines, including device developments and applications, and related areas of quantum measurements and quantum noise. Over the years, the ICSSUR Conference evolved from a meeting on quantum measurement sector of quantum optics, to a wide range of quantum optics themes, including multifacet aspects of generation, measurement, and applications of nonclassical light (squeezed and Schrodinger cat radiation fields, etc.), and encompassing several related areas, ranging from quantum measurement to quantum noise. ICSSUR'99 brought together about 250 people active in the field of quantum optics, with special emphasis on nonclassical light sources and related areas. The Conference was organized in 8 Sections: Squeezed states and uncertainty relations; Harmonic oscillators and squeeze transformations; Methods of quantum interference and correlations; Quantum measurements; Generation and characterisation of non-classical light; Quantum noise; Quantum communication and information; and Quantum-like systems.

  19. Decay heat uncertainty quantification of MYRRHA

    Directory of Open Access Journals (Sweden)

    Fiorito Luca

    2017-01-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay heat. Radioactive decay data, independent fission yield and cross section uncertainties/covariances were propagated using two nuclear data sampling codes, namely NUDUNA and SANDY. According to the results, 238U cross sections and fission yield data are the largest contributors to the MYRRHA decay heat uncertainty. The calculated uncertainty values are deemed acceptable from the safety point of view as they are well within the available regulatory limits.
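
    The sampling idea can be sketched as follows (a toy illustration in the spirit of nuclear data sampling, not the NUDUNA or SANDY codes): perturb fission yields and decay energies according to an assumed covariance, recompute a simple decay-heat sum, and take the spread of the results as the propagated uncertainty. The nuclides, values and uncertainties below are invented.

        # Minimal sketch of nuclear-data uncertainty propagation by random sampling:
        # perturb fission yields and decay energies with an assumed covariance and
        # recompute a toy decay-heat sum. All nuclides and numbers are illustrative.
        import numpy as np

        rng = np.random.default_rng(4)
        t = 3600.0                                        # cooling time (s)
        half_life = np.array([300.0, 8.0e4, 2.0e6])       # s
        lam = np.log(2.0) / half_life
        yield_0  = np.array([0.06, 0.03, 0.01])           # inventory per fission (illustrative)
        energy_0 = np.array([1.2, 0.8, 0.4])              # MeV released per decay (illustrative)

        def decay_heat(y, e):
            return np.sum(y * lam * np.exp(-lam * t) * e)  # MeV/s per fission (arbitrary scale)

        rel_cov = np.diag([0.08, 0.10, 0.15, 0.03, 0.05, 0.06]) ** 2   # rel. variances (yields, energies)
        samples = rng.multivariate_normal(np.zeros(6), rel_cov, 5000)
        heats = np.array([decay_heat(yield_0 * (1 + s[:3]), energy_0 * (1 + s[3:])) for s in samples])

        print(f"decay heat = {decay_heat(yield_0, energy_0):.3e}  "
              f"relative uncertainty = {100 * heats.std() / heats.mean():.1f} %")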

  20. Operational Flexibility Responses to Environmental Uncertainties

    OpenAIRE

    Miller, Kent D.

    1994-01-01

    This study develops and tests a behavioral model of organizational changes in operational flexibility. Regression results using an international data set provide strong support for the general proposition that uncertainties associated with different environmental components--political, government policy, macroeconomic, competitive, input and product demand uncertainties--have different implications for firm internal, locational, and supplier flexibility. Slack acts as a buffer attenuating, a...

  1. Optimisation of internal contamination monitoring programme by integration of uncertainties

    International Nuclear Information System (INIS)

    Davesne, E.; Casanova, P.; Chojnacki, E.; Paquet, F.; Blanchardon, E.

    2011-01-01

    Potential internal contamination of workers is monitored by periodic bioassay measurements interpreted in terms of intake and committed effective dose by the use of biokinetic and dosimetric models. After a prospective evaluation of exposure at a workplace, a suitable monitoring programme can be defined by choosing adequate measurement techniques and frequency. In this study, the sensitivity of a programme is evaluated by the minimum intake and dose, which may be detected with a given level of confidence by taking into account uncertainties on exposure conditions and measurements. This is made for programme optimisation, which is performed by comparing the sensitivities of different alternative programmes. These methods were applied at the AREVA NC reprocessing plant and support the current monitoring programme as the best compromise between the cost of the measurements and the sensitivity of the programme. (authors)

  2. Managing uncertainty in adaptation | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-03-01

    Along with methodological issues that are common elsewhere, researchers in Africa also face a lack of solid basic information, such as historical climate data or reliable census data. But experience from participatory action research in Africa suggests that scientific uncertainty is not the main obstacle to ...

  3. 46 CFR 32.50-35 - Remote manual shutdown for internal combustion engine driven cargo pump on tank vessels-TB/ALL.

    Science.gov (United States)

    2010-10-01

    § 32.50-35 Remote manual shutdown for internal combustion engine driven cargo pump on tank vessels—TB/ALL. (a) Any tank vessel which is equipped with an internal combustion engine...

  4. Pupil-linked arousal is driven by decision uncertainty and alters serial choice bias

    Science.gov (United States)

    Urai, Anne E.; Braun, Anke; Donner, Tobias H.

    2017-03-01

    While judging their sensory environments, decision-makers seem to use the uncertainty about their choices to guide adjustments of their subsequent behaviour. One possible source of these behavioural adjustments is arousal: decision uncertainty might drive the brain's arousal systems, which control global brain state and might thereby shape subsequent decision-making. Here, we measure pupil diameter, a proxy for central arousal state, in human observers performing a perceptual choice task of varying difficulty. Pupil dilation, after choice but before external feedback, reflects three hallmark signatures of decision uncertainty derived from a computational model. This increase in pupil-linked arousal boosts observers' tendency to alternate their choice on the subsequent trial. We conclude that decision uncertainty drives rapid changes in pupil-linked arousal state, which shape the serial correlation structure of ongoing choice behaviour.

  5. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    Science.gov (United States)

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding the processes involved in recognizing uncertainty, strategies and outcomes of managing uncertainty, and influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  6. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Organization for Standardization (ISO) and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...

  7. Quantifying uncertainty due to internal variability using high-resolution regional climate model simulations

    Science.gov (United States)

    Gutmann, E. D.; Ikeda, K.; Deser, C.; Rasmussen, R.; Clark, M. P.; Arnold, J. R.

    2015-12-01

    The uncertainty in future climate predictions is as large or larger than the mean climate change signal. As such, any predictions of future climate need to incorporate and quantify the sources of this uncertainty. One of the largest sources comes from the internal, chaotic, variability within the climate system itself. This variability has been approximated using the 30 ensemble members of the Community Earth System Model (CESM) large ensemble. Here we examine the wet and dry end members of this ensemble for cool-season precipitation in the Colorado Rocky Mountains with a set of high-resolution regional climate model simulations. We have used the Weather Research and Forecasting model (WRF) to simulate the periods 1990-2000, 2025-2035, and 2070-2080 on a 4km grid. These simulations show that the broad patterns of change depicted in CESM are inherited by the high-resolution simulations; however, the differences in the height and location of the mountains in the WRF simulation, relative to the CESM simulation, means that the location and magnitude of the precipitation changes are very different. We further show that high-resolution simulations with the Intermediate Complexity Atmospheric Research model (ICAR) predict a similar spatial pattern in the change signal as WRF for these ensemble members. We then use ICAR to examine the rest of the CESM Large Ensemble as well as the uncertainty in the regional climate model due to the choice of physics parameterizations.

  8. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general publication, known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from the actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration

  9. Uncertainty analysis comes to integrated assessment models for climate change…and conversely

    NARCIS (Netherlands)

    Cooke, R.M.

    2012-01-01

    This article traces the development of uncertainty analysis through three generations punctuated by large methodology investments in the nuclear sector. Driven by a very high perceived legitimation burden, these investments aimed at strengthening the scientific basis of uncertainty quantification.

  10. Pupil-linked arousal is driven by decision uncertainty and alters serial choice bias

    NARCIS (Netherlands)

    Urai, A.E.; Braun, A.; Donner, T.H.

    2017-01-01

    While judging their sensory environments, decision-makers seem to use the uncertainty about their choices to guide adjustments of their subsequent behaviour. One possible source of these behavioural adjustments is arousal: decision uncertainty might drive the brain's arousal systems, which control

  11. Routine internal- and external-quality control data in clinical laboratories for estimating measurement and diagnostic uncertainty using GUM principles.

    Science.gov (United States)

    Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar

    2012-05-01

    Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This caters for important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. The measurement as well as diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and uncertainty of the bias component according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, both expressed in absolute units (μmol/L) and relative (%). An expanded measurement uncertainty of 12 μmol/L associated with concentrations of creatinine below 120 μmol/L and of 10% associated with concentrations above 120 μmol/L was estimated. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. This diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L associated with concentrations of creatinine below 100 μmol/L and 14% associated with concentrations above 100 μmol/L.
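
    A minimal sketch of the two-component evaluation described above follows, together with the diagnostic uncertainty of a difference between two results from the same patient once within-subject biological variation is added. The percentages are illustrative placeholders, not the creatinine figures of the paper.

        # Minimal sketch of a Nordtest-style two-component evaluation: combined MU from
        # within-laboratory reproducibility and a bias component, plus the diagnostic
        # uncertainty for the difference of two results from the same patient.
        # All numbers are illustrative, not the paper's creatinine data.
        import numpy as np

        u_rw_pct   = 2.5          # within-lab reproducibility from IQC, %
        bias_pct   = 1.5          # average bias vs EQA / reference, %
        u_cref_pct = 1.0          # uncertainty of the reference (assigned) value, %

        u_bias_pct = np.hypot(bias_pct, u_cref_pct)          # bias component, %
        u_meas_pct = np.hypot(u_rw_pct, u_bias_pct)          # combined measurement uncertainty, %
        U_meas_pct = 2.0 * u_meas_pct                        # expanded uncertainty (k = 2), %

        cv_within_subject_pct = 4.5                          # biological variation, % (illustrative)
        u_diag_single_pct = np.hypot(u_meas_pct, cv_within_subject_pct)
        u_diag_diff_pct   = np.sqrt(2.0) * u_diag_single_pct # for the difference of two samples

        print(f"U(measurement, k=2) = {U_meas_pct:.1f} %")
        print(f"u(diagnostic, single result) = {u_diag_single_pct:.1f} %")
        print(f"u(diagnostic, difference)    = {u_diag_diff_pct:.1f} %")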

  12. Regime-dependent forecast uncertainty of convective precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Keil, Christian; Craig, George C. [Muenchen Univ. (Germany). Meteorologisches Inst.

    2011-04-15

    Forecast uncertainty of convective precipitation is influenced by all scales, but in different ways in different meteorological situations. Forecasts of the high resolution ensemble prediction system COSMO-DE-EPS of Deutscher Wetterdienst (DWD) are used to examine the dominant sources of uncertainty of convective precipitation. A validation with radar data using traditional as well as spatial verification measures highlights differences in precipitation forecast performance in differing weather regimes. When the forecast uncertainty can primarily be associated with local, small-scale processes individual members run with the same variation of the physical parameterisation driven by different global models outperform all other ensemble members. In contrast when the precipitation is governed by the large-scale flow all ensemble members perform similarly. Application of the convective adjustment time scale confirms this separation and shows a regime-dependent forecast uncertainty of convective precipitation. (orig.)

  13. International target values 2000 for measurement uncertainties in safeguarding nuclear materials

    International Nuclear Information System (INIS)

    Aigner, H.; Binner, R.; Kuhn, E.

    2001-01-01

    The IAEA has prepared a revised and updated version of International Target Values (ITVs) for uncertainty components in measurements of nuclear material. The ITVs represent uncertainties to be considered in judging the reliability of analytical techniques applied to industrial nuclear and fissile material subject to safeguards verification. The tabulated values represent estimates of the 'state of the practice' which ought to be achievable under routine conditions by adequately equipped, experienced laboratories. The ITVs 2000 are intended to be used by plant operators and safeguards organizations as a reference of the quality of measurements achievable in nuclear material accountancy, and for planning purposes. The IAEA prepared a draft of a technical report presenting the proposed ITVs 2000, and in April 2000 the chairmen or officers of the panels or organizations listed below were invited to co-author the report and to submit the draft to a discussion by their panels and organizations. Euratom Safeguards Inspectorate, ESARDA Working Group on Destructive Analysis, ESARDA Working Group on Non Destructive Analysis, Institute of Nuclear Materials Management, Japanese Expert Group on ITV-2000, ISO Working Group on Analyses in Spent Fuel Reprocessing, ISO Working Group on Analyses in Uranium Fuel Fabrication, ISO Working Group on Analyses in MOX Fuel Fabrication, Agencia Brasileno-Argentina de Contabilidad y Control de Materiales Nucleares (ABACC). Comments from the above groups were received and incorporated into the final version of the document, completed in April 2001. The ITVs 2000 represent target standard uncertainties, expressing the precision achievable under stipulated conditions. These conditions typically fall into one of the two following categories: 'repeatability conditions' normally encountered during the measurements done within one inspection period; or 'reproducibility conditions' involving additional sources of measurement variability such as

  14. Robustness of dynamic systems with parameter uncertainties

    CERN Document Server

    Balemi, S; Truöl, W

    1992-01-01

    Robust Control is one of the fastest growing and promising areas of research today. In many practical systems there exist uncertainties which have to be considered in the analysis and design of control systems. In the last decade methods were developed for dealing with dynamic systems with unstructured uncertainties such as H∞- and l1-optimal control. For systems with parameter uncertainties, the seminal paper of V. L. Kharitonov has triggered a large amount of very promising research. An international workshop dealing with all aspects of robust control was successfully organized by S. P. Bhattacharyya and L. H. Keel in San Antonio, Texas, USA in March 1991. We organized the second international workshop in this area in Ascona, Switzerland in April 1992. However, this second workshop was restricted to robust control of dynamic systems with parameter uncertainties with the objective to concentrate on some aspects of robust control. This book contains a collection of papers presented at the International W...

  15. An approach to routine individual internal dose monitoring at the object 'Shelter' personnel considering uncertainties

    International Nuclear Information System (INIS)

    Mel'nichuk, D.V.; Bondarenko, O.O.; Medvedjev, S.Yu.

    2002-01-01

    An approach to organising routine individual internal dose monitoring of the Object 'Shelter' personnel that takes individualised uncertainties into account is presented. Two methods of effective dose assessment based on bioassay are considered in this context: (1) the traditional indirect method, in which results of workplace monitoring are not taken into account, and (2) a combined method, in which both the results of bioassay measurements and workplace monitoring are considered

  16. The Effects of Data-Driven Learning upon Vocabulary Acquisition for Secondary International School Students in Vietnam

    Science.gov (United States)

    Karras, Jacob Nolen

    2016-01-01

    Within the field of computer assisted language learning (CALL), scant literature exists regarding the effectiveness and practicality for secondary students to utilize data-driven learning (DDL) for vocabulary acquisition. In this study, there were 100 participants, who had a mean age of thirteen years, and were attending an international school in…

  17. On uncertainty in information and ignorance in knowledge

    Science.gov (United States)

    Ayyub, Bilal M.

    2010-05-01

    This paper provides an overview of working definitions of knowledge, ignorance, information and uncertainty and summarises formalised philosophical and mathematical framework for their analyses. It provides a comparative examination of the generalised information theory and the generalised theory of uncertainty. It summarises foundational bases for assessing the reliability of knowledge constructed as a collective set of justified true beliefs. It discusses system complexity for ancestor simulation potentials. It offers value-driven communication means of knowledge and contrarian knowledge using memes and memetics.

  18. Drivers And Uncertainties Of Increasing Global Water Scarcity

    Science.gov (United States)

    Scherer, L.; Pfister, S.

    2015-12-01

    Water scarcity threatens ecosystems and human health and hampers economic development. It generally depends on the ratio of water consumption to availability. We calculated global, spatially explicit water stress indices (WSIs) which describe the vulnerability to additional water consumption on a scale from 0 (low) to 1 (high) and compare them for the decades 1981-1990 and 2001-2010. Input data are obtained from a multi-model ensemble at a resolution of 0.5 degrees. The variability among the models was used to run 1000 Monte Carlo simulations (latin hypercube sampling) and to subsequently estimate uncertainties of the WSIs. Globally, a trend of increasing water scarcity can be observed, however, uncertainties are large. The probability that this trend is actually occurring is as low as 53%. The increase in WSIs is rather driven by higher water use than lower water availability. Water availability is only 40% likely to decrease whereas water consumption is 67% likely to increase. Independent from the trend, we are already living under water scarce conditions, which is reflected in a consumption-weighted average of monthly WSIs of 0.51 in the recent decade. Its coefficient of variation points with 0.8 to the high uncertainties entailed, which might still hide poor model performance where all models consistently over- or underestimate water availability or use. Especially in arid areas, models generally overestimate availability. Although we do not traverse the planetary boundary of freshwater use as global water availability is sufficient, local water scarcity might be high. Therefore the regionalized assessment of WSIs under uncertainty helps to focus on specific regions to optimise water consumption. These global results can also help to raise awareness of water scarcity, and to suggest relevant measures such as more water efficient technologies to international companies, which have to deal with complex and distributed supply chains (e.g. in food production).
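
    The propagation step can be sketched as follows: Latin hypercube samples of consumption and availability (with spreads standing in for the model-ensemble variability) are mapped to a 0-1 stress index through a simple logistic curve. Both the logistic mapping and all numbers are illustrative; they are not the WSI definition or data used in the study.

        # Minimal sketch of the uncertainty propagation described above: Latin hypercube
        # samples of water consumption and availability are mapped to a 0-1 stress index
        # via a simple logistic curve. The mapping and all numbers are illustrative.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(5)
        n = 1000

        def lhs_uniform(n, rng):
            """One stratified uniform draw per equal-probability bin, in random order."""
            return (rng.permutation(n) + rng.random(n)) / n

        # lognormal consumption and availability for one cell/month (km^3, illustrative)
        cons  = np.exp(np.log(0.9) + 0.20 * norm.ppf(lhs_uniform(n, rng)))
        avail = np.exp(np.log(1.5) + 0.35 * norm.ppf(lhs_uniform(n, rng)))

        ratio = cons / avail
        wsi = 1.0 / (1.0 + np.exp(-8.0 * (ratio - 0.4)))   # logistic mapping of ratio to [0, 1]

        print(f"median WSI {np.median(wsi):.2f}, 95% interval "
              f"[{np.percentile(wsi, 2.5):.2f}, {np.percentile(wsi, 97.5):.2f}], "
              f"CV {wsi.std() / wsi.mean():.2f}")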

  19. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  20. The Uncertainty estimation of Alanine/ESR dosimetry

    International Nuclear Information System (INIS)

    Kim, Bo Rum; An, Jin Hee; Choi, Hoon; Kim, Young Ki

    2008-01-01

    Machinery, tools and cables in a nuclear power plant operate in a very severe environment. Measurement of the actual dose is needed in order to extend the life expectancy of this machinery, tooling and cabling. We therefore estimated the gamma-ray dose at Wolsong nuclear power plant unit 1 over three years using a dose estimation technique based on ESR (Electron Spin Resonance) measurements and regression analysis. We estimated the uncertainty in order to secure the reliability of the results, since the uncertainty estimate makes it possible to judge the reliability of the measurement results. The uncertainty estimation followed the internationally unified guide, the GUM (Guide to the Expression of Uncertainty in Measurement), published by the International Organization for Standardization (ISO) in 1993. In this study the uncertainties of the e-scan and EMX ESR instruments were evaluated and compared. Based on these results, the reliability of the measurements will be improved.

  1. ICG: a wiki-driven knowledgebase of internal control genes for RT-qPCR normalization.

    Science.gov (United States)

    Sang, Jian; Wang, Zhennan; Li, Man; Cao, Jiabao; Niu, Guangyi; Xia, Lin; Zou, Dong; Wang, Fan; Xu, Xingjian; Han, Xiaojiao; Fan, Jinqi; Yang, Ye; Zuo, Wanzhu; Zhang, Yang; Zhao, Wenming; Bao, Yiming; Xiao, Jingfa; Hu, Songnian; Hao, Lili; Zhang, Zhang

    2018-01-04

    Real-time quantitative PCR (RT-qPCR) has become a widely used method for accurate expression profiling of targeted mRNA and ncRNA. Selection of appropriate internal control genes for RT-qPCR normalization is an elementary prerequisite for reliable expression measurement. Here, we present ICG (http://icg.big.ac.cn), a wiki-driven knowledgebase for community curation of experimentally validated internal control genes as well as their associated experimental conditions. Unlike extant related databases that focus on qPCR primers in model organisms (mainly human and mouse), ICG features harnessing collective intelligence in community integration of internal control genes for a variety of species. Specifically, it integrates a comprehensive collection of more than 750 internal control genes for 73 animals, 115 plants, 12 fungi and 9 bacteria, and incorporates detailed information on recommended application scenarios corresponding to specific experimental conditions, which, collectively, are of great help for researchers to adopt appropriate internal control genes for their own experiments. Taken together, ICG serves as a publicly editable and open-content encyclopaedia of internal control genes and accordingly bears broad utility for reliable RT-qPCR normalization and gene expression characterization in both model and non-model organisms. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Laser-driven nuclear-polarized hydrogen internal gas target

    International Nuclear Information System (INIS)

    Seely, J.; Crawford, C.; Clasie, B.; Xu, W.; Dutta, D.; Gao, H.

    2006-01-01

    We report the performance of a laser-driven polarized internal hydrogen gas target (LDT) in a configuration similar to that used in scattering experiments. This target used the technique of spin-exchange optical pumping to produce nuclear spin polarized hydrogen gas that was fed into a cylindrical storage (target) cell. We present in this paper the performance of the target, methods that were tried to improve the figure-of-merit (FOM) of the target, and a Monte Carlo simulation of spin-exchange optical pumping. The dimensions of the apparatus were optimized using the simulation and the experimental results were in good agreement with the results from the simulation. The best experimental result achieved was at a hydrogen flow rate of 1.1x10^18 atoms/s, where the sample beam exiting the storage cell had 58.2% degree of dissociation and 50.5% polarization. Based on this measurement, the atomic fraction in the storage cell was 49.6% and the density averaged nuclear polarization was 25.0%. This represents the highest FOM for hydrogen from an LDT and is higher than the best FOM reported by atomic beam sources that used storage cells.

  3. Uncertainty in predictions of oil spill trajectories in a coastal zone

    Science.gov (United States)

    Sebastião, P.; Guedes Soares, C.

    2006-12-01

    A method is introduced to determine the uncertainties in the predictions of oil spill trajectories using a classic oil spill model. The method considers the output of the oil spill model as a function of random variables, which are the input parameters, and calculates the standard deviation of the output results, which provides a measure of the uncertainty of the model as a result of the uncertainties of the input parameters. In addition to a single trajectory that is calculated by the oil spill model using the mean values of the parameters, a band of trajectories can be defined when various simulations are carried out taking into account the uncertainties of the input parameters. This band of trajectories defines envelopes of the trajectories that are likely to be followed by the spill given the uncertainties of the input. The method was applied to an oil spill that occurred in 1989 near Sines on the southwestern coast of Portugal. The model represented well the distinction between a wind-driven part that remained offshore and a tide-driven part that went ashore. For both parts, the method defined two trajectory envelopes, one calculated exclusively with the wind fields and the other using wind and tidal currents. In both cases a reasonable approximation to the observed results was obtained. The envelope of likely trajectories that is obtained with the uncertainty modelling proved to give a better interpretation of the trajectories that were simulated by the oil spill model.
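
    The approach in this record treats the trajectory model output as a function of random input parameters and derives an envelope of likely trajectories from repeated runs with perturbed inputs. The sketch below illustrates the idea with a toy advection model; the drift factor, perturbation magnitudes and the percentile envelope are assumptions for illustration only, not the authors' oil spill model.

```python
import numpy as np

def advect(wind_u, wind_v, cur_u, cur_v, hours, drift=0.03, dt=1.0):
    """Toy trajectory model: the slick drifts at a fraction of the wind speed
    plus the full current speed; positions are returned in km."""
    x = np.zeros(hours + 1); y = np.zeros(hours + 1)
    for t in range(hours):
        x[t + 1] = x[t] + (drift * wind_u[t] + cur_u[t]) * dt * 3.6  # m/s over 1 h -> km
        y[t + 1] = y[t] + (drift * wind_v[t] + cur_v[t]) * dt * 3.6
    return x, y

def trajectory_envelope(wind_u, wind_v, cur_u, cur_v, hours,
                        wind_sd=1.0, cur_sd=0.05, n_runs=200, seed=1):
    """Monte Carlo perturbation of the forcing fields; returns the 5th and
    95th percentile bands of the simulated positions at each time step."""
    rng = np.random.default_rng(seed)
    xs, ys = [], []
    for _ in range(n_runs):
        wu = wind_u + rng.normal(0.0, wind_sd, hours)
        wv = wind_v + rng.normal(0.0, wind_sd, hours)
        cu = cur_u + rng.normal(0.0, cur_sd, hours)
        cv = cur_v + rng.normal(0.0, cur_sd, hours)
        x, y = advect(wu, wv, cu, cv, hours)
        xs.append(x); ys.append(y)
    xs, ys = np.array(xs), np.array(ys)
    return np.percentile(xs, [5, 95], axis=0), np.percentile(ys, [5, 95], axis=0)

hours = 48
wind_u = np.full(hours, 5.0); wind_v = np.full(hours, -2.0)   # m/s, constant for the example
cur_u = np.full(hours, 0.1);  cur_v = np.full(hours, 0.2)     # m/s
x_band, y_band = trajectory_envelope(wind_u, wind_v, cur_u, cur_v, hours)
```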

  4. DREISS: Using State-Space Models to Infer the Dynamics of Gene Expression Driven by External and Internal Regulatory Networks

    Science.gov (United States)

    Gerstein, Mark

    2016-01-01

    Gene expression is controlled by the combinatorial effects of regulatory factors from different biological subsystems such as general transcription factors (TFs), cellular growth factors and microRNAs. A subsystem’s gene expression may be controlled by its internal regulatory factors, exclusively, or by external subsystems, or by both. It is thus useful to distinguish the degree to which a subsystem is regulated internally or externally–e.g., how non-conserved, species-specific TFs affect the expression of conserved, cross-species genes during evolution. We developed a computational method (DREISS, dreiss.gerteinlab.org) for analyzing the Dynamics of gene expression driven by Regulatory networks, both External and Internal based on State Space models. Given a subsystem, the “state” and “control” in the model refer to its own (internal) and another subsystem’s (external) gene expression levels. The state at a given time is determined by the state and control at a previous time. Because typical time-series data do not have enough samples to fully estimate the model’s parameters, DREISS uses dimensionality reduction, and identifies canonical temporal expression trajectories (e.g., degradation, growth and oscillation) representing the regulatory effects emanating from various subsystems. To demonstrate capabilities of DREISS, we study the regulatory effects of evolutionarily conserved vs. divergent TFs across distant species. In particular, we applied DREISS to the time-series gene expression datasets of C. elegans and D. melanogaster during their embryonic development. We analyzed the expression dynamics of the conserved, orthologous genes (orthologs), seeing the degree to which these can be accounted for by orthologous (internal) versus species-specific (external) TFs. We found that between two species, the orthologs have matched, internally driven expression patterns but very different externally driven ones. This is particularly true for genes with
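
    DREISS models a subsystem's expression x_t as a linear state-space system driven by an external subsystem u_t, with dimensionality reduction used to make the fit tractable. The following is a minimal sketch of that idea, assuming a plain SVD-plus-least-squares fit of x_{t+1} = A x_t + B u_t; the actual DREISS estimation procedure may differ in its details.

```python
import numpy as np

def fit_state_space(X, U, n_components=3):
    """Fit s_{t+1} = A s_t + B u_t in a reduced state space.

    X : (T, n_genes) internal-subsystem expression over T time points
    U : (T, n_factors) external-subsystem expression (the 'control')
    """
    # dimensionality reduction of the internal subsystem via SVD
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T            # projection: genes -> reduced state
    S = Xc @ P                         # reduced states, shape (T, k)

    # least-squares estimate of [A | B] from stacked regressors [s_t, u_t]
    Z = np.hstack([S[:-1], U[:-1]])    # (T-1, k + n_factors)
    Y = S[1:]                          # (T-1, k)
    coeffs, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    A = coeffs[:n_components].T        # (k, k) internal dynamics
    B = coeffs[n_components:].T        # (k, n_factors) external drive
    return A, B, P

# example with random data standing in for time-series expression matrices
rng = np.random.default_rng(0)
X = rng.normal(size=(12, 200))   # 12 time points, 200 orthologous genes
U = rng.normal(size=(12, 30))    # 30 species-specific TFs
A, B, P = fit_state_space(X, U)
```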

  5. DREISS: Using State-Space Models to Infer the Dynamics of Gene Expression Driven by External and Internal Regulatory Networks.

    Directory of Open Access Journals (Sweden)

    Daifeng Wang

    2016-10-01

    Full Text Available Gene expression is controlled by the combinatorial effects of regulatory factors from different biological subsystems such as general transcription factors (TFs), cellular growth factors and microRNAs. A subsystem's gene expression may be controlled by its internal regulatory factors, exclusively, or by external subsystems, or by both. It is thus useful to distinguish the degree to which a subsystem is regulated internally or externally, e.g., how non-conserved, species-specific TFs affect the expression of conserved, cross-species genes during evolution. We developed a computational method (DREISS, dreiss.gerteinlab.org) for analyzing the Dynamics of gene expression driven by Regulatory networks, both External and Internal based on State Space models. Given a subsystem, the "state" and "control" in the model refer to its own (internal) and another subsystem's (external) gene expression levels. The state at a given time is determined by the state and control at a previous time. Because typical time-series data do not have enough samples to fully estimate the model's parameters, DREISS uses dimensionality reduction, and identifies canonical temporal expression trajectories (e.g., degradation, growth and oscillation) representing the regulatory effects emanating from various subsystems. To demonstrate capabilities of DREISS, we study the regulatory effects of evolutionarily conserved vs. divergent TFs across distant species. In particular, we applied DREISS to the time-series gene expression datasets of C. elegans and D. melanogaster during their embryonic development. We analyzed the expression dynamics of the conserved, orthologous genes (orthologs), seeing the degree to which these can be accounted for by orthologous (internal) versus species-specific (external) TFs. We found that between two species, the orthologs have matched, internally driven expression patterns but very different externally driven ones. This is particularly true for genes with

  6. 2nd International Conference on Cable-Driven Parallel Robots

    CERN Document Server

    Bruckmann, Tobias

    2015-01-01

    This volume presents the outcome of the second forum on cable-driven parallel robots, bringing the cable robot community together. It shows the new ideas of the active researchers developing cable-driven robots. The book presents the state of the art, including both summarizing contributions and the latest research and future options. The book covers all topics that are essential for cable-driven robots: classification; kinematics, workspace and singularity analysis; statics and dynamics; cable modeling; control and calibration; design methodology; hardware development; experimental evaluation; and prototypes, application reports and new application concepts.

  7. Directed International Technological Change and Climate Policy: New Methods for Identifying Robust Policies Under Conditions of Deep Uncertainty

    Science.gov (United States)

    Molina-Perez, Edmundo

    It is widely recognized that international environmental technological change is key to reducing the rapidly rising greenhouse gas emissions of emerging nations. In 2010, the United Nations Framework Convention on Climate Change (UNFCCC) Conference of the Parties (COP) agreed to the creation of the Green Climate Fund (GCF). This new multilateral organization has been created with the collective contributions of COP members, and has been tasked with directing over USD 100 billion per year towards investments that can enhance the development and diffusion of clean energy technologies in both advanced and emerging nations (Helm and Pichler, 2015). The landmark agreement arrived at the COP 21 has reaffirmed the key role that the GCF plays in enabling climate mitigation, as it is now necessary to align large-scale climate financing efforts with the long-term goals agreed at Paris 2015. This study argues that, because of the incomplete understanding of the mechanics of international technological change, the multiplicity of policy options and ultimately the presence of deep uncertainty about climate and technological change, climate financing institutions such as the GCF require new analytical methods for designing long-term robust investment plans. Motivated by these challenges, this dissertation shows that the application of new analytical methods, such as Robust Decision Making (RDM) and Exploratory Modeling (Lempert, Popper and Bankes, 2003), to the study of international technological change and climate policy provides useful insights that can be used for designing a robust architecture of international technological cooperation for climate change mitigation. For this study I developed an exploratory dynamic integrated assessment model (EDIAM) which is used as the scenario generator in a large computational experiment. The scope of the experimental design considers an ample set of climate and technological scenarios. These scenarios combine five sources of uncertainty

  8. A hierarchical approach to multi-project planning under uncertainty

    NARCIS (Netherlands)

    Leus, R.; Wullink, Gerhard; Hans, Elias W.; Herroelen, W.

    2004-01-01

    We survey several viewpoints on the management of the planning complexity of multi-project organisations under uncertainty. A positioning framework is proposed to distinguish between different types of project-driven organisations, which is meant to aid project management in the choice between the

  9. A hierarchical approach to multi-project planning under uncertainty

    NARCIS (Netherlands)

    Hans, Elias W.; Herroelen, W.; Wullink, Gerhard; Leus, R.

    2007-01-01

    We survey several viewpoints on the management of the planning complexity of multi-project organisations under uncertainty. Based on these viewpoints we propose a positioning framework to distinguish between different types of project-driven organisations. This framework is meant to aid project

  10. Qualification and application of nuclear reactor accident analysis code with the capability of internal assessment of uncertainty; Qualificacao e aplicacao de codigo de acidentes de reatores nucleares com capacidade interna de avaliacao de incerteza

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Ronaldo Celem

    2001-10-15

    This thesis presents an independent qualification of the CIAU code ('Code with the capability of Internal Assessment of Uncertainty'), which is part of the process of internal uncertainty evaluation with a thermal-hydraulic system code on a realistic basis. This is done by combining the uncertainty methodology UMAE ('Uncertainty Methodology based on Accuracy Extrapolation') with the RELAP5/Mod3.2 code, which allows associating uncertainty band estimates with the results obtained by the realistic calculation of the code and meets the licensing requirements of safety analysis. The independent qualification is supported by simulations with RELAP5/Mod3.2 related to accident-condition tests of the LOBI experimental facility and to an event which occurred in the Angra 1 nuclear power plant, by comparison with measured results and by establishing uncertainty bands on the calculated time trends of safety parameters. These bands have indeed enveloped the measured trends. Results from this independent qualification of CIAU have made it possible to ascertain the adequate application of a systematic realistic code procedure to analyse accidents with uncertainties incorporated in the results, although there is an evident need to extend the uncertainty database. It has been verified that use of the code with this internal assessment of uncertainty is feasible in the design and licensing stages of an NPP. (author)

  11. Uncertainty, learning and international environmental policy coordination

    International Nuclear Information System (INIS)

    Ulph, A.; Maddison, D.

    1997-01-01

    In this paper we construct a simple model of global warming which captures a number of key features of the global warming problem: (1) environmental damages are related to the stock of greenhouse gases in the atmosphere; (2) the global commons nature of the problem means that there are strategic interactions between the emissions policies of the governments of individual nation states; (3) there is uncertainty about the extent of the future damages that will be incurred by each country from any given level of concentration of greenhouse gases, but there is the possibility that at a future date better information about the true extent of environmental damages may become available; an important aspect of the problem is the extent to which damages in different countries may be correlated. In the first part of the paper we consider a simple model with two symmetric countries and show that the value of perfect information is an increasing function of the correlation between damages in the two countries in both the cooperative and non-cooperative equilibria. However, while the value of perfect information is always non-negative in the cooperative equilibrium, in the non-cooperative equilibrium there is a critical value of the correlation coefficient below which the value of perfect information will be negative. In the second part of the paper we construct an empirical model of global warming distinguishing between OECD and non-OECD countries and show that in the non-cooperative equilibrium the value of perfect information for OECD countries is negative when the correlation coefficient between environmental damages for OECD and non-OECD countries is negative. The implications of these results for international agreements are discussed. 3 tabs., 26 refs

  12. Nuclear data sensitivity/uncertainty analysis for XT-ADS

    International Nuclear Information System (INIS)

    Sugawara, Takanori; Sarotto, Massimo; Stankovskiy, Alexey; Van den Eynde, Gert

    2011-01-01

    Highlights: → Sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. → The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. → When the target accuracy of 0.3%Δk for the criticality was considered, the uncertainties did not satisfy it. → To achieve this accuracy, the uncertainties should be improved by experiments under adequate conditions. - Abstract: The XT-ADS, an accelerator-driven system for an experimental demonstration, has been investigated in the framework of the IP EUROTRANS FP6 project. In this study, sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. The sensitivity analysis showed that the sensitivity coefficients differed significantly depending on the geometry models and calculation codes used. The uncertainty analysis confirmed that the uncertainties deduced from the covariance data varied significantly depending on the covariance data used: the uncertainties for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. When the target accuracy of 0.3%Δk for the criticality was considered, the uncertainties did not satisfy it. To achieve this accuracy, the uncertainties should be improved by experiments under adequate conditions.
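
    When sensitivity coefficients and a nuclear-data covariance matrix are both available, the uncertainty of an integral parameter such as k-eff is commonly obtained with the first-order "sandwich" formula, var(k)/k^2 = S^T C S, using relative sensitivities and relative covariances. The snippet below is a generic sketch of that propagation, not the specific code sequence used in the paper; the sensitivity vector and covariance matrix are placeholders.

```python
import numpy as np

def keff_relative_uncertainty(S, C):
    """First-order uncertainty propagation ('sandwich rule').

    S : (n,) relative sensitivity coefficients, (dk/k)/(dsigma/sigma)
    C : (n, n) relative covariance matrix of the nuclear data
    returns the relative standard uncertainty of k-eff
    """
    var_rel = S @ C @ S
    return float(np.sqrt(var_rel))

# placeholder example with three energy-group cross sections
S = np.array([0.30, 0.15, -0.05])
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-3]])
print(f"relative uncertainty in k-eff: {keff_relative_uncertainty(S, C):.4%}")
```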

  13. Uncertainty of slip measurements in a cutting system of converting machinery for diapers production

    Directory of Open Access Journals (Sweden)

    D’Aponte F.

    2015-01-01

    Full Text Available In this paper, slip measurements between the peripheral surfaces of a knife cylinder and a non-driven anvil cylinder are described for a high-velocity, high-quality cutting unit of a diaper production line. Laboratory tests have been carried out on a test bench with real-scale components in view of possible on-line application of the method. With reference to both starting and steady-state conditions, correlations with the process parameters have been found, achieving a very satisfactory reduction of the slip between the knife cylinder and the non-driven anvil cylinder. Accuracy evaluation of the measurements allowed us to validate the obtained information and to evaluate the detection threshold of the measurement method in the present configuration. The analysis of specific uncertainty contributions to the whole uncertainty could also be used to further reduce the requested uncertainty of the measurement method.

  14. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)

  15. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine whether a standard is adequate for its intended use or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined or is determined incorrectly. For situations meeting the stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing them into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in their Guide to the Expression of Measurement Uncertainty. Details of the simplified methods and examples of their use are included in the paper.
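
    The shortcut described here groups uncertainty components into absolute contributions (which add in quadrature directly) and relative contributions (which add in quadrature as fractions of the value), avoiding explicit partial derivatives. The sketch below shows one plausible form of that bookkeeping under the stated independence assumptions; the function name and example numbers are illustrative, not taken from the paper.

```python
import math

def combined_uncertainty(value, absolute_components, relative_components):
    """Root-sum-square combination of independent uncertainty components.

    value               : the measured or prepared quantity
    absolute_components : standard uncertainties in the units of value
    relative_components : standard uncertainties as fractions of value
    returns (absolute standard uncertainty, relative standard uncertainty)
    """
    u_abs_sq = sum(u ** 2 for u in absolute_components)
    u_rel_sq = sum(r ** 2 for r in relative_components)
    u_total = math.sqrt(u_abs_sq + (value ** 2) * u_rel_sq)
    return u_total, u_total / value

# example: a prepared standard of nominal mass 10.000 g with a 0.002 g weighing
# uncertainty plus 0.1% purity and 0.05% volumetric relative uncertainties
u, u_rel = combined_uncertainty(10.000, [0.002], [0.001, 0.0005])
print(f"u = {u:.4f} g  ({u_rel:.3%})")
```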

  16. Uncertainty analysis in the task of individual monitoring data

    International Nuclear Information System (INIS)

    Molokanov, A.; Badjin, V.; Gasteva, G.; Antipin, E.

    2003-01-01

    Assessment of internal doses is an essential component of individual monitoring programmes for workers and consists of two stages: individual monitoring measurements and interpretation of the monitoring data in terms of annual intake and/or annual internal dose. The overall uncertainty in the assessed dose is a combination of the uncertainties in these stages. An algorithm and a computer code were developed for estimating the uncertainty in the assessment of internal dose in the task of individual monitoring data interpretation. Two main influencing factors are analysed in this paper: the unknown time of the exposure and the variability of bioassay measurements. The aim of this analysis is to show that the algorithm is applicable in designing an individual monitoring programme for workers so as to guarantee that the individual dose calculated from individual monitoring measurements does not exceed a required limit with a certain confidence probability. (author)

  17. Management of internal communication in times of uncertainty; Gestion de la comunicacion interna en tiempos de incertidumbre

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez de la Gala, F.

    2014-07-01

    Garona has had strong media coverage since 2009. The continuity process is subject to great controversy, which has generated increased uncertainty for workers and their families and affected motivation. Although internal communication has sought to manage its effects on the structure of the company, the rate at which outside information spreads has made this a complex mission. The regulatory body has been interested in its potential impact on safety culture, marking a significant difference compared with other industrial sectors. (Author)

  18. Using internal discharge data in a distributed conceptual model to reduce uncertainty in streamflow simulations

    Science.gov (United States)

    Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.

    2011-12-01

    Distributed hydrological models are important tools in water management as they account for the spatial variability of the hydrological data, as well as being able to produce spatially distributed outputs. They can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem that such models force us to face, i.e. a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate/validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte Carlo simulations, and the region of space containing the parameter sets that were considered behavioral according to two different criteria was delimited using the geometric concept of alpha-shapes. The discharge data from five internal sub-basins were used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? Also, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
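
    The study samples the three-parameter space with Monte Carlo runs and keeps the parameter sets judged behavioral both at the outlet and at internal sub-basin gauges. Below is a generic sketch of that filtering step, assuming a Nash-Sutcliffe efficiency threshold as the behavioral criterion and a toy stand-in for the distributed model; the function run_wasmod, the bounds and the threshold are all hypothetical.

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency of a simulated series against observations."""
    sim, obs = np.asarray(sim), np.asarray(obs)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def run_wasmod(params, forcing):
    """Toy stand-in for the distributed rainfall-runoff model: scales the
    precipitation series; replace with the real model to use this sketch."""
    precip = np.asarray(forcing)
    outlet = params[0] * precip
    internal = [0.5 * params[0] * precip, 0.3 * params[0] * precip]
    return outlet, internal

def behavioral_sets(forcing, obs_outlet, obs_internal, n_samples=10000,
                    bounds=((0.0, 1.0), (0.0, 1.0), (0.0, 1.0)),
                    threshold=0.6, seed=42):
    """Monte Carlo sampling of the parameter space; a set is behavioral only if
    it meets the NSE threshold at the outlet and at every internal gauge."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds]); hi = np.array([b[1] for b in bounds])
    keep = []
    for _ in range(n_samples):
        p = lo + rng.random(len(bounds)) * (hi - lo)
        sim_outlet, sim_internal = run_wasmod(p, forcing)
        ok = nse(sim_outlet, obs_outlet) >= threshold and all(
            nse(s, o) >= threshold for s, o in zip(sim_internal, obs_internal))
        if ok:
            keep.append(p)
    return np.array(keep)
```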

  19. The Uncertainties of Risk Management

    DEFF Research Database (Denmark)

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged ... for expanding risk management. More generally, such uncertainties relate to the professional identities and responsibilities of operational managers as defined by the framing devices. Originality/value – The paper offers three contributions to the extant literature: first, it shows how risk management itself ...

  20. Development of Probabilistic Internal Dosimetry Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Siwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kwon, Tae-Eun [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of); Lee, Jai-Ki [Korean Association for Radiation Protection, Seoul (Korea, Republic of)

    2017-02-15

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated in the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs the Bayesian and Monte Carlo methods was designed. Based on this system, we developed a probabilistic internal-dose-assessment code using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g. the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various
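
    One common way to realise the Bayesian/Monte Carlo propagation described above is to sample the measurement error and the biokinetic (excretion or retention) function, convert each sampled measurement into an intake, and multiply by a dose coefficient to obtain a distribution of committed dose. The sketch below follows that generic recipe; the lognormal error model, the excreted fraction and every numerical value are assumptions for illustration, not the code or data of the study.

```python
import numpy as np

def dose_distribution(measured_activity, gsd_measurement=1.3,
                      excreted_fraction_mean=0.02, excreted_fraction_gsd=1.5,
                      dose_coefficient=2.0e-8,  # Sv/Bq, placeholder value
                      n=100000, seed=7):
    """Monte Carlo propagation from a single bioassay result to committed dose.

    intake = measurement / (fraction of intake excreted at the measurement time)
    dose   = intake * dose coefficient
    Measurement error and the excreted fraction are sampled as lognormals.
    """
    rng = np.random.default_rng(seed)
    meas = measured_activity * rng.lognormal(0.0, np.log(gsd_measurement), n)
    frac = excreted_fraction_mean * rng.lognormal(0.0, np.log(excreted_fraction_gsd), n)
    intake = meas / frac                      # Bq
    dose = intake * dose_coefficient          # Sv
    pct = np.percentile(dose, [2.5, 5, 50, 95, 97.5])
    return dose, dict(zip(["p2.5", "p5", "median", "p95", "p97.5"], pct))

# example: 10 Bq measured in a 24 h urine sample
_, summary = dose_distribution(10.0)
print(summary)
```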

  1. Development of Probabilistic Internal Dosimetry Computer Code

    International Nuclear Information System (INIS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-01-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated in the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs the Bayesian and Monte Carlo methods was designed. Based on this system, we developed a probabilistic internal-dose-assessment code using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g. the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases

  2. Proceedings of the international symposium on acceleration-driven transmutation systems and Asia ADS network initiative

    International Nuclear Information System (INIS)

    Oigawa, Hiroyuki

    2003-09-01

    An International Symposium on 'Accelerator-Driven Transmutation Systems and Asia ADS Network Initiative' was held on March 24 and 25, 2003 at Gakushi-Kaikan, Tokyo, hosted by Japan Atomic Energy Research Institute, Kyoto University, Osaka University, High Energy Accelerator Research Organization and Tokyo Institute of Technology. The objectives of this symposium are to make participants acquainted with the current status and future plans for research and development (R and D) of ADS in the world and to enhance the initiation of an international collaborative network for ADS in Asia. This report records the papers and the materials of 15 presentations in the symposium. On the first day of the symposium, current activities for R and D of ADS were presented from United States, Europe, Japan, Korea, and China. On the second day, R and D activities in the fields of accelerator and nuclear physics were presented. After these presentations, a panel discussion was organized with regard to the prospective international collaboration and multidisciplinary synergy effect, which are essential to manage various technological issues encountered in R and D stage of ADS. Through the discussion, common understanding was promoted concerning the importance of establishing international network. It was agreed to establish the international network for scientific information exchange among Asian countries including Japan, Korea, China, and Vietnam in view of the future international collaboration in R and D of ADS. (author)

  3. Corporate income taxation uncertainty and foreign direct investment

    OpenAIRE

    Zagler, Martin; Zanzottera, Cristiana

    2012-01-01

    This paper analyzes the effects of legal uncertainty around corporate income taxation on foreign direct investment (FDI). Legal uncertainty can take many forms: double tax agreements, different types of legal systems and corruption. We test the effect of legal uncertainty on foreign direct investment with an international panel. We find that an increase in the ratio of the statutory corporate income tax rate of the destination relative to the source country exhibits a negati...

  4. Validation and calculation of uncertainties of the method of determination of creatinine in urine in internal dosimetry

    International Nuclear Information System (INIS)

    Sierra Barcedo, I.; Hernandez Gonzalez, C.; Benito Alonso, P.; Lopez Zarza, C.

    2011-01-01

    This paper describes the methodology used to conduct the validation of the technique for quantification of creatinine content by spectrophotometry in urine samples of workers exposed to a risk of internal contamination, and the study of all the sources of uncertainty that influence the process. This technique is used to carry out the normalization of the amount of urine to the 24 h urinary excretion, necessary for dosimetric purposes, as well as a criterion for acceptance or rejection of urine specimens received by the laboratory.

  5. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    2007-01-01

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems

  6. Different approaches to overcome uncertainties of production systems

    Science.gov (United States)

    Azizi, Amir; Sorooshian, Shahryar

    2015-05-01

    This study presented a comprehensive review of the understanding of uncertainty and the current approaches that have been proposed to handle uncertainties in production systems. The proposed approaches were classified into 11 groups, based on a study of 114 scholarly papers from various international journals. The paper adds the latest findings to the current body of knowledge on production uncertainties and thus serves the needs of researchers and practitioners as an easy reference in this area. This review also provides a useful source for further studies on how to deal with the uncertainties of production systems.

  7. Background and Qualification of Uncertainty Methods

    International Nuclear Information System (INIS)

    D'Auria, F.; Petruzzi, A.

    2008-01-01

    The evaluation of uncertainty constitutes the necessary supplement of Best Estimate calculations performed to understand accident scenarios in water cooled nuclear reactors. The need comes from the imperfection of computational tools on the one side and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. The paper reviews the salient features of two independent approaches for estimating uncertainties associated with predictions of complex system codes: namely, the propagation of code input errors and the propagation of the calculation output error constitute the key words for identifying the methods of current interest for industrial applications. With the developed methods, uncertainty bands can be derived (both upper and lower) for any desired quantity of the transient of interest. In the second case, the uncertainty method is coupled with the thermal-hydraulic code to obtain the Code with capability of Internal Assessment of Uncertainty, whose features are discussed in more detail.

  8. Implicit knowledge of visual uncertainty guides decisions with asymmetric outcomes

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2008-01-01

    ... under conditions of uncertainty. Here we show that human observers performing a simple visual choice task under an externally imposed loss function approach the optimal strategy, as defined by Bayesian probability and decision theory (Berger, 1985; Cox, 1961). In concert with earlier work, this suggests that observers possess a model of their internal uncertainty and can utilize this model in the neural computations that underlie their behavior (Knill & Pouget, 2004). In our experiment, optimal behavior requires that observers integrate the loss function with an estimate of their internal uncertainty rather ... are pre-existing, widespread, and can be propagated to decision-making areas of the brain.
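
    The optimal strategy referred to here follows from combining the observer's internal (sensory) uncertainty with the externally imposed loss function: the decision criterion shifts away from the more heavily penalised error in proportion to the noise. The snippet below works this out for a two-alternative Gaussian model with unequal error costs; the equal-variance assumption and the specific numbers are illustrative only, not the authors' experimental design.

```python
import numpy as np

def optimal_criterion(mu_a, mu_b, sigma, cost_miss_a, cost_miss_b, prior_a=0.5):
    """Criterion minimising expected loss for two Gaussian alternatives.

    Choose 'a' when x < c.  With equal variances the optimal c is the midpoint
    shifted by sigma^2 / (mu_b - mu_a) times the log of the prior-and-cost ratio,
    so larger internal uncertainty (sigma) produces a larger shift.
    """
    log_ratio = np.log((prior_a * cost_miss_a) / ((1.0 - prior_a) * cost_miss_b))
    return 0.5 * (mu_a + mu_b) + sigma ** 2 / (mu_b - mu_a) * log_ratio

# symmetric costs put the criterion at the midpoint ...
print(optimal_criterion(0.0, 1.0, sigma=0.5, cost_miss_a=1.0, cost_miss_b=1.0))  # 0.5
# ... penalising misses of 'a' three times as much shifts it toward 'b'
print(optimal_criterion(0.0, 1.0, sigma=0.5, cost_miss_a=3.0, cost_miss_b=1.0))  # > 0.5
```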

  9. Collaborative Project: The problem of bias in defining uncertainty in computationally enabled strategies for data-driven climate model development. Final Technical Report.

    Energy Technology Data Exchange (ETDEWEB)

    Huerta, Gabriel [Univ. of New Mexico, Albuquerque, NM (United States)

    2016-05-10

    The objective of the project is to develop strategies for better representing scientific sensibilities within statistical measures of model skill that can then be used within a Bayesian statistical framework for data-driven climate model development and improved measures of model scientific uncertainty. One of the thorny issues in model evaluation is quantifying the effect of biases on climate projections. While any bias is undesirable, only those biases that affect feedbacks affect the scatter in climate projections. The effort at the University of Texas is to analyze previously calculated ensembles of CAM3.1 with perturbed parameters to discover how biases affect projections of global warming. The hypothesis is that compensating errors in the control model can be identified by their effect on a combination of processes, and that developing metrics that are sensitive to dependencies among state variables would provide a way to select versions of climate models that may reduce scatter in climate projections. Gabriel Huerta at the University of New Mexico is responsible for developing statistical methods for evaluating these field dependencies. The UT effort will incorporate these developments into MECS, which is a set of python scripts being developed at the University of Texas for managing the workflow associated with data-driven climate model development over HPC resources. This report reflects the main activities at the University of New Mexico, where the PI (Huerta) and the postdocs (Nosedal, Hattab and Karki) worked on the project.

  10. Uncertainty Quantification Techniques for Sensor Calibration Monitoring in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Ramuhalli, Pradeep; Lin, Guang; Crawford, Susan L.; Konomi, Bledar A.; Coble, Jamie B.; Shumaker, Brent; Hashemian, Hash

    2014-04-30

    This report describes research towards the development of advanced algorithms for online calibration monitoring. The objective of this research is to develop the next generation of online monitoring technologies for sensor calibration interval extension and signal validation in operating and new reactors. These advances are expected to improve the safety and reliability of current and planned nuclear power systems as a result of higher accuracies and increased reliability of sensors used to monitor key parameters. The focus of this report is on documenting the outcomes of the first phase of R&D under this project, which addressed approaches to uncertainty quantification (UQ) in online monitoring that are data-driven, and can therefore adjust estimates of uncertainty as measurement conditions change. Such data-driven approaches to UQ are necessary to address changing plant conditions, for example, as nuclear power plants experience transients, or as next-generation small modular reactors (SMR) operate in load-following conditions.

  11. Uncertainty Quantification Techniques for Sensor Calibration Monitoring in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Ramuhalli, Pradeep; Lin, Guang; Crawford, Susan L.; Konomi, Bledar A.; Braatz, Brett G.; Coble, Jamie B.; Shumaker, Brent; Hashemian, Hash

    2013-09-01

    This report describes the status of ongoing research towards the development of advanced algorithms for online calibration monitoring. The objective of this research is to develop the next generation of online monitoring technologies for sensor calibration interval extension and signal validation in operating and new reactors. These advances are expected to improve the safety and reliability of current and planned nuclear power systems as a result of higher accuracies and increased reliability of sensors used to monitor key parameters. The focus of this report is on documenting the outcomes of the first phase of R&D under this project, which addressed approaches to uncertainty quantification (UQ) in online monitoring that are data-driven, and can therefore adjust estimates of uncertainty as measurement conditions change. Such data-driven approaches to UQ are necessary to address changing plant conditions, for example, as nuclear power plants experience transients, or as next-generation small modular reactors (SMR) operate in load-following conditions.

  12. The use of measurement uncertainty in nuclear materials accuracy and verification

    International Nuclear Information System (INIS)

    Alique, O.; Vaccaro, S.; Svedkauskaite, J.

    2015-01-01

    EURATOM nuclear safeguards are based on the nuclear operators' accounting for and declaring of the amounts of nuclear materials in their possession, as well as on the European Commission verifying the correctness and completeness of such declarations by means of conformity assessment practices. Both the accountancy and the verification processes comprise measurements of the amounts and characteristics of nuclear materials. The uncertainties associated with these measurements play an important role in the reliability of the results of nuclear material accountancy and verification. The document "JCGM 100:2008 Evaluation of measurement data – Guide to the expression of uncertainty in measurement" - issued jointly by the International Bureau of Weights and Measures (BIPM) and international organisations for metrology, standardisation and accreditation in chemistry, physics and electrotechnology - describes a universal, internally consistent, transparent and applicable method for the evaluation and expression of uncertainty in measurements. This paper discusses different processes of nuclear materials accountancy and verification where measurement uncertainty plays a significant role. It also suggests how measurement uncertainty could be used to enhance the reliability of the results of the nuclear materials accountancy and verification processes.

  13. Analysis of uncertainties in the IAEA/WHO TLD postal dose audit system

    Energy Technology Data Exchange (ETDEWEB)

    Izewska, J. [Department of Nuclear Sciences and Applications, International Atomic Energy Agency, Wagramer Strasse 5, Vienna (Austria)], E-mail: j.izewska@iaea.org; Hultqvist, M. [Department of Medical Radiation Physics, Karolinska Institute, Stockholm University, Stockholm (Sweden); Bera, P. [Department of Nuclear Sciences and Applications, International Atomic Energy Agency, Wagramer Strasse 5, Vienna (Austria)

    2008-02-15

    The International Atomic Energy Agency (IAEA) and the World Health Organisation (WHO) operate the IAEA/WHO TLD postal dose audit programme. Thermoluminescence dosimeters (TLDs) are used as transfer devices in this programme. In the present work the uncertainties in the dose determination from TLD measurements have been evaluated. The analysis of uncertainties comprises uncertainties in the calibration coefficient of the TLD system and uncertainties in factors correcting for dose response non-linearity, fading of TL signal, energy response and influence of TLD holder. The individual uncertainties have been combined to estimate the total uncertainty in the dose evaluated from TLD measurements. The combined relative standard uncertainty in the dose determined from TLD measurements has been estimated to be 1.2% for irradiations with Co-60 γ-rays and 1.6% for irradiations with high-energy X-rays. Results from irradiations by the Bureau international des poids et mesures (BIPM), Primary Standard Dosimetry Laboratories (PSDLs) and Secondary Standards Dosimetry Laboratories (SSDLs) compare favourably with the estimated uncertainties, whereas TLD results of radiotherapy centres show higher standard deviations than those derived theoretically.

  14. Coupling ontology driven semantic representation with multilingual natural language generation for tuning international terminologies.

    Science.gov (United States)

    Rassinoux, Anne-Marie; Baud, Robert H; Rodrigues, Jean-Marie; Lovis, Christian; Geissbühler, Antoine

    2007-01-01

    The importance of clinical communication between providers, consumers and others, as well as the requisite for computer interoperability, strengthens the need for sharing common accepted terminologies. Under the directives of the World Health Organization (WHO), an approach is currently being conducted in Australia to adopt a standardized terminology for medical procedures that is intended to become an international reference. In order to achieve such a standard, a collaborative approach is adopted, in line with the successful experiment conducted for the development of the new French coding system CCAM. Different coding centres are involved in setting up a semantic representation of each term using a formal ontological structure expressed through a logic-based representation language. From this language-independent representation, multilingual natural language generation (NLG) is performed to produce noun phrases in various languages that are further compared for consistency with the original terms. Outcomes are presented for the assessment of the International Classification of Health Interventions (ICHI) and its translation into Portuguese. The initial results clearly emphasize the feasibility and cost-effectiveness of the proposed method for handling both a different classification and an additional language. NLG tools, based on ontology driven semantic representation, facilitate the discovery of ambiguous and inconsistent terms, and, as such, should be promoted for establishing coherent international terminologies.

  15. Uncertainties in Nuclear Proliferation Modeling

    International Nuclear Information System (INIS)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok

    2015-01-01

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the potential to provide warning for the international community to prevent nuclear proliferation activities. However, there are still large debates about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling work. This paper analyzes the uncertainties in past approaches and suggests future work from the viewpoint of proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models. Fundamental problems in modeling will remain even if other advanced modeling methods are developed. Before starting to develop elaborate models based on hypotheses about time-dependent proliferation determinants, using graph theory, etc., it is important to analyze the uncertainty of current models to solve the fundamental problems of nuclear proliferation modeling. The uncertainty arising from different codings of proliferation history is small. Serious problems arise from limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause huge uncertainties when using the same dataset, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested in qualitative nuclear proliferation studies

  16. Addressing Uncertainties in Cost Estimates for Decommissioning Nuclear Facilities

    International Nuclear Information System (INIS)

    Benjamin, Serge; Descures, Sylvain; Du Pasquier, Louis; Francois, Patrice; Buonarotti, Stefano; Mariotti, Giovanni; Tarakonov, Jurij; Daniska, Vladimir; Bergh, Niklas; Carroll, Simon; AaSTRoeM, Annika; Cato, Anna; De La Gardie, Fredrik; Haenggi, Hannes; Rodriguez, Jose; Laird, Alastair; Ridpath, Andy; La Guardia, Thomas; O'Sullivan, Patrick; ); Weber, Inge; )

    2017-01-01

    The cost estimation process of decommissioning nuclear facilities has continued to evolve in recent years, with a general trend towards demonstrating greater levels of detail in the estimate and more explicit consideration of uncertainties, the latter of which may have an impact on decommissioning project costs. The 2012 report on the International Structure for Decommissioning Costing (ISDC) of Nuclear Installations, a joint recommendation by the Nuclear Energy Agency (NEA), the International Atomic Energy Agency (IAEA) and the European Commission, proposes a standardised structure of cost items for decommissioning projects that can be used either directly for the production of cost estimates or for mapping of cost items for benchmarking purposes. The ISDC, however, provides only limited guidance on the treatment of uncertainty when preparing cost estimates. Addressing Uncertainties in Cost Estimates for Decommissioning Nuclear Facilities, prepared jointly by the NEA and IAEA, is intended to complement the ISDC, assisting cost estimators and reviewers in systematically addressing uncertainties in decommissioning cost estimates. Based on experiences gained in participating countries and projects, the report describes how uncertainty and risks can be analysed and incorporated in decommissioning cost estimates, while presenting the outcomes in a transparent manner

  17. Uncertainties in the Antarctic Ice Sheet Contribution to Sea Level Rise: Exploration of Model Response to Errors in Climate Forcing, Boundary Conditions, and Internal Parameters

    Science.gov (United States)

    Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.

    2017-12-01

    The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusion about which parameters and/or forcing has the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  18. Wave Energy Converter Annual Energy Production Uncertainty Using Simulations

    Directory of Open Access Journals (Sweden)

    Clayton E. Hiles

    2016-09-01

    Full Text Available Critical to evaluating the economic viability of a wave energy project are: (1) a robust estimate of the electricity production throughout the project lifetime and (2) an understanding of the uncertainty associated with said estimate. Standardization efforts have established mean annual energy production (MAEP) as the metric for quantification of wave energy converter (WEC) electricity production and the performance matrix approach as the appropriate method for calculation. General acceptance of a method for calculating the MAEP uncertainty has not yet been achieved. Several authors have proposed methods based on the standard engineering approach to error propagation; however, a lack of available WEC deployment data has restricted testing of these methods. In this work the magnitude and sensitivity of MAEP uncertainty is investigated. The analysis is driven by data from simulated deployments of two WECs of different operating principle at four different locations. A Monte Carlo simulation approach is proposed for calculating the variability of MAEP estimates and is used to explore the sensitivity of the calculation. The uncertainty of MAEP ranged from 2%–20% of the mean value. Of the contributing uncertainties studied, the variability in the wave climate was found responsible for most of the uncertainty in MAEP. Uncertainty in MAEP differs considerably between WEC types and between deployment locations and is sensitive to the length of the input data-sets. This implies that if a certain maximum level of uncertainty in MAEP is targeted, the minimum required lengths of the input data-sets will be different for every WEC-location combination.
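
    The performance-matrix approach computes MAEP by weighting the WEC power matrix with the annual occurrence of each sea state; the paper's Monte Carlo procedure then resamples the wave climate to quantify how much MAEP estimates vary. The sketch below reproduces that generic logic with a toy power matrix and a bootstrap over years of sea-state occurrence data; all numbers and the bootstrap choice are assumptions rather than the authors' exact method.

```python
import numpy as np

HOURS_PER_YEAR = 8766.0

def maep(power_matrix, occurrence):
    """Mean annual energy production (MWh) from a WEC power matrix (kW) and a
    sea-state occurrence matrix (fraction of the year per Hs/Te bin)."""
    return float(np.sum(power_matrix * occurrence) * HOURS_PER_YEAR / 1000.0)

def maep_uncertainty(power_matrix, yearly_occurrences, n_boot=2000, seed=3):
    """Bootstrap the available years of wave climate to estimate the spread
    of the MAEP estimate (mean, standard deviation and 90% interval)."""
    rng = np.random.default_rng(seed)
    yearly_occurrences = np.asarray(yearly_occurrences)   # (n_years, nHs, nTe)
    n_years = yearly_occurrences.shape[0]
    estimates = np.empty(n_boot)
    for i in range(n_boot):
        pick = rng.integers(0, n_years, n_years)          # resample years with replacement
        occ = yearly_occurrences[pick].mean(axis=0)
        estimates[i] = maep(power_matrix, occ)
    return estimates.mean(), estimates.std(), np.percentile(estimates, [5, 95])

# toy example: 3 x 4 power matrix (kW) and 10 years of occurrence fractions
rng = np.random.default_rng(0)
power = np.array([[10, 25, 40, 30], [20, 60, 90, 70], [30, 80, 120, 100.0]])
occ_years = rng.dirichlet(np.ones(12), size=10).reshape(10, 3, 4)
print(maep_uncertainty(power, occ_years))
```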

  19. Uncertainty As a Trigger for a Paradigm Change in Science Communication

    Science.gov (United States)

    Schneider, S.

    2014-12-01

    Over the last decade, the need to communicate uncertainty has increased. Climate sciences and environmental sciences have faced massive propaganda campaigns by global industry and astroturf organizations. These organizations exploit the deep societal mistrust of uncertainty to allege unethical and intentional deception of decision makers and the public by scientists and their consultatory function. Scientists who openly communicate the uncertainty of climate model calculations, earthquake occurrence frequencies, or possible side effects of genetically manipulated seeds have to face massive campaigns against their research, and sometimes against their person and lives as well. Hence, new strategies to communicate uncertainty have to address the societal roots of the misunderstanding of the concept of uncertainty itself. Evolutionary biology has shown that the human mind, through its sensory structures, is well suited for practical decision making. Therefore, many of the irrational conceptions of uncertainty are mitigated if data are presented in formats the brain is adapted to understand. In the end, the impact of uncertainty on the decision-making process is dominated by preconceptions about terms such as uncertainty, vagueness or probabilities. Parallel to the increasing role of scientific uncertainty in strategic communication, science communicators, for example at the Research and Development Program GEOTECHNOLOGIEN, have developed a number of techniques to master the challenge of putting uncertainty in the focus. By raising awareness of scientific uncertainty as a driving force for scientific development and evolution, the public perspective on uncertainty is changing. While first steps to implement this process are under way, the value of uncertainty is still underestimated by the public and in politics. Therefore, science communicators are in need of new and innovative ways to talk about scientific uncertainty.

  20. Evaluating uncertainties in regional climate simulations over South America at the seasonal scale

    Energy Technology Data Exchange (ETDEWEB)

    Solman, Silvina A. [Centro de Investigaciones del Mar y la Atmosfera CIMA/CONICET-UBA, DCAO/FCEN, UMI-IFAECI/CNRS, CIMA-Ciudad Universitaria, Buenos Aires (Argentina); Pessacg, Natalia L. [Centro Nacional Patagonico (CONICET), Puerto Madryn, Chubut (Argentina)

    2012-07-15

    This work focuses on the evaluation of different sources of uncertainty affecting regional climate simulations over South America at the seasonal scale, using the MM5 model. The simulations cover a 3-month period for the austral spring season. Several four-member ensembles were performed in order to quantify the uncertainty due to: the internal variability; the definition of the regional model domain; the choice of physical parameterizations; and the selection of physical parameters within a particular cumulus scheme. The uncertainty was measured by means of the spread among individual members of each ensemble during the integration period. Results show that the internal variability, triggered by differences in the initial conditions, represents the lowest level of uncertainty for every variable analyzed. The geographic distribution of the spread among ensemble members depends on the variable: for precipitation and temperature the largest spread is found over tropical South America, while for mean sea level pressure the largest spread is located over the southeastern Atlantic Ocean, where large synoptic-scale activity occurs. Using nudging techniques to ingest the boundary conditions dramatically reduces the internal variability. The uncertainty due to the domain choice displays a spatial pattern similar to that of the internal variability, except for the mean sea level pressure field, though its magnitude is larger over the entire model domain for every variable. The largest spread among ensemble members is found for the ensemble in which different combinations of physical parameterizations are selected. The perturbed physics ensemble produces a level of uncertainty slightly larger than the internal variability. This study suggests that no matter what the source of uncertainty is, the geographical distribution of the spread among members of the ensembles is invariant, particularly for precipitation and temperature. (orig.)
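
    A minimal sketch of the spread metric used above: the inter-member standard deviation of an ensemble, computed per grid cell. The fields below are synthetic placeholders rather than MM5 output.

        import numpy as np

        rng = np.random.default_rng(0)
        n_members, nlat, nlon = 4, 60, 80                       # a four-member ensemble on a toy grid
        precip = rng.gamma(2.0, 2.0, (n_members, nlat, nlon))   # mm/day, placeholder fields

        ensemble_mean = precip.mean(axis=0)
        spread = precip.std(axis=0, ddof=1)                     # spread among members, per grid cell
        relative_spread = spread / ensemble_mean

        print("domain-mean spread (mm/day):", round(float(spread.mean()), 2))
        print("domain-max relative spread :", round(float(relative_spread.max()), 2))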

  1. Minimization of energy consumption in HVAC systems with data-driven models and an interior-point method

    International Nuclear Information System (INIS)

    Kusiak, Andrew; Xu, Guanglin; Zhang, Zijun

    2014-01-01

    Highlights: • We study the energy saving of HVAC systems with a data-driven approach. • We conduct an in-depth analysis of the topology of the developed Neural Network based HVAC model. • We apply the interior-point method to solve a Neural Network based HVAC optimization model. • The uncertain building occupancy is incorporated in the minimization of HVAC energy consumption. • A significant potential for saving HVAC energy is discovered. - Abstract: In this paper, a data-driven approach is applied to minimize the energy consumption of a heating, ventilating, and air conditioning (HVAC) system while maintaining the thermal comfort of a building with an uncertain occupancy level. The uncertainty of the arrival and departure rates of occupants is modeled by the Poisson and uniform distributions, respectively. The internal heating gain is calculated from the stochastic process of the building occupancy. Based on the observed and simulated data, a multilayer perceptron algorithm is employed to model and simulate the HVAC system. The data-driven models accurately predict future performance of the HVAC system based on the control settings and the observed historical information. An optimization model is formulated and solved with the interior-point method. The optimization results are compared with the results produced by the simulation models.
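
    The sketch below illustrates the overall idea (not the authors' model): fit a neural-network surrogate of HVAC energy use from data and minimize it over the controllable setpoints with SciPy's trust-constr solver, an interior-point-style method. The feature names, bounds and synthetic data are assumptions for illustration, and the thermal-comfort constraint is reduced to simple setpoint bounds.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from scipy.optimize import minimize, Bounds

        rng = np.random.default_rng(1)

        # Synthetic training data: [supply air temperature setpoint (C), static pressure (kPa), occupancy].
        X = np.column_stack([rng.uniform(12, 18, 2000),
                             rng.uniform(0.5, 1.5, 2000),
                             rng.poisson(40, 2000)])
        # Placeholder energy response (kW) standing in for observed HVAC data.
        y = 5 + 0.8 * (18 - X[:, 0]) ** 2 + 12 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 0.5, 2000)

        surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=1).fit(X, y)

        occupancy = 55.0   # one draw from the stochastic occupancy model would be used here

        def predicted_energy(u):
            return float(surrogate.predict([[u[0], u[1], occupancy]])[0])

        res = minimize(predicted_energy, x0=[15.0, 1.0],
                       method="trust-constr",                 # interior-point / trust-region solver
                       bounds=Bounds([12.0, 0.5], [18.0, 1.5]))
        print("optimal setpoints:", res.x, " predicted energy (kW):", res.fun)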

  2. Uncertainties in projecting climate-change impacts in marine ecosystems

    DEFF Research Database (Denmark)

    Payne, Mark; Barange, Manuel; Cheung, William W. L.

    2016-01-01

    Projections of the impacts of climate change on marine ecosystems are a key prerequisite for the planning of adaptation strategies, yet they are inevitably associated with uncertainty. Identifying, quantifying, and communicating this uncertainty is key to both evaluating the risk associated with a projection and building confidence in its robustness. We review how uncertainties in such projections are handled in marine science. We employ an approach developed in climate modelling by breaking uncertainty down into (i) structural (model) uncertainty, (ii) initialization and internal variability…… and highlight the opportunities and challenges associated with doing a better job. We find that even within a relatively small field such as marine science, there are substantial differences between subdisciplines in the degree of attention given to each type of uncertainty. We find that initialization……

  3. Uncertainties in Transport Project Evaluation: Editorial

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Nielsen, Otto Anker

    2015-01-01

    This special issue of the European Journal of Transport Infrastructure Research (EJTIR), containing five scientific papers, is the result of an open call for papers at the 1st International Conference on Uncertainties in Transport Project Evaluation that took place at the Technical University of Denmark, September 2013. The conference was held under the auspices of the project ‘Uncertainties in transport project evaluation’ (UNITE), a research project (2009-2014) financed by the Danish Strategic Research Agency. UNITE was coordinated by the Department of Transport……

  4. Predicting Multiple Functions of Sustainable Flood Retention Basins under Uncertainty via Multi-Instance Multi-Label Learning

    Directory of Open Access Journals (Sweden)

    Qinli Yang

    2015-03-01

    The ambiguity of the diverse functions of sustainable flood retention basins (SFRBs) may lead to conflict and risk in water resources planning and management. How can someone provide an intuitive yet efficient strategy to uncover and distinguish the multiple potential functions of SFRBs under uncertainty? In this study, by exploiting both input and output uncertainties of SFRBs, the authors developed a new data-driven framework to automatically predict the multiple functions of SFRBs by using multi-instance multi-label (MIML) learning. A total of 372 sustainable flood retention basins, characterized by 40 variables associated with confidence levels, were surveyed in Scotland, UK. A Gaussian model with Monte Carlo sampling was used to capture the variability of the variables (i.e., input uncertainty), and the MIML-support vector machine (SVM) algorithm was subsequently applied to predict the potential functions of SFRBs that have not yet been assessed, allowing for one basin belonging to different types (i.e., output uncertainty). Experiments demonstrated that the proposed approach enables effective automatic prediction of the potential functions of SFRBs (e.g., accuracy >93%). The findings suggest that the functional uncertainty of SFRBs under investigation can be better assessed in a more comprehensive and cost-effective way, and the proposed data-driven approach provides a promising method of doing so for water resources management.

  5. Information-theoretic approach to uncertainty importance

    International Nuclear Information System (INIS)

    Park, C.K.; Bari, R.A.

    1985-01-01

    A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory, in which entropy is a measure of the uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the exponents of their entropies. For the log-normal and log-uniform distributions the importance measure is composed of the median (central tendency) and the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, then a different rank order would result than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results.
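
    To make the composition of the measure explicit, a sketch assuming the usual convention that the error factor EF is the ratio of the 95th percentile to the median of a log-normal distribution: the differential entropy of a log-normal with median m and log-standard-deviation σ is

        h = \ln\!\big(m\,\sigma\sqrt{2\pi e}\big), \qquad \sigma = \frac{\ln EF}{1.645},

    so that e^h is proportional to m·ln EF, and the relative uncertainty importance of event 1 with respect to event 2 becomes

        I_{12} = \frac{e^{h_1}}{e^{h_2}} = \frac{m_1 \ln EF_1}{m_2 \ln EF_2},

    i.e., a product of the central tendency (median) and the uncertainty (logarithm of the error factor), as stated above.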

  6. BEPU methods and combining of uncertainties

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2004-01-01

    After approval of the revised rule on the acceptance of emergency core cooling system (ECCS) performance in 1988, there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. The Code Scaling, Applicability and Uncertainty (CSAU) evaluation method was developed and demonstrated for a large-break (LB) LOCA in a pressurized water reactor. Later, several new best estimate plus uncertainty (BEPU) methods were developed around the world. The purpose of the paper is to identify and compare the statistical approaches of BEPU methods and present their important plant and licensing applications. The study showed that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted approach. The existing BEPU methods seem mature enough, while future research may focus on codes with internal assessment of uncertainty. (author)
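
    A short sketch of the order-statistics result commonly used in BEPU analyses (Wilks' formula) to choose the number of code runs N such that the largest computed value is a one-sided (gamma, beta) tolerance limit for the output; the implementation below is a generic illustration, not taken from any particular BEPU methodology.

        import math

        def wilks_runs(gamma=0.95, beta=0.95, order=1):
            """Smallest N such that the order-th largest of N random code runs bounds the
            gamma-quantile of the output with confidence beta (one-sided tolerance limit)."""
            n = order
            while True:
                # Confidence = P(at least `order` of the n runs exceed the true gamma-quantile),
                # i.e. the binomial probability that at most n - order runs fall below it.
                conf = sum(math.comb(n, k) * gamma**k * (1 - gamma)**(n - k)
                           for k in range(n - order + 1))
                if conf >= beta:
                    return n
                n += 1

        print(wilks_runs())           # 59 runs for the classic 95%/95% first-order limit
        print(wilks_runs(order=2))    # 93 runs when the second-largest value is used as the bound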

  7. Derivation of RCM-driven potential evapotranspiration for hydrological climate change impact analysis in Great Britain: a comparison of methods and associated uncertainty in future projections

    Directory of Open Access Journals (Sweden)

    C. Prudhomme

    2013-04-01

    Potential evapotranspiration (PET) is the water that would be lost by plants through evaporation and transpiration if water were not limited in the soil, and it is commonly used in conceptual hydrological modelling in the calculation of runoff production and hence river discharge. Future changes in PET are likely to be as important as changes in precipitation patterns in determining changes in river flows. However, PET is not calculated routinely by climate models, so it must be derived independently when the impact of climate change on river flow is to be assessed. This paper compares PET estimates from 12 equations of different complexity, driven by the Hadley Centre's HadRM3-Q0 model outputs representative of 1961–1990, with MORECS PET, a product used as reference PET in Great Britain. The results show that the FAO56 version of the Penman–Monteith equations best reproduces the spatial and seasonal variability of MORECS PET across GB when driven by HadRM3-Q0 estimates of relative humidity, total cloud, wind speed and linearly bias-corrected mean surface temperature. This suggests that potential biases in the HadRM3-Q0 climate do not result in significant biases when the physically based FAO56 equations are used. Percentage changes in PET between the 1961–1990 and 2041–2070 time slices were also calculated for each of the 12 PET equations from HadRM3-Q0. Results show a large variation in the magnitude (and sometimes direction) of changes estimated from different PET equations, with the Turc, Jensen–Haise and calibrated Blaney–Criddle methods systematically projecting the largest increases across GB for all months and Priestley–Taylor, Makkink, and Thornthwaite showing the smallest changes. We recommend the use of the FAO56 equation as, when driven by HadRM3-Q0 climate data, this best reproduces the reference MORECS PET across Great Britain for the reference period of 1961–1990. Further, the future changes of PET estimated by FAO56 are within
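
    For reference, the FAO56 Penman–Monteith formulation of reference evapotranspiration singled out above is commonly written as

        ET_0 = \frac{0.408\,\Delta\,(R_n - G) + \gamma\,\dfrac{900}{T+273}\,u_2\,(e_s - e_a)}{\Delta + \gamma\,(1 + 0.34\,u_2)}

    with ET_0 in mm day^{-1}, R_n the net radiation and G the soil heat flux (MJ m^{-2} day^{-1}), T the mean daily air temperature at 2 m (°C), u_2 the wind speed at 2 m (m s^{-1}), (e_s - e_a) the vapour pressure deficit (kPa), Δ the slope of the saturation vapour pressure curve and γ the psychrometric constant (both kPa °C^{-1}).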

  8. Visualization-based decision support for value-driven system design

    Science.gov (United States)

    Tibor, Elliott

    In the past 50 years, the military, communication, and transportation systems that permeate our world have grown exponentially in size and complexity. The development and production of these systems has seen ballooning costs and increased risk. This is particularly critical for the aerospace industry. The inability to deal with growing system complexity is a crippling force in the advancement of engineered systems. Value-Driven Design represents a paradigm shift in the field of design engineering that has the potential to help counteract this trend. The philosophy of Value-Driven Design places the desires of the stakeholder at the forefront of the design process to capture true preferences and reveal system alternatives that were never previously thought possible. Modern aerospace engineering design problems are large, complex, and involve multiple levels of decision-making. To find the best design, the decision-maker is often required to analyze hundreds or thousands of combinations of design variables and attributes. Visualization can be used to support these decisions by communicating large amounts of data in a meaningful way. Understanding the design space, the subsystem relationships, and the design uncertainties is vital to the advancement of Value-Driven Design as an accepted process for the development of more effective, efficient, robust, and elegant aerospace systems. This research investigates the use of multi-dimensional data visualization tools to support decision-making under uncertainty during the Value-Driven Design process. A satellite design system comprising a satellite, ground station, and launch vehicle is used to demonstrate the effectiveness of new visualization methods to aid in decision support during complex aerospace system design. These methods are used to facilitate the exploration of the feasible design space by representing the value impact of system attribute changes and comparing the results of multi-objective optimization formulations

  9. Verification of gyrokinetic particle simulation of current-driven instability in fusion plasmas. I. Internal kink mode

    Energy Technology Data Exchange (ETDEWEB)

    McClenaghan, J.; Lin, Z.; Holod, I.; Deng, W.; Wang, Z. [University of California, Irvine, California 92697 (United States)

    2014-12-15

    The gyrokinetic toroidal code (GTC) capability has been extended for simulating internal kink instability with kinetic effects in toroidal geometry. The global simulation domain covers the magnetic axis, which is necessary for simulating current-driven instabilities. GTC simulation in the fluid limit of the kink modes in cylindrical geometry is verified by benchmarking with a magnetohydrodynamic eigenvalue code. Gyrokinetic simulations of the kink modes in the toroidal geometry find that ion kinetic effects significantly reduce the growth rate even when the banana orbit width is much smaller than the radial width of the perturbed current layer at the mode rational surface.

  10. Measurements of fusion neutron yields by neutron activation technique: Uncertainty due to the uncertainty on activation cross-sections

    Energy Technology Data Exchange (ETDEWEB)

    Stankunas, Gediminas, E-mail: gediminas.stankunas@lei.lt [Lithuanian Energy Institute, Laboratory of Nuclear Installation Safety, Breslaujos str. 3, LT-44403 Kaunas (Lithuania); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Batistoni, Paola [ENEA, Via E. Fermi, 45, 00044 Frascati, Rome (Italy); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Sjöstrand, Henrik; Conroy, Sean [Department of Physics and Astronomy, Uppsala University, PO Box 516, SE-75120 Uppsala (Sweden); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-07-11

    The neutron activation technique is routinely used in fusion experiments to measure the neutron yields. This paper investigates the uncertainty in these measurements due to the uncertainties in dosimetry and activation reactions. For this purpose, activation cross-sections were taken from the International Reactor Dosimetry and Fusion File (IRDFF-v1.05) in 640-group ENDF-6 format for several reactions of interest for both 2.5 and 14 MeV neutrons. Activation coefficients (reaction rates) have been calculated using the neutron flux spectra at the JET vacuum vessel, for both DD and DT plasmas, calculated by MCNP in the required 640-energy-group format. The related uncertainties for the JET neutron spectra are evaluated as well using the covariance data available in the library. These uncertainties are in general small, but not negligible when high accuracy is required in the determination of the fusion neutron yields.

  11. Stakeholder-Driven Quality Improvement: A Compelling Force for Clinical Practice Guidelines.

    Science.gov (United States)

    Rosenfeld, Richard M; Wyer, Peter C

    2018-01-01

    Clinical practice guideline development should be driven by rigorous methodology, but what is less clear is where quality improvement enters the process: should it be a priority-guiding force, or should it enter only after recommendations are formulated? We argue for a stakeholder-driven approach to guideline development, with an overriding goal of quality improvement based on stakeholder perceptions of needs, uncertainties, and knowledge gaps. In contrast, the widely used topic-driven approach, which often makes recommendations based only on randomized controlled trials, is driven by epidemiologic purity and evidence rigor, with quality improvement a downstream consideration. The advantages of a stakeholder-driven versus a topic-driven approach are highlighted by comparisons of guidelines for otitis media with effusion, thyroid nodules, sepsis, and acute bacterial rhinosinusitis. These comparisons show that stakeholder-driven guidelines are more likely to address the quality improvement needs and pressing concerns of clinicians and patients, including understudied populations and patients with multiple chronic conditions. Conversely, a topic-driven approach often addresses "typical" patients, based on research that may not reflect the needs of high-risk groups excluded from studies because of ethical issues or a desire for purity of research design.

  12. Chapter 3: Traceability and uncertainty

    International Nuclear Information System (INIS)

    McEwen, Malcolm

    2014-01-01

    Chapter 3 presents: an introduction; Traceability (measurement standard, role of the Bureau International des Poids et Mesures, Secondary Standards Laboratories, documentary standards and traceability as process review); Uncertainty (Example 1 - Measurement, M_raw (SSD); Example 2 - Calibration data, N_D,w (60Co) and k_Q; Example 3 - Correction factor, P_TP); and a Conclusion

  13. Uncertainty in Measurement: A Review of Monte Carlo Simulation Using Microsoft Excel for the Calculation of Uncertainties Through Functional Relationships, Including Uncertainties in Empirically Derived Constants

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-01-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional

  14. Uncertainty in measurement: a review of monte carlo simulation using microsoft excel for the calculation of uncertainties through functional relationships, including uncertainties in empirically derived constants.

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-02-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value. Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship
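
    The following sketch mirrors the spreadsheet-based Monte Carlo procedure described above in Python: uncertainties in the input quantities of a functional relationship, including an empirically derived 'constant', are propagated to the measurand by simulation. The relationship y = kA/B and all numbers are hypothetical placeholders, not values from the article.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000                              # number of Monte Carlo trials

        # Hypothetical measurand  y = k * A / B, with IQC-based standard uncertainties on the
        # inputs and an uncertainty on the empirically derived constant k.
        k = rng.normal(1.23, 0.02, n)
        A = rng.normal(85.0, 1.5, n)
        B = rng.normal(40.0, 0.8, n)

        y = k * A / B                            # propagate through the functional relationship

        lo, hi = np.percentile(y, [2.5, 97.5])
        print(f"y = {y.mean():.3f}, u(y) = {y.std(ddof=1):.3f}, 95% interval = [{lo:.3f}, {hi:.3f}]")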

  15. Parametric instability and wave turbulence driven by tidal excitation of internal waves

    Science.gov (United States)

    Le Reun, Thomas; Favier, Benjamin; Le Bars, Michael

    2018-04-01

    We investigate the stability of stratified fluid layers undergoing homogeneous and periodic tidal deformation. We first introduce a local model which allows us to study velocity and buoyancy fluctuations in a Lagrangian domain periodically stretched and sheared by the tidal base flow. While retaining only the key physical ingredients, such a model is efficient for simulating planetary regimes where tidal amplitudes and dissipation are small. With this model, we prove that tidal flows are able to drive parametric subharmonic resonances of internal waves, in a way reminiscent of the elliptical instability in rotating fluids. The growth rates computed via Direct Numerical Simulations (DNS) are in very good agreement with WKB analysis and Floquet theory. We also investigate the turbulence driven by this instability mechanism. With spatio-temporal analysis, we show that it is a weak internal wave turbulence occurring at small Froude and buoyancy Reynolds numbers. When the gap between the excitation and the Brunt–Väisälä frequencies is increased, the frequency spectrum of this wave turbulence displays a −2 power law reminiscent of the high-frequency branch of the Garrett and Munk spectrum (Garrett & Munk 1979), which has been measured in the oceans. In addition, we find that the mixing efficiency is altered compared to what is computed in the context of DNS of stratified turbulence excited at small Froude and large buoyancy Reynolds numbers, and is consistent with a superposition of waves.

  16. Uncertainty: a discriminator for above and below boiling repository design decisions

    International Nuclear Information System (INIS)

    Wilder, D G; Lin, W; Buscheck, T A; Wolery, T J; Francis, N D

    2000-01-01

    The US nuclear waste disposal program is evaluating the Yucca Mountain (YM) site for possible disposal of nuclear waste. Radioactive decay of the waste, particularly spent fuel, generates sufficient heat to significantly raise repository temperatures. Environmental conditions in the repository system evolve in response to this heat. The amount of temperature increase, and thus of environmental change, depends on repository design and operations. Because the evolving environment cannot be directly measured until after waste is emplaced, licensing decisions must be based upon model and analytical projections of the environmental conditions. These analyses have inherent uncertainties. There is concern that elevated temperatures increase uncertainty, because most chemical reaction rates increase with temperature and boiling introduces the additional complexity of vapor phase reactions and transport. This concern was expressed by the NWTRB, particularly for above-boiling temperatures. They state that "the cooler the repository, the lower the uncertainty about heat-driven water migration and the better the performance of waste package materials. Above this temperature, technical uncertainties tend to be significantly higher than those associated with below-boiling conditions." (Cohon 1999). However, not all uncertainties are reduced by lower temperatures; indeed, some may even be increased. This paper addresses the impacts of temperature on uncertainties

  17. Uncertainties in radioecological assessment models

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables
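
    A minimal sketch of the stochastic procedure recommended above: uncertain parameter estimates are translated into a distribution of predicted values, and parameters are ranked by their contribution to the spread (here via Spearman rank correlation). The transfer model and parameter distributions are illustrative placeholders, not an actual radioecological assessment model.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(3)
        n = 5000

        # Uncertain parameters, log-normally distributed as is common in radioecological models.
        transfer_factor = rng.lognormal(np.log(0.02), 0.6, n)
        interception    = rng.lognormal(np.log(0.25), 0.4, n)
        intake_rate     = rng.lognormal(np.log(0.50), 0.3, n)

        dose_index = transfer_factor * interception * intake_rate * 1.0e3   # illustrative prediction

        for name, p in [("transfer factor", transfer_factor),
                        ("interception", interception),
                        ("intake rate", intake_rate)]:
            rho = spearmanr(p, dose_index)[0]    # rank correlation as a simple importance measure
            print(f"{name:16s} rank correlation with prediction: {rho:.2f}")
        print("95% prediction interval:", np.percentile(dose_index, [2.5, 97.5]).round(2))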

  18. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers, using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty in Measurement (GUM). Standardized analysis based on these procedures ensures that the quoted uncertainty is well documented.

  19. Managing project risks and uncertainties

    Directory of Open Access Journals (Sweden)

    Mike Mentis

    2015-01-01

    This article considers threats to a project slipping on budget, schedule and fit-for-purpose. Threat is used here as the collective for risks (quantifiable bad things that can happen) and uncertainties (poorly or not quantifiable bad possible events). Based on experience with projects in developing countries, this review considers that (a) project slippage is due to uncertainties rather than risks, (b) while the eventuation of some bad things is beyond control, managed execution and oversight are still the primary means to keeping within budget, on time and fit-for-purpose, (c) improving project delivery is less about bigger and more complex and more about coordinated focus, effectiveness and developing thought-out heuristics, and (d) projects take longer and cost more partly because threat identification is inaccurate, the scope of identified threats is too narrow, and the threat assessment product is not integrated into overall project decision-making and execution. Almost by definition, what is poorly known is likely to cause problems. Yet it is not just the unquantifiability and intangibility of uncertainties causing project slippage, but that they are insufficiently taken into account in project planning and execution that causes budget and time overruns. Improving project performance requires purpose-driven and managed deployment of scarce seasoned professionals. This can be aided with independent oversight by deeply experienced panelists who contribute technical insights and can potentially show that diligence is seen to be done.

  20. PROCEEDINGS OF THE INTERNATIONAL WORKSHOP ON UNCERTAINTY, SENSITIVITY, AND PARAMETER ESTIMATION FOR MULTIMEDIA ENVIRONMENTAL MODELING. EPA/600/R-04/117, NUREG/CP-0187, ERDC SR-04-2.

    Science.gov (United States)

    An International Workshop on Uncertainty, Sensitivity, and Parameter Estimation for Multimedia Environmental Modeling was held August 19-21, 2003, at the U.S. Nuclear Regulatory Commission Headquarters in Rockville, Maryland, USA. The workshop was organized and convened by the Fe...

  1. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that is significantly different from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations made for improving measurement uncertainties

  2. Correlated quadratures of resonance fluorescence and the generalized uncertainty relation

    Science.gov (United States)

    Arnoldus, Henk F.; George, Thomas F.; Gross, Rolf W. F.

    1994-01-01

    Resonance fluorescence from a two-state atom has been predicted to exhibit quadrature squeezing below the Heisenberg uncertainty limit, provided that the optical parameters (Rabi frequency, detuning, laser linewidth, etc.) are chosen carefully. When the correlation between two quadratures of the radiation field does not vanish, however, the Heisenberg limit for quantum fluctuations might be an unrealistic lower bound. A generalized uncertainty relation, due to Schrödinger, takes into account the possible correlation between the quadrature components of the radiation, and it suggests a modified definition of squeezing. We show that the coherence between the two levels of a laser-driven atom is responsible for the correlation between the quadrature components of the emitted fluorescence, and that the Schrödinger uncertainty limit increases monotonically with the coherence. On the other hand, the fluctuations in the quadrature field diminish with increasing coherence, and can disappear completely when the coherence reaches 1/2, provided that certain phase relations hold.
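
    For reference, the generalized uncertainty relation due to Schrödinger (also called the Schrödinger–Robertson relation) referred to above reads, for two observables A and B,

        \sigma_A^2\,\sigma_B^2 \;\ge\; \Big(\tfrac{1}{2}\langle\{\hat A,\hat B\}\rangle - \langle\hat A\rangle\langle\hat B\rangle\Big)^{2} + \Big|\tfrac{1}{2i}\langle[\hat A,\hat B]\rangle\Big|^{2},

    which reduces to the familiar Heisenberg bound σ_A σ_B ≥ |⟨[A,B]⟩|/2 only when the covariance (correlation) term in the first bracket vanishes; a non-zero quadrature correlation therefore raises the lower bound, as discussed above.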

  3. Communication of uncertainty in hydrological predictions: a user-driven example web service for Europe

    Science.gov (United States)

    Fry, Matt; Smith, Katie; Sheffield, Justin; Watts, Glenn; Wood, Eric; Cooper, Jon; Prudhomme, Christel; Rees, Gwyn

    2017-04-01

    Water is fundamental to society as it impacts all facets of life, the economy and the environment. But whilst it creates opportunities for growth and life, it can also cause serious damage to society and infrastructure through extreme hydro-meteorological events such as floods or droughts. Anticipation of future water availability and extreme event risks would help both to optimise growth and to limit damage through better preparedness and planning, hence providing huge societal benefits. Recent scientific research advances now make it possible to provide hydrological outlooks at monthly to seasonal lead times, and future projections up to the end of the century accounting for climatic changes. However, high uncertainty remains in the predictions, which varies depending on location, time of the year, prediction range and hydrological variable. It is essential that this uncertainty is fully understood by decision makers so they can account for it in their planning. Hence, the challenge is to find ways to communicate such uncertainty to a range of stakeholders with different technical backgrounds and environmental science knowledge. The project EDgE (End-to-end Demonstrator for improved decision making in the water sector for Europe), funded by the Copernicus programme (C3S), is a proof-of-concept project that develops a unique service to support decision making for the water sector at monthly to seasonal and multi-decadal lead times. It is a mutual effort of co-production between hydrologists and environmental modellers, computer scientists and stakeholders representative of key decision makers in Europe for the water sector. This talk will present the iterative co-production process of a web service that serves the needs of the user community. Through a series of Focus Group meetings in Spain, Norway and the UK, options for visualising the hydrological predictions and associated uncertainties are presented and discussed, first as mock-up dashboards, off-line tools

  4. Do oil shocks predict economic policy uncertainty?

    Science.gov (United States)

    Rehman, Mobeen Ur

    2018-05-01

    Oil price fluctuations play an influential role in global economic policies for developed as well as emerging countries. I investigate the role of international oil prices, disaggregated into structural (i) oil supply shocks, (ii) aggregate demand shocks and (iii) oil market specific demand shocks, based on the work of Kilian (2009) using a structural VAR framework, on the economic policy uncertainty of the sampled markets. Economic policy uncertainty, due to its non-linear behavior, is modeled in a regime switching framework with the disaggregated structural oil shocks. Our results highlight that Indian, Spanish and Japanese economic policy uncertainty responds to the global oil price shocks; however, aggregate demand shocks fail to induce any change. Oil specific demand shocks are significant only for China and India in the high volatility state.

  5. A Hypothesis-Driven Approach to Site Investigation

    Science.gov (United States)

    Nowak, W.

    2008-12-01

    Variability of subsurface formations and the scarcity of data lead to the notion of aquifer parameters as geostatistical random variables. Given an information need and limited resources for field campaigns, site investigation is often put into the context of optimal design. In optimal design, the types, numbers and positions of samples are optimized under case-specific objectives to meet the information needs. Past studies feature optimal data worth (balancing maximum financial profit in an engineering task against the cost of additional sampling), or aim at a minimum prediction uncertainty of stochastic models for a prescribed investigation budget. Recent studies also account for other sources of uncertainty outside the hydrogeological range, such as uncertain toxicity, ingestion and behavioral parameters of the affected population when predicting the human health risk from groundwater contamination. The current study looks at optimal site investigation from a new angle. Answering a yes/no question under uncertainty directly requires recasting the original question as a hypothesis test; otherwise, the resulting answer would convey false confidence. A straightforward example is whether a recent contaminant spill will cause contaminant concentrations in excess of a legal limit at a nearby drinking water well. This question can only be answered down to a specified chance of error, i.e., based on the significance level used in hypothesis tests. Optimal design is placed into the hypothesis-driven context by using the chance of providing a false yes/no answer as the new criterion to be minimized. Different configurations apply for one-sided and two-sided hypothesis tests. If a false answer entails financial liability, the hypothesis-driven context can be re-cast in the context of data worth. The remaining difference is that failure is a hard constraint in the data worth context versus a monetary punishment term in the hypothesis-driven context. The basic principle

  6. The Uncertainty Test for the MAAP Computer Code

    International Nuclear Information System (INIS)

    Park, S. H.; Song, Y. M.; Park, S. Y.; Ahn, K. I.; Kim, K. R.; Lee, Y. J.

    2008-01-01

    After the Three Mile Island Unit 2 (TMI-2) and Chernobyl accidents, safety issues for severe accidents have been treated from various aspects. Major issues in our research include the level 2 PSA. The difficulty in expanding the level 2 PSA as a risk-informed activity is the uncertainty. Previously, the emphasis was on improving the quality of the internal-event PSA, but that effort is insufficient to decrease the phenomenological uncertainty in the level 2 PSA. In our country, the degree of uncertainty in the level 2 PSA model is high, and it is necessary to secure a model that decreases this uncertainty. We do not yet have experience with uncertainty assessment technology, and the assessment system itself depends on advanced nations. In advanced nations, severe accident simulators are implemented at the hardware level, whereas in our case the basic functions can be implemented at the software level. In these circumstances at home and abroad, similar instances such as UQM and MELCOR are surveyed. With reference to these instances, the SAUNA (Severe Accident UNcertainty Analysis) system is being developed in our project to assess and decrease the uncertainty in a level 2 PSA. The MAAP code is selected to analyze the uncertainty in a severe accident

  7. Different top-down approaches to estimate measurement uncertainty of whole blood tacrolimus mass concentration values.

    Science.gov (United States)

    Rigo-Bonnin, Raül; Blanco-Font, Aurora; Canalias, Francesca

    2018-05-08

    Values of the mass concentration of tacrolimus in whole blood are commonly used by clinicians for monitoring the status of a transplant patient and for checking whether the administered dose of tacrolimus is effective. Thus, clinical laboratories must provide results that are as accurate as possible, and measurement uncertainty can help ensure the reliability of these results. The aim of this study was to estimate the measurement uncertainty of whole blood tacrolimus mass concentration values obtained by UHPLC-MS/MS using two top-down approaches: the single laboratory validation approach and the proficiency testing approach. For the single laboratory validation approach, we estimated the uncertainties associated with the intermediate imprecision (using long-term internal quality control data) and the bias (using a certified reference material). Next, we combined them, together with the uncertainties related to the calibrator-assigned values, to obtain a combined uncertainty and, finally, calculated the expanded uncertainty. For the proficiency testing approach, the uncertainty was estimated in a similar way to the single laboratory validation approach, but considering data from internal and external quality control schemes to estimate the uncertainty related to the bias. The estimated expanded uncertainties for single laboratory validation and for proficiency testing using internal and external quality control schemes were 11.8%, 13.2%, and 13.0%, respectively. After performing the two top-down approaches, we observed that their uncertainty results were quite similar. This fact would confirm that either of the two approaches could be used to estimate the measurement uncertainty of whole blood tacrolimus mass concentration values in clinical laboratories. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
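
    A compact sketch of the single-laboratory-validation calculation outlined above, in the spirit of a Nordtest-style top-down recipe: combine the uncertainty from long-term intermediate imprecision with the uncertainty of the bias estimated against a certified reference material, then expand with a coverage factor k = 2. All numbers are illustrative placeholders, and the exact treatment of bias differs between protocols.

        import math

        cv_rw   = 0.066   # relative SD from long-term internal quality control (intermediate imprecision)
        bias    = 0.018   # relative bias observed against the certified reference material (CRM)
        s_bias  = 0.045   # relative SD of the replicates used to estimate that bias
        n_bias  = 10      # number of replicates on the CRM
        u_cref  = 0.021   # relative standard uncertainty of the CRM assigned value

        u_bias     = math.sqrt(bias**2 + (s_bias / math.sqrt(n_bias))**2 + u_cref**2)
        u_combined = math.sqrt(cv_rw**2 + u_bias**2)
        U_expanded = 2.0 * u_combined            # k = 2, approximately 95% coverage

        print(f"u(bias) = {100*u_bias:.1f}%,  u_c = {100*u_combined:.1f}%,  U = {100*U_expanded:.1f}%")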

  8. A rigorous treatment of uncertainty quantification for Silicon damage metrics

    International Nuclear Information System (INIS)

    Griffin, P.

    2016-01-01

    This report summarizes the contributions made by Sandia National Laboratories in support of the International Atomic Energy Agency (IAEA) Nuclear Data Section (NDS) Technical Meeting (TM) on Nuclear Reaction Data and Uncertainties for Radiation Damage. The work focused on a rigorous treatment of the uncertainties affecting the characterization of the displacement damage seen in silicon semiconductors. (author)

  9. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates which have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  10. Drivers and uncertainties of forecasted range shifts for warm-water fishes under climate and land cover change

    Science.gov (United States)

    Bouska, Kristen; Whitledge, Gregory W.; Lant, Christopher; Schoof, Justin

    2018-01-01

    Land cover is an important determinant of aquatic habitat and is projected to shift with climate changes, yet climate-driven land cover changes are rarely factored into climate assessments. To quantify impacts and uncertainty of coupled climate and land cover change on warm-water fish species’ distributions, we used an ensemble model approach to project distributions of 14 species. For each species, current range projections were compared to 27 scenario-based projections and aggregated to visualize uncertainty. Multiple regression and model selection techniques were used to identify drivers of range change. Novel, or no-analogue, climates were assessed to evaluate transferability of models. Changes in total probability of occurrence ranged widely across species, from a 63% increase to a 65% decrease. Distributional gains and losses were largely driven by temperature and flow variables and underscore the importance of habitat heterogeneity and connectivity to facilitate adaptation to changing conditions. Finally, novel climate conditions were driven by mean annual maximum temperature, which stresses the importance of understanding the role of temperature on fish physiology and the role of temperature-mitigating management practices.

  11. Uncertainty-driven nuclear data evaluation including thermal (n,α) applied to 59Ni

    Science.gov (United States)

    Helgesson, P.; Sjöstrand, H.; Rochman, D.

    2017-11-01

    This paper presents a novel approach to the evaluation of nuclear data (ND), combining experimental data for thermal cross sections with resonance parameters and nuclear reaction modeling. The method involves sampling of various uncertain parameters, in particular uncertain components in experimental setups, and provides extensive covariance information, including consistent cross-channel correlations over the whole energy spectrum. The method is developed for, and applied to, 59Ni, but may be used as a whole, or in part, for other nuclides. 59Ni is particularly interesting since a substantial amount of 59Ni is produced in thermal nuclear reactors by neutron capture in 58Ni and since it has a non-threshold (n,α) cross section. Therefore, 59Ni gives a very important contribution to the helium production in stainless steel in a thermal reactor. However, current evaluated ND libraries contain old information for 59Ni, without any uncertainty information. The work includes a study of thermal cross section experiments and a novel combination of this experimental information, giving the full multivariate distribution of the thermal cross sections. In particular, the thermal (n,α) cross section is found to be 12.7 ± 0.7 b. This is consistent with, but yet different from, current established values. Further, the distribution of thermal cross sections is combined with reported resonance parameters, and with TENDL-2015 data, to provide full random ENDF files; all of this is done in a novel way, keeping uncertainties and correlations in mind. The random files are also condensed into one single ENDF file with covariance information, which is now part of a beta version of JEFF 3.3. Finally, the random ENDF files have been processed and used in an MCNP model to study the helium production in stainless steel. The increase in the (n,α) rate due to 59Ni compared to fresh stainless steel is found to be a factor of 5.2 at a certain time in the reactor vessel, with a relative

  12. Inherently safe nuclear-driven internal combustion engines

    International Nuclear Information System (INIS)

    Alesso, P.; Chow, Tze-Show; Condit, R.; Heidrich, J.; Pettibone, J.; Streit, R.

    1991-01-01

    A family of nuclear driven engines is described in which nuclear energy released by fissioning of uranium or plutonium in a prompt critical assembly is used to heat a working gas. Engine performance is modeled using a code that calculates hydrodynamics, fission energy production, and neutron transport self-consistently. Results are given demonstrating a large negative temperature coefficient that produces self-shutoff of energy production. Reduced fission product inventory and the self-shutoff provide inherent nuclear safety. It is expected that nuclear engine reactor units could be scaled from 100 MW on up. 7 refs., 3 figs

  13. Vector network analyzer (VNA) measurements and uncertainty assessment

    CERN Document Server

    Shoaib, Nosherwan

    2017-01-01

    This book describes vector network analyzer measurements and uncertainty assessments, particularly in waveguide test-set environments, in order to establish their compatibility to the International System of Units (SI) for accurate and reliable characterization of communication networks. It proposes a fully analytical approach to measurement uncertainty evaluation, while also highlighting the interaction and the linear propagation of different uncertainty sources to compute the final uncertainties associated with the measurements. The book subsequently discusses the dimensional characterization of waveguide standards and the quality of the vector network analyzer (VNA) calibration techniques. The book concludes with an in-depth description of the novel verification artefacts used to assess the performance of the VNAs. It offers a comprehensive reference guide for beginners to experts, in both academia and industry, whose work involves the field of network analysis, instrumentation and measurements.

  14. FRACTURE MECHANICS UNCERTAINTY ANALYSIS IN THE RELIABILITY ASSESSMENT OF THE REACTOR PRESSURE VESSEL (2D) SUBJECTED TO INTERNAL PRESSURE

    Directory of Open Access Journals (Sweden)

    Entin Hartini

    2016-06-01

    ABSTRACT: FRACTURE MECHANICS UNCERTAINTY ANALYSIS IN THE RELIABILITY ASSESSMENT OF THE REACTOR PRESSURE VESSEL (2D) SUBJECTED TO INTERNAL PRESSURE. The reactor pressure vessel (RPV) is a pressure boundary in the PWR type reactor which serves to confine radioactive material during the chain reaction process. The integrity of the RPV must be guaranteed both in normal operation and in accident conditions. In analyzing the integrity of the RPV, especially in relation to crack behavior which can lead to a break in the reactor pressure vessel, a fracture mechanics approach should be taken for this assessment. The uncertainty of the inputs used in the assessment, such as mechanical properties and the physical environment, is a reason that the assessment is not sufficient if it is performed only by a deterministic approach. Therefore, an uncertainty approach should be applied. The aim of this study is to analyze the uncertainty of fracture mechanics calculations in evaluating the reliability of the PWR's reactor pressure vessel. The random character of the input quantities was generated using probabilistic principles and theories. The fracture mechanics analysis is solved by the Finite Element Method (FEM) with the MSC MARC software, while the uncertainty input analysis is done based on probability density functions with Latin Hypercube Sampling (LHS) using a Python script. The output of MSC MARC is a J-integral value, which is converted into a stress intensity factor for evaluating the reliability of the 2D RPV. From the results of the calculation, it can be concluded that the SIF from the probabilistic method reached the limit value of fracture toughness earlier than the SIF from the deterministic method. The SIF generated by the probabilistic method is 105.240 MPa·m^0.5, while the SIF generated by the deterministic method is 100.876 MPa·m^0.5. Keywords: Uncertainty analysis, fracture mechanics, LHS, FEM, reactor pressure vessels
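
    For reference, the conversion from the computed J-integral to a stress intensity factor used in assessments of this kind follows the standard linear-elastic relation (assuming small-scale yielding),

        K = \sqrt{J\,E'}, \qquad E' = E \ \text{(plane stress)}, \qquad E' = \frac{E}{1-\nu^{2}} \ \text{(plane strain)},

    with E Young's modulus and ν Poisson's ratio; the resulting K is then compared against the fracture toughness, as in the reliability evaluation above.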

  15. 9th International Bielefeld Conference 2009: Upgrading the eLibrary: Enhanced Information Services Driven by Technology and Economics

    Directory of Open Access Journals (Sweden)

    Almuth Gastinger

    2009-10-01

    This article reports on the 9th International Bielefeld Conference 'Upgrading the eLibrary: Enhanced Information Services Driven by Technology and Economics', 3-5 February 2009, in Bielefeld, Germany. The conference focused on future challenges for libraries regarding the development of information services and infrastructures that meet the changing needs of scholarly communication, collaboration (e-science) and publication (open access), as well as new requirements regarding teaching and learning (virtual learning spaces). In addition, attention was paid to economic conditions and the strategic positioning of libraries as a general framework for information services.

  16. Model-driven software migration: a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component in software evolution. It requires a huge amount of time, manpower, and financial resources. The challenges are the size, age and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  17. ENTERPRISE OPERATION PLANNING IN THE CONDITIONS OF RISK AND UNCERTAINTY IN THE EXTERNAL AND INTERNAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Titov V. V.

    2017-09-01

    Full Text Available Optimization of enterprise activity planning, taking into account the risk and uncertainty of the external and internal environment, is a complex scientific and methodological problem. Its solution is important for planning practice, so the relevance of this research topic is beyond doubt. Planning is based on the use of a multilevel system of models. At the top level, the achievement of key strategic indicators is ensured by the development and implementation of innovations, mainly related to planning the release of new high-tech products. However, it is at this level that risks and uncertainties have the greatest impact on the planning processes for the development, production and marketing of new products. The scientific literature proposes using stochastic graphs with returns for this purpose, an idea also supported in this work; however, implementing it requires additional methodological development and quantitative calculations. The coordination of strategic decisions with tactical plans is based on the idea of eliminating, in tactical planning, the economic and other risks associated with the economic activity of the enterprise by creating stochastic reserves based on the implementation of additional innovations that ensure above-target sales volumes, profits and other indicators of the strategic plan. The organization of operational production management is represented by an iterative, sliding process (reducing risks in production), which is realized taking into account the limitations of tactical control.

  18. Integrating uncertainties for climate change mitigation

    Science.gov (United States)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping global average temperature increase to below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge about how the climate system precisely reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), which technologies will be available (technological uncertainty and choices), when we choose to start acting globally on climate change (political choices), and how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by

  19. Economic uncertainty, parental selection, and the criminal activity of the 'children of the wall'

    NARCIS (Netherlands)

    Chevalier, A.; Marie, O.

    2013-01-01

    We explore the link between parental selection and criminality of children in a new context. After the collapse of the Berlin Wall in 1989, East Germany experienced a very large, but temporary, drop in birth rates mostly driven by economic uncertainty. We exploit this natural experiment in a

  20. Predicting ecological responses in a changing ocean: the effects of future climate uncertainty.

    Science.gov (United States)

    Freer, Jennifer J; Partridge, Julian C; Tarling, Geraint A; Collins, Martin A; Genner, Martin J

    2018-01-01

    Predicting how species will respond to climate change is a growing field in marine ecology, yet knowledge of how to incorporate the uncertainty from future climate data into these predictions remains a significant challenge. To help overcome it, this review separates climate uncertainty into its three components (scenario uncertainty, model uncertainty, and internal model variability) and identifies four criteria that constitute a thorough interpretation of an ecological response to climate change in relation to these parts (awareness, access, incorporation, communication). Through a literature review, the extent to which the marine ecology community has addressed these criteria in their predictions was assessed. Despite a high awareness of climate uncertainty, articles favoured the most severe emission scenario, and only a subset of climate models were used as input into ecological analyses. In the case of sea surface temperature, these models can have projections that are unrepresentative of a larger ensemble mean. Moreover, 91% of studies failed to incorporate the internal variability of a climate model into their results. We explored the influence that the choice of emission scenario, climate model, and model realisation can have when predicting the future distribution of the pelagic fish Electrona antarctica. Future distributions were highly influenced by the choice of climate model, and in some cases internal variability was important in determining the direction and severity of the distribution change. Increased clarity and availability of processed climate data would facilitate more comprehensive explorations of climate uncertainty, and increase the quality and standard of marine prediction studies.

  1. On backstops and boomerangs. Environmental R and D under technological uncertainty

    International Nuclear Information System (INIS)

    Goeschl, Timo; Perino, Grischa

    2009-01-01

    In areas such as climate change, the recent economic literature has been emphasizing and addressing the pervasive presence of uncertainty. This paper considers a new and salient form of uncertainty, namely uncertainty regarding the environmental characteristics of 'green' innovations. Here, R and D may generate both backstop technologies and technologies that turn out to involve a new pollution problem ('boomerangs'). In the optimum, R and D will therefore typically be undertaken more than once. Extending results from multi-stage optimal control theory, we present a tractable model with a full characterization of the optimal pollution and R and D policies and the role of uncertainty. In this setting, (1) the optimal R and D program is defined by a research trigger condition in which the decision-maker's belief about the probability of finding a backstop enters in an intuitive way; (2) a decreasing probability of finding a backstop leads to the toleration of higher pollution levels, slower R and D, a slower turnover of technologies, and an ambiguous effect on the expected number of innovations; (3) learning about the probability of a backstop is driven by failures only and leads to decreasing research incentives; and (4) small to moderate delays in the resolution of technological uncertainty do not affect the optimal policy. (author)

  2. An adaptive control algorithm for optimization of intensity modulated radiotherapy considering uncertainties in beam profiles, patient set-up and internal organ motion

    International Nuclear Information System (INIS)

    Loef, Johan; Lind, Bengt K.; Brahme, Anders

    1998-01-01

    A new general beam optimization algorithm for inverse treatment planning is presented. It utilizes a new formulation of the probability to achieve complication-free tumour control. The new formulation explicitly describes the dependence of the treatment outcome on the incident fluence distribution, the patient geometry, the radiobiological properties of the patient and the fractionation schedule. In order to account for both measured and non-measured positioning uncertainties, the algorithm is based on a combination of dynamic and stochastic optimization techniques. Because of the difficulty in measuring all aspects of the intra- and interfractional variations in the patient geometry, such as internal organ displacements and deformations, these uncertainties are primarily accounted for in the treatment planning process by intensity modulation using stochastic optimization. The information about the deviations from the nominal fluence profiles and the nominal position of the patient relative to the beam that is obtained by portal imaging during treatment delivery, is used in a feedback loop to automatically adjust the profiles and the location of the patient for all subsequent treatments. Based on the treatment delivered in previous fractions, the algorithm furnishes optimal corrections for the remaining dose delivery both with regard to the fluence profile and its position relative to the patient. By dynamically refining the beam configuration from fraction to fraction, the algorithm generates an optimal sequence of treatments that very effectively reduces the influence of systematic and random set-up uncertainties to minimize and almost eliminate their overall effect on the treatment. Computer simulations have shown that the present algorithm leads to a significant increase in the probability of uncomplicated tumour control compared with the simple classical approach of adding fixed set-up margins to the internal target volume. (author)
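
    As a hedged illustration of the feedback idea only (not the authors' algorithm), the sketch below corrects the set-up for the next fraction using the running mean of the offsets measured by portal imaging, so that the systematic component of the error is progressively cancelled while random errors average out.

```python
# Hedged sketch of a per-fraction feedback correction; measured_offsets would be
# filled after each delivered fraction from portal-imaging registration (mm).
def correction_for_next_fraction(measured_offsets):
    """Shift applied before the next fraction, opposite to the mean measured error."""
    if not measured_offsets:
        return 0.0                       # no information yet: deliver the nominal plan
    return -sum(measured_offsets) / len(measured_offsets)

print(correction_for_next_fraction([2.1, 1.7, 2.4]))   # -> about -2.07 mm
```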

  3. Outcome and value uncertainties in global-change policy

    International Nuclear Information System (INIS)

    Hammitt, J.K.

    1995-01-01

    Choices among environmental policies can be informed by analysis of the potential physical, biological, and social outcomes of alternative choices, and analysis of social preferences among these outcomes. Frequently, however, the consequences of alternative policies cannot be accurately predicted because of substantial outcome uncertainties concerning physical, chemical, biological, and social processes linking policy choices to consequences. Similarly, assessments of social preferences among alternative outcomes are limited by value uncertainties arising from limitations of moral principles, the absence of economic markets for many environmental attributes, and other factors. Outcome and value uncertainties relevant to global-change policy are described and their magnitudes are examined for two cases: stratospheric-ozone depletion and global climate change. Analysis of information available in the mid 1980s, when international ozone regulations were adopted, suggests that contemporary uncertainties surrounding CFC emissions and the atmospheric response were so large that plausible ozone depletion, absent regulation, ranged from negligible to catastrophic, a range that exceeded the plausible effect of the regulations considered. Analysis of climate change suggests that, important as outcome uncertainties are, uncertainties about values may be even more important for policy choice. 53 refs., 3 figs., 3 tabs

  4. Helioseismic and neutrino data-driven reconstruction of solar properties

    Science.gov (United States)

    Song, Ningqiang; Gonzalez-Garcia, M. C.; Villante, Francesco L.; Vinyoles, Nuria; Serenelli, Aldo

    2018-06-01

    In this work, we use Bayesian inference to quantitatively reconstruct the solar properties most relevant to the solar composition problem using as inputs the information provided by helioseismic and solar neutrino data. In particular, we use a Gaussian process to model the functional shape of the opacity uncertainty to gain flexibility and become as free as possible from prejudice in this regard. With these tools we first readdress the statistical significance of the solar composition problem. Furthermore, starting from a composition unbiased set of standard solar models (SSMs) we are able to statistically select those with solar chemical composition and other solar inputs which better describe the helioseismic and neutrino observations. In particular, we are able to reconstruct the solar opacity profile in a data-driven fashion, independently of any reference opacity tables, obtaining a 4 per cent uncertainty at the base of the convective envelope and 0.8 per cent at the solar core. When systematic uncertainties are included, results are 7.5 per cent and 2 per cent, respectively. In addition, we find that the values of most of the other inputs of the SSMs required to better describe the helioseismic and neutrino data are in good agreement with those adopted as the standard priors, with the exception of the astrophysical factor S11 and the microscopic diffusion rates, for which data suggests a 1 per cent and 30 per cent reduction, respectively. As an output of the study we derive the corresponding data-driven predictions for the solar neutrino fluxes.
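
    A minimal sketch of the kind of Gaussian-process prior described above, assuming for illustration a squared-exponential kernel over a log-temperature grid for the fractional opacity perturbation; the grid, amplitude and correlation length are hypothetical, not the values inferred in the paper.

```python
# Hedged sketch: draw candidate fractional opacity-perturbation profiles delta(log T)
# from a zero-mean Gaussian-process prior with a squared-exponential kernel.
import numpy as np

logT = np.linspace(6.3, 7.2, 30)     # assumed grid from the convective-zone base to the core
amp, ell = 0.05, 0.3                 # hypothetical prior amplitude and correlation length
K = amp**2 * np.exp(-0.5 * (logT[:, None] - logT[None, :])**2 / ell**2)
K += 1e-10 * np.eye(logT.size)       # jitter for numerical stability

rng = np.random.default_rng(0)
profiles = rng.multivariate_normal(np.zeros(logT.size), K, size=5)
print(profiles.shape)                # (5, 30): five prior draws of the opacity perturbation
```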

  5. Uncertainties in scaling factors for ab initio vibrational zero-point energies

    Science.gov (United States)

    Irikura, Karl K.; Johnson, Russell D.; Kacker, Raghu N.; Kessel, Rüdiger

    2009-03-01

    Vibrational zero-point energies (ZPEs) determined from ab initio calculations are often scaled by empirical factors. An empirical scaling factor partially compensates for the effects arising from vibrational anharmonicity and incomplete treatment of electron correlation. These effects are not random but are systematic. We report scaling factors for 32 combinations of theory and basis set, intended for predicting ZPEs from computed harmonic frequencies. An empirical scaling factor carries uncertainty. We quantify and report, for the first time, the uncertainties associated with scaling factors for ZPE. The uncertainties are larger than generally acknowledged; the scaling factors have only two significant digits. For example, the scaling factor for B3LYP/6-31G(d) is 0.9757±0.0224 (standard uncertainty). The uncertainties in the scaling factors lead to corresponding uncertainties in predicted ZPEs. The proposed method for quantifying the uncertainties associated with scaling factors is based upon the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. We also present a new reference set of 60 diatomic and 15 polyatomic "experimental" ZPEs that includes estimated uncertainties.
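
    A hedged worked example of the propagation implied above; the scaling factor and its standard uncertainty are the values quoted in the abstract, while the harmonic ZPE is hypothetical.

```python
# Hedged sketch: apply a ZPE scaling factor and propagate its standard uncertainty
# following the GUM; the (usually smaller) uncertainty of the harmonic ZPE is neglected.
c, u_c = 0.9757, 0.0224        # B3LYP/6-31G(d) scaling factor and its standard uncertainty
zpe_harm = 100.0               # hypothetical harmonic ZPE, kJ/mol

zpe = c * zpe_harm
u_zpe = zpe_harm * u_c         # u(ZPE) = ZPE_harm * u(c) for this multiplicative model
print(f"ZPE = {zpe:.2f} +/- {u_zpe:.2f} kJ/mol (standard uncertainty, k = 1)")
# -> ZPE = 97.57 +/- 2.24 kJ/mol
```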

  6. The treatment of climate-driven environmental change and associated uncertainty in post-closure assessments

    International Nuclear Information System (INIS)

    Wilmot, R.D.

    1993-01-01

    The post-closure performance of radioactive waste repositories is influenced by a range of processes such as groundwater flow and fracture movement which are in turn affected by conditions in the surface environment. For deep repositories the period for which an assessment must be performed is in the order of 10^6 years. The geological record of the last 10^6 years shows that surface environmental conditions have varied considerably over such time-scales. A model of surface environmental change, known as TIME4, has been developed on behalf of the UK Department of the Environment for use with the probabilistic risk assessment code VANDAL. This paper describes the extent of surface environmental change, discusses possible driving mechanisms for such changes and summarises the processes which have been incorporated within the TIME4 model. The underlying cause of change in surface environment sub-systems is inferred to be climate change but considerable uncertainty remains over the mechanisms of such change. Methods for treating these uncertainties are described. (author)

  7. Uncertainty Model for Total Solar Irradiance Estimation on Australian Rooftops

    Science.gov (United States)

    Al-Saadi, Hassan; Zivanovic, Rastko; Al-Sarawi, Said

    2017-11-01

    Installations of solar panels on Australian rooftops have been on the rise for the last few years, especially in urban areas. This motivates academic researchers, distribution network operators and engineers to accurately address the level of uncertainty resulting from grid-connected solar panels. The main source of uncertainty is the intermittent nature of solar radiation; therefore, this paper presents a new model to estimate the total radiation incident on a tilted solar panel. The model is driven by the clearness index, described by a probability distribution, with special attention paid to Australian conditions through the use of a best-fit correlation for the diffuse fraction. The validity of the model is assessed with four goodness-of-fit techniques. In addition, the Quasi Monte Carlo and sparse grid methods are used as sampling and uncertainty computation tools, respectively. High-resolution solar irradiance data for the city of Adelaide were used for this assessment, with the outcome indicating satisfactory agreement between the observed data variation and the model.
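
    A minimal sketch of the chain from clearness index to tilted-plane irradiance. The Erbs-type diffuse-fraction correlation and the isotropic sky model below are generic stand-ins for the paper's Australian best-fit correlation, and Sobol points stand in for the Quasi Monte Carlo sampling; all inputs are assumed values.

```python
# Hedged sketch: sample the clearness index, split global irradiance into beam and
# diffuse parts with an Erbs-type correlation, and transpose to a tilted plane.
import numpy as np
from scipy.stats import qmc

def diffuse_fraction(kt):
    """Generic Erbs-type correlation, used only as a stand-in for the paper's best fit."""
    if kt <= 0.22:
        return 1.0 - 0.09 * kt
    if kt <= 0.80:
        return 0.9511 - 0.1604*kt + 4.388*kt**2 - 16.638*kt**3 + 12.336*kt**4
    return 0.165

G_ext, tilt_deg, rb, albedo = 1000.0, 30.0, 1.1, 0.2    # assumed inputs
beta = np.radians(tilt_deg)

kt_samples = 0.2 + 0.6 * qmc.Sobol(d=1, seed=2).random(256).ravel()   # kt in [0.2, 0.8]
G_tilt = []
for kt in kt_samples:
    G = kt * G_ext                        # global horizontal irradiance
    Gd = diffuse_fraction(kt) * G         # diffuse component
    Gb = G - Gd                           # beam component
    G_tilt.append(Gb*rb + Gd*(1 + np.cos(beta))/2 + G*albedo*(1 - np.cos(beta))/2)

print(f"tilted irradiance: mean {np.mean(G_tilt):.0f} W/m2, std {np.std(G_tilt):.0f} W/m2")
```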

  8. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis

  9. Economic uncertainty, parental selection, and the criminal activity of the ‘children of the wall’

    NARCIS (Netherlands)

    Chevalier, A.; Marie, O.

    2013-01-01

    We explore the link between parental selection and criminality of children in a new context. After the collapse of the Berlin Wall in 1989, East Germany experienced a very large, but temporary, drop in birth rates mostly driven by economic uncertainty. We exploit this natural experiment in a

  10. Handling Uncertainties within R&D Modules of a developing Technology

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Lindgaard; Perunovic, Zoran

    2004-01-01

    ... Third, the modules that have always been present in the insulin R&D enabled companies to develop mechanisms for internal learning and to master that part of the process. Finally, in the R&D, outsourcing is related to the whole knowledge acquisition, while it seems that minor uncertainties... ...and an interview conducted have generated the following. First, the further along the process train a module is, the more it accumulates uncertainties from previous modules. Second, with the growth of complexity, uncertainties grew as well, resulting in the necessity for companies to seek knowledge on them externally...

  11. Impact of nuclear data uncertainties on neutronics parameters of MYRRHA/XT-ADS

    International Nuclear Information System (INIS)

    Sugawara, T.; Stankovskiy, A.; Van den Eynde, G.; Sarotto, M.

    2011-01-01

    A flexible fast-spectrum research reactor, MYRRHA, able to operate in subcritical (driven by a proton accelerator) and critical modes, is being developed at SCK-CEN. In the framework of the IP EUROTRANS programme, the XT-ADS model has been investigated for MYRRHA. This paper reports the comparison of the sensitivity coefficients calculated for different calculation models and the uncertainties deduced from various covariance data, in order to discuss the reliability of the XT-ADS neutronics design. The sensitivity analysis is based on the comparison of three-dimensional heterogeneous and two-dimensional RZ calculation models. Three covariance data sets were employed to perform the uncertainty analysis. The obtained sensitivity coefficients differ substantially between the 3D heterogeneous and RZ homogenized calculation models. The uncertainties deduced from the covariance data strongly depend on the covariance data variation. The covariance data of the nuclear data libraries remain an open issue for discussing the reliability of the neutronics design. The uncertainties deduced from the covariance data for XT-ADS are 0.94% and 1.9% with the SCALE-6 44-group and TENDL-2009 covariance data, respectively. These uncertainties exceed the 0.3% Δk (confidence level 1σ) target accuracy. To achieve this target accuracy, the uncertainties should be reduced through experiments under adequate conditions, such as an LBE- or Pb-moderated environment with MOX or uranium fuel.
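
    The propagation step referred to above is commonly the 'sandwich rule', var(k) = S^T C S, which folds a relative covariance matrix of the nuclear data with the sensitivity coefficients; a hedged toy example with made-up numbers (not the XT-ADS data) is given below.

```python
# Hedged toy example of the sandwich rule; all numbers are illustrative only.
import numpy as np

S = np.array([0.02, -0.015, 0.05])      # relative sensitivities dk/k per dsigma/sigma
C = np.array([[0.04, 0.01, 0.00],       # relative covariance matrix of the cross sections
              [0.01, 0.09, 0.00],
              [0.00, 0.00, 0.02]])

var_k = S @ C @ S
print(f"relative uncertainty on k_eff: {np.sqrt(var_k):.2%}")   # ~0.90 %
```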

  12. Approach and methods to evaluate the uncertainty in system thermalhydraulic calculations

    International Nuclear Information System (INIS)

    D'Auria, F.

    2004-01-01

    The evaluation of uncertainty constitutes the necessary supplement to Best Estimate (BE) calculations performed to understand accident scenarios in water cooled nuclear reactors. The need comes from the imperfection of computational tools on the one side and from the interest in using such tools to get a more precise evaluation of safety margins on the other. In the present paper the approaches to uncertainty are outlined and the CIAU (Code with capability of Internal Assessment of Uncertainty) method proposed by the University of Pisa is described, including the ideas at its basis and results from applications. An activity in progress at the International Atomic Energy Agency (IAEA) is considered. Two approaches are distinguished, characterized as 'propagation of code input uncertainty' and 'propagation of code output errors'. For both methods, the thermal-hydraulic code is at the centre of the process of uncertainty evaluation: in the former case the code itself is adopted to compute the error bands and to propagate the input errors, in the latter case the errors in code application to relevant measurements are used to derive the error bands. The CIAU method exploits the idea of the 'status approach' for identifying the thermalhydraulic conditions of an accident in any Nuclear Power Plant (NPP). Errors in predicting such status are derived from the comparison between predicted and measured quantities and, in the stage of the application of the method, are used to compute the uncertainty. (author)

  13. Summary of existing uncertainty methods

    International Nuclear Information System (INIS)

    Glaeser, Horst

    2013-01-01

    A summary of the existing and most used uncertainty methods is presented, and their main features are compared. One of these methods is the order statistics method based on Wilks' formula. It is applied in safety research as well as in licensing. This method was first proposed by GRS for use in deterministic safety analysis, and is now used by many organisations world-wide. Its advantage is that the number of potentially uncertain input and output parameters is not limited to a small number. Such a limitation was necessary for the first demonstration of the Code Scaling Applicability Uncertainty (CSAU) method by the United States Nuclear Regulatory Commission (USNRC). They did not apply Wilks' formula in their statistical method propagating input uncertainties to obtain the uncertainty of a single output variable, like peak cladding temperature. A Phenomena Identification and Ranking Table (PIRT) was set up in order to limit the number of uncertain input parameters and, consequently, the number of calculations to be performed. Another purpose of such a PIRT process is to identify the most important physical phenomena which a computer code should be able to calculate. The validation of the code should be focused on the identified phenomena. Response surfaces are used in some applications, replacing the computer code for performing a high number of calculations. The second well known uncertainty method is the Uncertainty Methodology Based on Accuracy Extrapolation (UMAE) and the follow-up method 'Code with the Capability of Internal Assessment of Uncertainty' (CIAU) developed by the University of Pisa. Unlike the statistical approaches, the CIAU compares experimental data with calculation results; it does not consider uncertain input parameters. Therefore, the CIAU is highly dependent on the experimental database. The accuracy obtained from the comparison between experimental data and calculated results is extrapolated to obtain the uncertainty of the system code predictions.
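
    As a hedged illustration of the order-statistics idea, the first-order, one-sided form of Wilks' formula, 1 - gamma^N >= beta, gives the minimum number of code runs N such that the largest computed value bounds the 95th percentile of the output with 95% confidence.

```python
# Hedged sketch: smallest N satisfying the first-order, one-sided Wilks criterion.
import math

def wilks_runs(gamma=0.95, beta=0.95):
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

print(wilks_runs())    # -> 59 runs for a 95%/95% one-sided tolerance limit
```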

  14. Effect of user interpretation on uncertainty estimates: examples from the air-to-milk transfer of radiocesium

    International Nuclear Information System (INIS)

    Kirchner, G.; Ring Peterson, S.; Bergstroem, U.; Bushell, S.; Davis, P.; Filistovic, V.; Hinton, T.G.; Krajewski, P.; Riesen, T.; Uijt de Haag, P.

    1998-01-01

    An important source of uncertainty in predictions of numerical simulation codes of environmental transport processes arises from the assumptions made by the user when interpreting the model and the scenario to be assessed. This type of uncertainty was examined systematically in this study and was compared with uncertainty due to varying parameter values in a code. Three terrestrial food chain codes that are driven by deposition of radionuclides from the atmosphere were used by up to ten participants to predict total deposition of 137Cs and concentrations on pasture and in milk for two release scenarios. Collective uncertainty among the predictions of the ten users for concentrations in milk calculated for one scenario by one code was a factor of 2000, while the largest individual uncertainty was 20 times lower. Choice of parameter values contributed most to user-induced uncertainty, followed by scenario interpretation. Due to the significant disparity in predictions, it is recommended that assessments should not be carried out alone by a single code user. (Copyright (c) 1998 Elsevier Science B.V., Amsterdam. All rights reserved.)

  15. Data-free and data-driven spectral perturbations for RANS UQ

    Science.gov (United States)

    Edeling, Wouter; Mishra, Aashwin; Iaccarino, Gianluca

    2017-11-01

    Despite recent developments in high-fidelity turbulent flow simulations, RANS modeling is still vastly used by industry, due to its inherently low cost. Since accuracy is a concern in RANS modeling, model-form UQ is an essential tool for assessing the impacts of this uncertainty on quantities of interest. Applying the spectral decomposition to the modeled Reynolds-Stress Tensor (RST) allows for the introduction of decoupled perturbations into the baseline intensity (kinetic energy), shape (eigenvalues), and orientation (eigenvectors). This constitutes a natural methodology to evaluate the model-form uncertainty associated with different aspects of RST modeling. In a predictive setting, one frequently encounters an absence of any relevant reference data. To make data-free predictions with quantified uncertainty we employ physical bounds to define a priori the maximum spectral perturbations. When propagated, these perturbations yield intervals of engineering utility. High-fidelity data opens up the possibility of inferring a distribution of uncertainty by means of various data-driven machine-learning techniques. We will demonstrate our framework on a number of flow problems where RANS models are prone to failure. This research was partially supported by the Defense Advanced Research Projects Agency under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo), and the DOE PSAAP-II program.
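
    A minimal numpy sketch of the decoupled spectral perturbation described above (an assumed textbook form, not the authors' code): the anisotropy eigenvalues of a modeled Reynolds stress tensor are moved a fraction delta toward the one-component limiting state while the kinetic energy and eigenvectors are held fixed.

```python
# Hedged sketch: eigenvalue (shape) perturbation of a modeled Reynolds stress tensor.
import numpy as np

def perturb_rst(R, delta, target=(-1/3, -1/3, 2/3)):    # one-component corner, ascending order
    k = 0.5 * np.trace(R)                                # turbulent kinetic energy
    a = R / (2.0 * k) - np.eye(3) / 3.0                  # anisotropy tensor
    lam, V = np.linalg.eigh(a)                           # eigenvalues in ascending order
    lam_star = lam + delta * (np.array(target) - lam)    # move a fraction delta to the corner
    return 2.0 * k * (V @ np.diag(lam_star) @ V.T + np.eye(3) / 3.0)

R = np.array([[0.6, 0.1, 0.0],
              [0.1, 0.4, 0.0],
              [0.0, 0.0, 0.3]])                          # hypothetical modeled stresses
print(perturb_rst(R, delta=0.5))
```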

  16. Uncertainty calculation for modal parameters used with stochastic subspace identification: an application to a bridge structure

    Science.gov (United States)

    Hsu, Wei-Ting; Loh, Chin-Hsiung; Chao, Shu-Hsien

    2015-03-01

    The stochastic subspace identification method (SSI) has been proven to be an efficient algorithm for the identification of linear time-invariant systems using multivariate measurements. Generally, the modal parameters estimated through SSI may be afflicted with statistical uncertainty, e.g. due to undefined measurement noise, non-stationary excitation, a finite number of data samples, etc. Therefore, the identified results are subject to variance errors. Accordingly, the concept of the stabilization diagram can help users to identify the correct model, i.e. by removing the spurious modes. Modal parameters are estimated at successive model orders, where the physical modes of the system are extracted and separated from the spurious modes. In addition, an uncertainty computation scheme was derived for the calculation of uncertainty bounds for modal parameters at a given model order. The uncertainty bounds of damping ratios are particularly interesting, as damping ratios are difficult to estimate. In this paper, an automated stochastic subspace identification algorithm is addressed. First, the identification of modal parameters through covariance-driven stochastic subspace identification from output-only measurements is discussed. A systematic investigation of the criteria for the stabilization diagram is presented. Secondly, an automated algorithm for post-processing the stabilization diagram is demonstrated. Finally, the computation of uncertainty bounds for each mode at all model orders in the stabilization diagram is utilized to determine the system natural frequencies and damping ratios. A demonstration of this study on the system identification of a three-span steel bridge under operational conditions is presented. It is shown that the proposed new operation procedure for automated covariance-driven stochastic subspace identification can enhance robustness and reliability in structural health monitoring.
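
    A compact sketch of the covariance-driven identification step (an assumed textbook form, not the authors' automated procedure): output correlations are stacked into a block Toeplitz matrix, an SVD truncated at the chosen model order yields the observability matrix, and the poles of the extracted state matrix give natural frequencies and damping ratios.

```python
# Hedged sketch of covariance-driven stochastic subspace identification (SSI-COV).
import numpy as np

def ssi_cov(Y, dt, order, p=20):
    """Y: (channels, samples) output-only data; returns frequencies [Hz] and damping ratios."""
    l, N = Y.shape
    R = [Y[:, i:] @ Y[:, :N - i].T / (N - i) for i in range(2 * p)]      # output correlations
    T = np.block([[R[p + i - j] for j in range(p)] for i in range(p)])   # block Toeplitz matrix
    U, s, _ = np.linalg.svd(T)
    O = U[:, :order] * np.sqrt(s[:order])          # observability matrix at this model order
    A = np.linalg.pinv(O[:-l, :]) @ O[l:, :]       # shift invariance -> discrete state matrix
    lam = np.log(np.linalg.eigvals(A).astype(complex)) / dt              # continuous-time poles
    return np.abs(lam) / (2 * np.pi), -lam.real / np.abs(lam)

# Usage on a hypothetical single-channel acceleration record sampled at 100 Hz:
# freqs, zetas = ssi_cov(acc[np.newaxis, :], dt=0.01, order=6)
```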

  17. Practical Policy Applications of Uncertainty Analysis for National Greenhouse Gas Inventories

    International Nuclear Information System (INIS)

    Gillenwater, M.; Sussman, F.; Cohen, J.

    2007-01-01

    International policy makers and climate researchers use greenhouse gas emissions inventory estimates in a variety of ways. Because of the varied uses of the inventory data, as well as the high uncertainty surrounding some of the source category estimates, considerable effort has been devoted to understanding the causes and magnitude of uncertainty in national emissions inventories. In this paper, we focus on two aspects of the rationale for quantifying uncertainty: (1) the possible uses of the quantified uncertainty estimates for policy (e.g., as a means of adjusting inventories used to determine compliance with international commitments); and (2) the direct benefits of the process of investigating uncertainties in terms of improving inventory quality. We find that there are particular characteristics that an inventory uncertainty estimate should have if it is to be used for policy purposes: (1) it should be comparable across countries; (2) it should be relatively objective, or at least subject to review and verification; (3) it should not be subject to gaming by countries acting in their own self-interest; (4) it should be administratively feasible to estimate and use; (5) the quality of the uncertainty estimate should be high enough to warrant the additional compliance costs that its use in an adjustment factor may impose on countries; and (6) it should attempt to address all types of inventory uncertainty. Currently, inventory uncertainty estimates for national greenhouse gas inventories do not have these characteristics. For example, the information used to develop quantitative uncertainty estimates for national inventories is often based on expert judgments, which are, by definition, subjective rather than objective, and therefore difficult to review and compare. Further, the practical design of a potential factor to adjust inventory estimates using uncertainty estimates would require policy makers to (1) identify clear environmental goals; (2) define these

  18. Practical Policy Applications of Uncertainty Analysis for National Greenhouse Gas Inventories

    Energy Technology Data Exchange (ETDEWEB)

    Gillenwater, M. [Environmental Resources Trust (United States)], E-mail: mgillenwater@ert.net; Sussman, F.; Cohen, J. [ICF International (United States)

    2007-09-15

    International policy makers and climate researchers use greenhouse gas emissions inventory estimates in a variety of ways. Because of the varied uses of the inventory data, as well as the high uncertainty surrounding some of the source category estimates, considerable effort has been devoted to understanding the causes and magnitude of uncertainty in national emissions inventories. In this paper, we focus on two aspects of the rationale for quantifying uncertainty: (1) the possible uses of the quantified uncertainty estimates for policy (e.g., as a means of adjusting inventories used to determine compliance with international commitments); and (2) the direct benefits of the process of investigating uncertainties in terms of improving inventory quality. We find that there are particular characteristics that an inventory uncertainty estimate should have if it is to be used for policy purposes: (1) it should be comparable across countries; (2) it should be relatively objective, or at least subject to review and verification; (3) it should not be subject to gaming by countries acting in their own self-interest; (4) it should be administratively feasible to estimate and use; (5) the quality of the uncertainty estimate should be high enough to warrant the additional compliance costs that its use in an adjustment factor may impose on countries; and (6) it should attempt to address all types of inventory uncertainty. Currently, inventory uncertainty estimates for national greenhouse gas inventories do not have these characteristics. For example, the information used to develop quantitative uncertainty estimates for national inventories is often based on expert judgments, which are, by definition, subjective rather than objective, and therefore difficult to review and compare. Further, the practical design of a potential factor to adjust inventory estimates using uncertainty estimates would require policy makers to (1) identify clear environmental goals; (2) define these

  19. Wealth dynamics in a sentiment-driven market

    Science.gov (United States)

    Goykhman, Mikhail

    2017-12-01

    We study dynamics of a simulated world with stock and money, driven by the externally given processes which we refer to as sentiments. The considered sentiments influence the buy/sell stock trading attitude, the perceived price uncertainty, and the trading intensity of all or a part of the market participants. We study how the wealth of market participants evolves in time in such an environment. We discuss the opposite perspective in which the parameters of the sentiment processes can be inferred a posteriori from the observed market behavior.

  20. Quantifying Carbon Financial Risk in the International Greenhouse Gas Market: An Application Using Remotely-Sensed Data to Align Scientific Uncertainty with Financial Decisions

    Science.gov (United States)

    Hultman, N. E.

    2002-12-01

    A common complaint about environmental policy is that regulations inadequately reflect scientific uncertainty and scientific consensus. While the causes of this phenomenon are complex and hard to discern, we know that corporations are the primary implementers of environmental regulations; therefore, focusing on how policy relates scientific knowledge to corporate decisions can provide valuable insights. Within the context of the developing international market for greenhouse gas emissions, I examine how corporations would apply finance theory to their investment decisions for carbon abatement projects. Using remotely-sensed, ecosystem-scale carbon flux measurements, I show how to determine how much of the financial risk of carbon is diversifiable. I also discuss alternative, scientifically sound methods for hedging the non-diversifiable risks in carbon abatement projects. By providing a quantitative common language for scientific and corporate uncertainties, the concept of carbon financial risk provides an opportunity for expanding communication between these elements essential to successful climate policy.

  1. Modelling pesticide leaching under climate change: parameter vs. climate input uncertainty

    Directory of Open Access Journals (Sweden)

    K. Steffens

    2014-02-01

    Full Text Available Assessing climate change impacts on pesticide leaching requires careful consideration of different sources of uncertainty. We investigated the uncertainty related to climate scenario input and its importance relative to the parameter uncertainty of the pesticide leaching model. The pesticide fate model MACRO was calibrated against a comprehensive one-year field data set for a well-structured clay soil in south-western Sweden. We obtained an ensemble of 56 acceptable parameter sets that represented the parameter uncertainty. Nine different climate model projections of the regional climate model RCA3 were available, driven by different combinations of global climate models (GCMs), greenhouse gas emission scenarios and initial states of the GCM. The future time series of weather data used to drive the MACRO model were generated by scaling a reference climate data set (1970–1999) for an important agricultural production area in south-western Sweden, based on monthly change factors for 2070–2099. Thirty-year simulations were performed for different combinations of pesticide properties and application seasons. Our analysis showed that both the magnitude and the direction of the predicted change in pesticide leaching from present to future depended strongly on the particular climate scenario. The effect of parameter uncertainty was of major importance for simulating absolute pesticide losses, whereas the climate uncertainty was relatively more important for predictions of changes in pesticide losses from present to future. The climate uncertainty should be accounted for by applying an ensemble of different climate scenarios. The aggregated ensemble prediction based on both acceptable parameterizations and different climate scenarios has the potential to provide robust probabilistic estimates of future pesticide losses.
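
    A hedged sketch of the change-factor ('delta change') scaling mentioned above, with hypothetical monthly factors: additive shifts are applied to temperature and multiplicative factors to precipitation before the scaled series is used to drive the impact model.

```python
# Hedged sketch: scale a reference daily weather series with monthly change factors.
import numpy as np

def apply_change_factors(months, temp, precip, dT, fP):
    """months: month number (1-12) per daily record; dT: additive temperature change
    per calendar month [deg C]; fP: precipitation ratio per calendar month."""
    months = np.asarray(months)
    return temp + dT[months - 1], precip * fP[months - 1]

# Hypothetical factors for 2070-2099 relative to 1970-1999 (one value per calendar month)
dT = np.array([3.5, 3.4, 3.0, 2.6, 2.3, 2.1, 2.2, 2.4, 2.6, 2.9, 3.2, 3.5])
fP = np.array([1.15, 1.12, 1.08, 1.02, 0.97, 0.92, 0.90, 0.93, 1.00, 1.05, 1.10, 1.14])
```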

  2. Uncertainties in smart grids behavior and modeling: What are the risks and vulnerabilities? How to analyze them?

    Energy Technology Data Exchange (ETDEWEB)

    Zio, Enrico, E-mail: enrico.zio@ecp.fr [Ecole Centrale Paris - Supelec, Paris (France); Politecnico di Milano, Milano (Italy); Aven, Terje, E-mail: terje.aven@uis.no [University of Stavanger, Stavanger (Norway)

    2011-10-15

    This paper looks into the future world of smart grids from a rather different perspective than usual: that of uncertainties and the related risks and vulnerabilities. An analysis of the foreseen constituents of smart grids and of the technological, operational, economical and policy-driven challenges is carried out to identify and characterize the related uncertainties and associated risks and vulnerabilities. The focus is on the challenges posed to the representation and treatment of uncertainties in the performance assessment of such systems, given their complexity and high-level of integration of novel technologies. A general framework of analysis is proposed. - Highlights: > We have looked into the development and operation of smart grids for power distribution. > We have identified a number of uncertainties, which are expected to play an influential role. > We have provided some guidelines to address these issues, based on probability intervals.

  3. The cerebellum and decision making under uncertainty.

    Science.gov (United States)

    Blackwood, Nigel; Ffytche, Dominic; Simmons, Andrew; Bentall, Richard; Murray, Robin; Howard, Robert

    2004-06-01

    This study aimed to identify the neural basis of probabilistic reasoning, a type of inductive inference that aids decision making under conditions of uncertainty. Eight normal subjects performed two separate two-alternative-choice tasks (the balls in a bottle and personality survey tasks) while undergoing functional magnetic resonance imaging (fMRI). The experimental conditions within each task were chosen so that they differed only in their requirement to make a decision under conditions of uncertainty (probabilistic reasoning and frequency determination required) or under conditions of certainty (frequency determination required). The same visual stimuli and motor responses were used in the experimental conditions. We provide evidence that the neo-cerebellum, in conjunction with the premotor cortex, inferior parietal lobule and medial occipital cortex, mediates the probabilistic inferences that guide decision making under uncertainty. We hypothesise that the neo-cerebellum constructs internal working models of uncertain events in the external world, and that such probabilistic models subserve the predictive capacity central to induction. Copyright 2004 Elsevier B.V.

  4. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  5. Probabilistic Accident Consequence Uncertainty - A Joint CEC/USNRC Study

    International Nuclear Information System (INIS)

    Gregory, Julie J.; Harper, Frederick T.

    1999-01-01

    The joint USNRC/CEC consequence uncertainty study was chartered after the development of two new probabilistic accident consequence codes, MACCS in the U.S. and COSYMA in Europe. Both the USNRC and CEC had a vested interest in expanding the knowledge base of the uncertainty associated with consequence modeling, and teamed up to co-sponsor a consequence uncertainty study. The information acquired from the study was expected to provide understanding of the strengths and weaknesses of current models as well as a basis for direction of future research. This paper looks at the elicitation process implemented in the joint study and discusses some of the uncertainty distributions provided by eight panels of experts from the U.S. and Europe that were convened to provide responses to the elicitation. The phenomenological areas addressed by the expert panels include atmospheric dispersion and deposition, deposited material and external doses, food chain, early health effects, late health effects and internal dosimetry

  6. Probabilistic Accident Consequence Uncertainty - A Joint CEC/USNRC Study

    Energy Technology Data Exchange (ETDEWEB)

    Gregory, Julie J.; Harper, Frederick T.

    1999-07-28

    The joint USNRC/CEC consequence uncertainty study was chartered after the development of two new probabilistic accident consequence codes, MACCS in the U.S. and COSYMA in Europe. Both the USNRC and CEC had a vested interest in expanding the knowledge base of the uncertainty associated with consequence modeling, and teamed up to co-sponsor a consequence uncertainty study. The information acquired from the study was expected to provide understanding of the strengths and weaknesses of current models as well as a basis for direction of future research. This paper looks at the elicitation process implemented in the joint study and discusses some of the uncertainty distributions provided by eight panels of experts from the U.S. and Europe that were convened to provide responses to the elicitation. The phenomenological areas addressed by the expert panels include atmospheric dispersion and deposition, deposited material and external doses, food chain, early health effects, late health effects and internal dosimetry.

  7. Causal uncertainty, claimed and behavioural self-handicapping.

    Science.gov (United States)

    Thompson, Ted; Hepburn, Jonathan

    2003-06-01

    Causal uncertainty beliefs involve doubts about the causes of events, and arise as a consequence of non-contingent evaluative feedback: feedback that leaves the individual uncertain about the causes of his or her achievement outcomes. Individuals high in causal uncertainty are frequently unable to confidently attribute their achievement outcomes, experience anxiety in achievement situations and as a consequence are likely to engage in self-handicapping behaviour. Accordingly, we sought to establish links between trait causal uncertainty, claimed and behavioural self-handicapping. Participants were N=72 undergraduate students divided equally between high and low causally uncertain groups. We used a 2 (causal uncertainty status: high, low) x 3 (performance feedback condition: success, non-contingent success, non-contingent failure) between-subjects factorial design to examine the effects of causal uncertainty on achievement behaviour. Following performance feedback, participants completed 20 single-solution anagrams and 12 remote associate tasks serving as performance measures, and 16 unicursal tasks to assess practice effort. Participants also completed measures of claimed handicaps, state anxiety and attributions. Relative to low causally uncertain participants, high causally uncertain participants claimed more handicaps prior to performance on the anagrams and remote associates, reported higher anxiety, attributed their failure to internal, stable factors, and reduced practice effort on the unicursal tasks, evident in fewer unicursal tasks solved. These findings confirm links between trait causal uncertainty and claimed and behavioural self-handicapping, highlighting the need for educators to facilitate means by which students can achieve surety in the manner in which they attribute the causes of their achievement outcomes.

  8. Monkeys and humans take local uncertainty into account when localizing a change.

    Science.gov (United States)

    Devkar, Deepna; Wright, Anthony A; Ma, Wei Ji

    2017-09-01

    Since sensory measurements are noisy, an observer is rarely certain about the identity of a stimulus. In visual perception tasks, observers generally take their uncertainty about a stimulus into account when doing so helps task performance. Whether the same holds in visual working memory tasks is largely unknown. Ten human and two monkey subjects localized a single change in orientation between a sample display containing three ellipses and a test display containing two ellipses. To manipulate uncertainty, we varied the reliability of orientation information by making each ellipse more or less elongated (two levels); reliability was independent across the stimuli. In both species, a variable-precision encoding model equipped with an "uncertainty-indifferent" decision rule, which uses only the noisy memories, fitted the data poorly. In both species, a much better fit was provided by a model in which the observer also takes the levels of reliability-driven uncertainty associated with the memories into account. In particular, a measured change in a low-reliability stimulus was given lower weight than the same change in a high-reliability stimulus. We did not find strong evidence that observers took reliability-independent variations in uncertainty into account. Our results illustrate the importance of studying the decision stage in comparison tasks and provide further evidence for evolutionary continuity of working memory systems between monkeys and humans.
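
    A hedged sketch of an uncertainty-aware decision rule of the kind favoured by the data (a simplified stand-in, not the paper's full Bayesian model): the measured orientation change at each location is normalised by the reliability-driven uncertainty of the two measurements, so the same raw change counts for less when it comes from low-reliability ellipses.

```python
# Hedged sketch: report the location whose measured change is least consistent with
# "no change", normalising each change by its combined uncertainty.
import numpy as np

def localize_change(sample_meas, test_meas, sample_sigma, test_sigma):
    """All inputs are per-location arrays; the sigmas encode reliability-driven uncertainty."""
    z = np.abs(np.asarray(test_meas) - np.asarray(sample_meas))
    z = z / np.sqrt(np.asarray(sample_sigma)**2 + np.asarray(test_sigma)**2)
    return int(np.argmax(z))           # index of the reported change location

# A 20 deg change at a low-reliability location loses to a 15 deg change at a high-reliability one.
print(localize_change([10, 40], [30, 55], [20, 5], [20, 5]))   # -> 1
```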

  9. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. The FIDUCEO project (www.fiduceo.eu) is

  10. Introducing blended learning: An experience of uncertainty for students in the United Arab Emirates

    Directory of Open Access Journals (Sweden)

    Linzi J. Kemp

    2013-05-01

    Full Text Available The cultural dimension of Uncertainty Avoidance is analysed in this study of an introduction to blended learning for international students. Content analysis was conducted on the survey narratives collected from three cohorts of management undergraduates in the United Arab Emirates. Interpretation of certainty with blended learning was found in: student skills with technology; student acknowledgement of course organisation; and student appreciation of online feedback. Uncertainty with the introduction of blended learning was found: when membership was assigned for group work; when higher-quality research methods were introduced; where course structure lacked detail; and where increased time was required for new and different online activities. These international students, from countries with a high score on Uncertainty Avoidance, exhibited that dimension when introduced to blended learning. The implications of these findings are discussed, and strategies suggested for introducing blended learning to international students. The limitations of the study are considered, and a direction for future research is suggested. This is the first study on undergraduates in the Middle East of the effects of a cultural dimension when introducing blended learning. The findings increase the body of knowledge that relates to learning technology in the international business classroom.

  11. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    International Nuclear Information System (INIS)

    Kirchner, G.; Peterson, R.

    1996-11-01

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test user's influence on model predictions on a more systematic basis than has been done before. The main goals were as follows: To compare differences between predictions from different people all using the same model and the same scenario description with the statistical uncertainties calculated by the model. To investigate the main reasons for different interpretations by users. To create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  12. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Kirchner, G. [Univ. of Bremen (Germany); Peterson, R. [AECL, Chalk River, ON (Canada)] [and others

    1996-11-01

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test user's influence on model predictions on a more systematic basis than has been done before. The main goals were as follows: To compare differences between predictions from different people all using the same model and the same scenario description with the statistical uncertainties calculated by the model. To investigate the main reasons for different interpretations by users. To create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  13. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  14. Assessing concentration uncertainty estimates from passive microwave sea ice products

    Science.gov (United States)

    Meier, W.; Brucker, L.; Miller, J. A.

    2017-12-01

    Sea ice concentration is an essential climate variable and passive microwave derived estimates of concentration are one of the longest satellite-derived climate records. However, until recently uncertainty estimates were not provided. Numerous validation studies provided insight into general error characteristics, but the studies have found that concentration error varied greatly depending on sea ice conditions. Thus, an uncertainty estimate from each observation is desired, particularly for initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.

  15. Visualizing the uncertainty in the relationship between seasonal average climate and malaria risk.

    Science.gov (United States)

    MacLeod, D A; Morse, A P

    2014-12-02

    Around $1.6 billion per year is spent financing anti-malaria initiatives, and though malaria morbidity is falling, the impact of annual epidemics remains significant. Whilst malaria risk may increase with climate change, projections are highly uncertain, and to sidestep this intractable uncertainty, adaptation efforts should improve societal ability to anticipate and mitigate individual events. Anticipation of climate-related events is made possible by seasonal climate forecasting, from which warnings of anomalous seasonal average temperature and rainfall are possible months in advance. Seasonal climate hindcasts have been used to drive climate-based models for malaria, showing significant skill for observed malaria incidence. However, the relationship between seasonal average climate and malaria risk remains unquantified. Here we explore this relationship, using a dynamic weather-driven malaria model. We also quantify key uncertainty in the malaria model, by introducing variability in one of the first-order uncertainties in model formulation. Results are visualized as location-specific impact surfaces: easily integrated with ensemble seasonal climate forecasts, and intuitively communicating quantified uncertainty. Methods are demonstrated for two epidemic regions, and are not limited to malaria modeling; the visualization method could be applied to any climate impact.

  16. Redesign of a pilot international online course on accelerator driven systems for nuclear transmutation to implement a massive open online course

    Energy Technology Data Exchange (ETDEWEB)

    Alonso-Ramos, M.; Fernandez-Luna, A. J.; Gonzalez-Romero, E. M.; Sanchez-Elvira, A.; Castro, M.; Ogando, F.; Sanz, J.; Martin, S.

    2014-07-01

    In April 2013, a full-distance international pilot course on ADS (Accelerator Driven Systems) for advanced nuclear waste transmutation was taught by UNED-CIEMAT within the FP7 ENEN-III project. The experience ran with 10 trainees from the project, using UNED's virtual learning platform aLF. Video classes, web-conferences and recorded simulations of case studies were the main learning materials. Asynchronous and synchronous communication tools were used for tutoring purposes, and a final examination for online submission and a final survey were included. (Author)

  17. Redesign of a pilot international online course on accelerator driven systems for nuclear transmutation to implement a massive open online course

    International Nuclear Information System (INIS)

    Alonso-Ramos, M.; Fernandez-Luna, A. J.; Gonzalez-Romero, E. M.; Sanchez-Elvira, A.; Castro, M.; Ogando, F.; Sanz, J.; Martin, S.

    2014-01-01

    In April 2013, a full-distance international pilot course on ADS (Accelerator Driven Systems) for advanced nuclear waste transmutation was taught by UNED-CIEMAT within the FP7 ENEN-III project. The experience ran with 10 trainees from the project, using UNED's virtual learning platform aLF. Video classes, web-conferences and recorded simulations of case studies were the main learning materials. Asynchronous and synchronous communication tools were used for tutoring purposes, and a final examination for online submission and a final survey were included. (Author)

  18. Closed-loop suppression of chaos in nonlinear driven oscillators

    Science.gov (United States)

    Aguirre, L. A.; Billings, S. A.

    1995-05-01

    This paper discusses the suppression of chaos in nonlinear driven oscillators via the addition of a periodic perturbation. Given a system originally undergoing chaotic motions, it is desired that such a system be driven to some periodic orbit. This can be achieved by the addition of a weak periodic signal to the oscillator input. This is usually accomplished in open loop, but this procedure presents some difficulties which are discussed in the paper. To ensure that this is attained despite uncertainties and possible disturbances on the system, a procedure is suggested to perform control in closed loop. In addition, it is illustrated how a model, estimated from input/output data, can be used in the design. Numerical examples which use the Duffing-Ueda and modified van der Pol oscillators are included to illustrate some of the properties of the new approach.

  19. A Data-Driven Stochastic Reactive Power Optimization Considering Uncertainties in Active Distribution Networks and Decomposition Method

    DEFF Research Database (Denmark)

    Ding, Tao; Yang, Qingrun; Yang, Yongheng

    2018-01-01

    To address the uncertain output of distributed generators (DGs) for reactive power optimization in active distribution networks, the stochastic programming model is widely used. The model is employed to find an optimal control strategy with minimum expected network loss while satisfying all......, in this paper, a data-driven modeling approach is introduced to assume that the probability distribution from the historical data is uncertain within a confidence set. Furthermore, a data-driven stochastic programming model is formulated as a two-stage problem, where the first-stage variables find the optimal...... control for discrete reactive power compensation equipment under the worst probability distribution of the second stage recourse. The second-stage variables are adjusted to uncertain probability distribution. In particular, this two-stage problem has a special structure so that the second-stage problem...

  20. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    Science.gov (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. To guide the design of multiplier method population size estimation studies using respondent-driven sampling surveys to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
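
    A minimal sketch of the multiplier-method calculation summarized above (population size N = M/P) with a delta-method confidence interval; the numbers, the assumed design effect, the treatment of M as error-free and the normal approximation are illustrative assumptions, not values or procedures from the Harare study.

        import math

        M = 5000        # hypothetical count of unique objects distributed (treated as known here)
        P_hat = 0.25    # hypothetical proportion reporting receipt in the RDS survey
        n = 400         # hypothetical survey sample size
        deff = 2.0      # assumed design effect of the respondent-driven sampling survey

        N_hat = M / P_hat                            # multiplier-method population size estimate
        var_P = deff * P_hat * (1.0 - P_hat) / n     # approximate variance of P_hat
        se_N = M * math.sqrt(var_P) / P_hat ** 2     # delta method: |dN/dP| = M / P^2
        lo, hi = N_hat - 1.96 * se_N, N_hat + 1.96 * se_N
        print(f"N = {N_hat:.0f}, approximate 95% CI ({lo:.0f}, {hi:.0f})")

    With these illustrative inputs the interval is wide, which mirrors the abstract's point that random error grows quickly when P is small or the design effect is large.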

  1. Laser-driven polarized sources of hydrogen and deuterium

    International Nuclear Information System (INIS)

    Young, L.; Holt, R.J.; Green, M.C.; Kowalczyk, R.S.

    1988-01-01

    A novel laser-driven polarized source of hydrogen and deuterium which operates on the principle of spin exchange optical pumping is described. The advantages of this method over conventional polarized sources for internal target experiments are presented. Technological difficulties which prevent ideal source operation are outlined along with proposed solutions. At present, the laser-driven polarized hydrogen source delivers 8 × 10^16 atoms/s with a polarization (P_z) of 24%. 9 refs., 2 figs.

  2. Impulsive control of permanent magnet synchronous motors with parameters uncertainties

    International Nuclear Information System (INIS)

    Li Dong; Zhang Xiaohong; Wang Shilong; Yan Dan; Wang Hui

    2008-01-01

    The permanent magnet synchronous motors (PMSMs) may have chaotic behaviours for uncertain values of parameters or under certain working conditions, which threatens the secure and stable operation of motor-driven systems. It is important to study methods of controlling or suppressing chaos in PMSMs. In this paper, the robust stability of PMSMs with parameter uncertainties is investigated. After the uncertain matrices which represent the variable system parameters are formulated through matrix analysis, a novel asymptotical stability criterion is established. Some illustrative examples are also given to show the effectiveness of the obtained results.

  3. Towards minimizing measurement uncertainty in total petroleum hydrocarbon determination by GC-FID

    Energy Technology Data Exchange (ETDEWEB)

    Saari, E.

    2009-07-01

    Despite tightened environmental legislation, spillages of petroleum products remain a serious problem worldwide. The environmental impacts of these spillages are always severe and reliable methods for the identification and quantitative determination of petroleum hydrocarbons in environmental samples are therefore needed. Great improvements in the definition and analysis of total petroleum hydrocarbons (TPH) were finally introduced by international organizations for standardization in 2004. This brought some coherence to the determination and, nowadays, most laboratories seem to employ ISO/DIS 16703:2004, ISO 9377-2:2000 and CEN prEN 14039:2004:E draft international standards for analysing TPH in soil. The implementation of these methods, however, usually fails because the reliability of petroleum hydrocarbon determination has proved to be poor. This thesis describes the assessment of measurement uncertainty for TPH determination in soil. Chemometric methods were used to both estimate the main uncertainty sources and identify the most significant factors affecting these uncertainty sources. The method used for the determinations was based on gas chromatography utilizing flame ionization detection (GC-FID). Chemometric methodology applied in estimating measurement uncertainty for TPH determination showed that the measurement uncertainty is in actual fact dominated by the analytical uncertainty. Within the specific concentration range studied, the analytical uncertainty accounted for as much as 68-80% of the measurement uncertainty. The robustness of the analytical method used for petroleum hydrocarbon determination was then studied in more detail. A two-level Plackett-Burman design and a D-optimal design were utilized to assess the main analytical uncertainty sources of the sample treatment and GC determination procedures. It was also found that matrix-induced systematic error may significantly reduce the reliability of petroleum hydrocarbon determination.

  4. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    Science.gov (United States)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as future evolution of the river basin, hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrology processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.

  5. Evaluating variability and uncertainty in radiological impact assessment using SYMBIOSE

    International Nuclear Information System (INIS)

    Simon-Cornu, M.; Beaugelin-Seiller, K.; Boyer, P.; Calmon, P.; Garcia-Sanchez, L.; Mourlon, C.; Nicoulaud, V.; Sy, M.; Gonze, M.A.

    2015-01-01

    SYMBIOSE is a modelling platform that accounts for variability and uncertainty in radiological impact assessments, when simulating the environmental fate of radionuclides and assessing doses to human populations. The default database of SYMBIOSE is partly based on parameter values that are summarized within International Atomic Energy Agency (IAEA) documents. To characterize uncertainty on the transfer parameters, 331 Probability Distribution Functions (PDFs) were defined from the summary statistics provided within the IAEA documents (i.e. sample size, minimal and maximum values, arithmetic and geometric means, standard and geometric standard deviations) and are made available as spreadsheet files. The methods used to derive the PDFs without complete data sets, but merely the summary statistics, are presented. Then, a simple case-study illustrates the use of the database in a second-order Monte Carlo calculation, separating parametric uncertainty and inter-individual variability. - Highlights: • Parametric uncertainty in radioecology was derived from IAEA documents. • 331 Probability Distribution Functions were defined for transfer parameters. • Parametric uncertainty and inter-individual variability were propagated
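
    A minimal sketch of a second-order (nested) Monte Carlo calculation of the kind mentioned above, with an outer loop over parametric uncertainty and an inner loop over inter-individual variability; the distributions, parameter names and sample sizes are illustrative assumptions, not the SYMBIOSE database values.

        import numpy as np

        rng = np.random.default_rng(0)
        n_outer, n_inner = 200, 500          # assumed numbers of uncertainty / variability samples

        mean_doses = []
        for _ in range(n_outer):
            # Outer loop: one realisation of an uncertain transfer parameter (parametric uncertainty)
            transfer_factor = rng.lognormal(mean=np.log(1e-3), sigma=0.5)
            # Inner loop: inter-individual variability, e.g. in annual intake (hypothetical units)
            intakes = rng.lognormal(mean=np.log(500.0), sigma=0.3, size=n_inner)
            mean_doses.append(np.mean(intakes * transfer_factor))

        lo, hi = np.percentile(mean_doses, [2.5, 97.5])
        print(f"95% parametric-uncertainty interval on the mean dose: [{lo:.3g}, {hi:.3g}]")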

  6. Status of XSUSA for sampling based nuclear data uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Zwermann, W.; Gallner, L.; Klein, M.; Krzydacz-Hausmann; Pasichnyk, I.; Pautz, A.; Velkov, K.

    2013-01-01

    In the present contribution, an overview of the sampling based XSUSA method for sensitivity and uncertainty analysis with respect to nuclear data is given. The focus is on recent developments and applications of XSUSA. These applications include calculations for critical assemblies, fuel assembly depletion calculations, and steady state as well as transient reactor core calculations. The analyses are partially performed in the framework of international benchmark working groups (UACSA - Uncertainty Analyses for Criticality Safety Assessment, UAM - Uncertainty Analysis in Modelling). It is demonstrated that particularly for full-scale reactor calculations the influence of the nuclear data uncertainties on the results can be substantial. For instance, for the radial fission rate distributions of mixed UO2/MOX light water reactor cores, the 2σ uncertainties in the core centre and periphery can reach values exceeding 10%. For a fast transient, the resulting time behaviour of the reactor power was covered by a wide uncertainty band. Overall, the results confirm the necessity of adding systematic uncertainty analyses to best-estimate reactor calculations. (authors)
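
    A generic sketch of the sampling-based idea behind such analyses: correlated nuclear-data perturbations are drawn from an assumed covariance matrix and the spread of the resulting responses is read off. The covariance values and the linear stand-in for the transport calculation are assumptions for illustration only, not the XSUSA implementation.

        import numpy as np

        rng = np.random.default_rng(1)
        n_samples = 500
        nominal = np.array([1.0, 1.0])                 # nominal relative cross sections (illustrative)
        cov = np.array([[4.0e-4, 1.0e-4],              # assumed relative covariance matrix
                        [1.0e-4, 9.0e-4]])
        samples = rng.multivariate_normal(nominal, cov, size=n_samples)

        def response(xs):
            # Stand-in for a full transport calculation (a real analysis reruns the code per sample)
            return 1.0 + 0.6 * (xs[0] - 1.0) - 0.4 * (xs[1] - 1.0)

        k_eff = np.array([response(s) for s in samples])
        print(f"mean k-eff {k_eff.mean():.5f}, 2-sigma spread {2 * k_eff.std(ddof=1):.5f}")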

  7. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.

  8. On the uncertainties in effective dose estimates of adult CT head scans

    International Nuclear Information System (INIS)

    Gregory, Kent J.; Bibbo, Giovanni; Pattison, John E.

    2008-01-01

    Estimates of the effective dose to adult patients from computed tomography (CT) head scanning can be calculated using a number of different methods. These estimates can be used for a variety of purposes, such as improving scanning protocols, comparing different CT imaging centers, and weighing the benefits of the scan against the risk of radiation-induced cancer. The question arises: What is the uncertainty in these effective dose estimates? This study calculates the uncertainty of effective dose estimates produced by three computer programs (CT-EXPO, CTDosimetry, and ImpactDose) and one method that makes use of dose-length product (DLP) values. Uncertainties were calculated in accordance with an internationally recognized uncertainty analysis guide. For each of the four methods, the smallest and largest overall uncertainties (stated at the 95% confidence interval) were: 20%-31% (CT-EXPO), 15%-28% (CTDosimetry), 20%-36% (ImpactDose), and 22%-32% (DLP), respectively. The overall uncertainties for each method vary due to differences in the uncertainties of factors used in each method. The smallest uncertainties apply when the CT dose index for the scanner has been measured using a calibrated pencil ionization chamber
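
    A minimal GUM-style sketch of how relative uncertainty components combine in quadrature for a DLP-based estimate (E = DLP × conversion coefficient); the component magnitudes and input values below are hypothetical and do not reproduce the study's figures.

        import math

        DLP = 950.0       # mGy*cm, hypothetical console-reported dose-length product
        k_conv = 0.0021   # mSv/(mGy*cm), adult-head conversion coefficient (assumed value)
        E = DLP * k_conv  # effective dose estimate, mSv

        # Relative standard uncertainties of the multiplicative factors (hypothetical)
        components = {
            "DLP calibration": 0.05,
            "conversion coefficient": 0.10,
            "patient size and positioning": 0.07,
        }
        u_rel = math.sqrt(sum(u ** 2 for u in components.values()))  # combine in quadrature
        U_rel = 2.0 * u_rel                                          # expanded, coverage factor k = 2
        print(f"E = {E:.2f} mSv, expanded uncertainty about {100 * U_rel:.0f}% (k = 2)")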

  9. Dealing with uncertainties in the context of post mining hazard evaluation

    OpenAIRE

    Cauvin, Maxime; Salmon, Romuald; Verdel, Thierry

    2008-01-01

    International audience; Risk analyses related to a past mining activity are generally performed in a context of strong uncertainty. A PhD thesis was undertaken in 2004 in order to develop ways of taking these uncertainties into account in practice. The possibility of elaborating a more quantified evaluation of risk has also been discussed, and in particular the contribution that probabilistic methods may bring to an analysis. This paper summarizes the main results of the Thesis...

  10. Managing uncertainty in potential supplier identification

    OpenAIRE

    Ye, Yun; Jankovic, Marija; Okudan Kremer, Gül E.; Bocquet, Jean-Claude

    2014-01-01

    International audience; As a benefit of modularization of complex systems, Original Equipment Manufacturers (OEMs) can choose suppliers in a less constricted way when faced with new or evolving requirements. Before a supplier is selected for each module, a group of potential suppliers should be identified in order to control the uncertainty along with other performance measures of the new system development. In modular design, because suppliers are more involved in system development, the pot...

  11. Uncertainty of inhalation dose coefficients for representative physical and chemical forms of iodine-131

    Science.gov (United States)

    Harvey, Richard Paul, III

    Releases of radioactive material have occurred at various Department of Energy (DOE) weapons facilities and facilities associated with the nuclear fuel cycle in the generation of electricity. Many different radionuclides have been released to the environment with resulting exposure of the population to these various sources of radioactivity. Radioiodine has been released from a number of these facilities and is a potential public health concern due to its physical and biological characteristics. Iodine exists as various isotopes, but our focus is on 131I due to its relatively long half-life, its prevalence in atmospheric releases and its contribution to offsite dose. The assumption of physical and chemical form is speculated to have a profound impact on the deposition of radioactive material within the respiratory tract. In the case of iodine, it has been shown that more than one type of physical and chemical form may be released to, or exist in, the environment; iodine can exist as a particle or as a gas. The gaseous species can be further segregated based on chemical form: elemental, inorganic, and organic iodides. Chemical compounds in each class are assumed to behave similarly with respect to biochemistry. Studies at Oak Ridge National Laboratories have demonstrated that 131I is released as a particulate, as well as in elemental, inorganic and organic chemical form. The internal dose estimate from 131I may be very different depending on the effect that chemical form has on fractional deposition, gas uptake, and clearance in the respiratory tract. There are many sources of uncertainty in the estimation of environmental dose including source term, airborne transport of radionuclides, and internal dosimetry. Knowledge of uncertainty in internal dosimetry is essential for estimating dose to members of the public and for determining total uncertainty in dose estimation. An important calculational step in any lung model is the regional estimation of deposition fractions

  12. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  13. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  14. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  15. Data Fusion: A decision analysis tool that quantifies geological and parametric uncertainty

    International Nuclear Information System (INIS)

    Porter, D.W.

    1995-01-01

    Engineering projects such as siting waste facilities and performing remediation are often driven by geological and hydrogeological uncertainties. Geological understanding and hydrogeological parameters such as hydraulic conductivity are needed to achieve reliable engineering design. Information from non-invasive and minimally invasive data sets offers potential for reduction in uncertainty, but a single data type does not usually meet all needs. Data Fusion uses Bayesian statistics to update prior knowledge with information from diverse data sets as the data is acquired. Prior knowledge takes the form of first principles models (e.g., groundwater flow) and spatial continuity models for heterogeneous properties. The variability of heterogeneous properties is modeled in a form motivated by statistical physics as a Markov random field. A computer reconstruction of targets of interest is produced within a quantified statistical uncertainty. The computed uncertainty provides a rational basis for identifying data gaps for assessing data worth to optimize data acquisition. Further, the computed uncertainty provides a way to determine the confidence of achieving adequate safety margins in engineering design. Beyond design, Data Fusion provides the basis for real time computer monitoring of remediation. Working with the DOE Office of Technology (OTD), the authors have developed and patented a Data Fusion Workstation system that has been used on jobs at the Hanford, Savannah River, Pantex and Fernald DOE sites. Further, applications include an army depot at Letterkenney, PA and commercial industrial sites
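
    A much-simplified sketch of the Bayesian updating idea at the core of the approach, here as a conjugate normal update of a log-conductivity prior with one new measurement; the Markov-random-field spatial model and first-principles flow models of the actual Data Fusion Workstation are not represented, and all numbers are hypothetical.

        import math

        # Prior belief about log10 hydraulic conductivity (from geological knowledge)
        mu_prior, sd_prior = -5.0, 1.0
        # New site measurement with its own standard uncertainty
        y, sd_meas = -4.2, 0.5

        # Conjugate normal-normal update (precision weighting)
        w_prior, w_meas = 1.0 / sd_prior ** 2, 1.0 / sd_meas ** 2
        mu_post = (w_prior * mu_prior + w_meas * y) / (w_prior + w_meas)
        sd_post = math.sqrt(1.0 / (w_prior + w_meas))
        print(f"posterior log10(K): {mu_post:.2f} +/- {sd_post:.2f}")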

  16. Data Fusion: A decision analysis tool that quantifies geological and parametric uncertainty

    International Nuclear Information System (INIS)

    Porter, D.W.

    1996-01-01

    Engineering projects such as siting waste facilities and performing remediation are often driven by geological and hydrogeological uncertainties. Geological understanding and hydrogeological parameters such as hydraulic conductivity are needed to achieve reliable engineering design. Information from non-invasive and minimally invasive data sets offers potential for reduction in uncertainty, but a single data type does not usually meet all needs. Data Fusion uses Bayesian statistics to update prior knowledge with information from diverse data sets as the data is acquired. Prior knowledge takes the form of first principles models (e.g., groundwater flow) and spatial continuity models for heterogeneous properties. The variability of heterogeneous properties is modeled in a form motivated by statistical physics as a Markov random field. A computer reconstruction of targets of interest is produced within a quantified statistical uncertainty. The computed uncertainty provides a rational basis for identifying data gaps for assessing data worth to optimize data acquisition. Further, the computed uncertainty provides a way to determine the confidence of achieving adequate safety margins in engineering design. Beyond design, Data Fusion provides the basis for real time computer monitoring of remediation. Working with the DOE Office of Technology (OTD), the author has developed and patented a Data Fusion Workstation system that has been used on jobs at the Hanford, Savannah River, Pantex and Fernald DOE sites. Further applications include an army depot at Letterkenney, PA and commercial industrial sites

  17. Uncertainties estimation in surveying measurands: application to lengths, perimeters and areas

    Science.gov (United States)

    Covián, E.; Puente, V.; Casero, M.

    2017-10-01

    The present paper develops a series of methods for the estimation of uncertainty when measuring certain measurands of interest in surveying practice, such as point elevations given a planimetric position within a triangle mesh, 2D and 3D lengths (including perimeter enclosures), 2D areas (horizontal surfaces) and 3D areas (natural surfaces). The basis for the proposed methodology is the law of propagation of variance-covariance, which, applied to the corresponding model for each measurand, allows calculating the resulting uncertainty from known measurement errors. The methods are tested first in a small example, with a limited number of measurement points, and then in two real-life measurements. In addition, the proposed methods have been incorporated into commercial software used in the field of surveying engineering and focused on the creation of digital terrain models. The aim of this evolution is, firstly, to comply with the guidelines of the BIPM (Bureau International des Poids et Mesures), as the international reference agency in the field of metrology, in relation to the determination and expression of uncertainty; and secondly, to improve the quality of the measurement by indicating the uncertainty associated with a given level of confidence. The conceptual and mathematical developments for the uncertainty estimation in the aforementioned cases were conducted by researchers from the AssIST group at the University of Oviedo, eventually resulting in several different mathematical algorithms implemented in the form of MATLAB code. Based on these prototypes, technicians incorporated the referred functionality into commercial software, developed in C++. As a result of this collaboration, in early 2016 a new version of this commercial software was made available, which will be the first, as far as the authors are aware, that incorporates the possibility of estimating the uncertainty for a given level of confidence when computing the aforementioned surveying
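
    A minimal sketch of the law of propagation of variance applied to one of the simplest measurands mentioned above, a 2D length between two surveyed points with independent, equal coordinate uncertainties; the coordinates and uncertainty values are hypothetical, and the published algorithms handle full variance-covariance matrices rather than this special case.

        import math

        # Hypothetical surveyed coordinates (metres) and per-coordinate standard uncertainty
        x1, y1, x2, y2 = 100.000, 200.000, 160.000, 280.000
        s_c = 0.020

        dx, dy = x2 - x1, y2 - y1
        L = math.hypot(dx, dy)
        # Partial derivatives of L with respect to (x1, y1, x2, y2)
        J = (-dx / L, -dy / L, dx / L, dy / L)
        u_L = math.sqrt(sum((j * s_c) ** 2 for j in J))   # independent, equal coordinate errors
        print(f"L = {L:.3f} m, u(L) = {u_L:.3f} m (k = 1)")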

  18. 48th Annual meeting on nuclear technology (AMNT 2017). Key topic / Enhanced safety and operation excellence. Focus session: Uncertainty analyses in reactor core simulations

    Energy Technology Data Exchange (ETDEWEB)

    Zwermann, Winfried [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany). Forschungszentrum]

    2017-12-15

    The supplementation of reactor simulations by uncertainty analyses is becoming increasingly important internationally due to the fact that the reliability of simulation calculations can be significantly increased by the quantification of uncertainties in comparison to the use of so-called conservative methods (BEPU - 'Best-Estimate plus Uncertainties'). While systematic uncertainty analyses for thermo-hydraulic calculations have been performed routinely for a long time, methods for taking into account uncertainties in nuclear data, which are the basis for neutron transport calculations, are under development. The Focus Session Uncertainty Analyses in Reactor Core Simulations was intended to provide an overview of international research and development with respect to supplementing reactor core simulations with uncertainty and sensitivity analyses, in research institutes as well as within the nuclear industry. The presented analyses not only focused on light water reactors, but also on advanced reactor systems. Particular emphasis was put on international benchmarks in the field. The session was chaired by Winfried Zwermann (Gesellschaft fuer Anlagen- und Reaktorsicherheit).

  19. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the
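
    A small numeric sketch of the point made above about error correlation: when many data are aggregated, independent per-datum errors average down roughly as 1/sqrt(N), whereas a fully correlated (systematic) component does not, so it can dominate large-scale, long-timescale means. The magnitudes below are hypothetical.

        import math

        N = 10_000            # number of data aggregated into a large-area, long-period mean
        u_independent = 0.5   # per-datum random (uncorrelated) standard uncertainty, e.g. in kelvin
        u_correlated = 0.1    # per-datum fully correlated (systematic) standard uncertainty

        u_mean = math.sqrt((u_independent / math.sqrt(N)) ** 2 + u_correlated ** 2)
        print(f"uncertainty of the aggregated mean: {u_mean:.3f}")   # ~0.100, set by the correlated part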

  20. Measurement uncertainty. A practical guide for Secondary Standards Dosimetry Laboratories

    International Nuclear Information System (INIS)

    2008-05-01

    The need for international traceability for radiation dose measurements has been understood since the early nineteen-sixties. The benefits of high dosimetric accuracy were recognized, particularly in radiotherapy, where the outcome of treatments is dependent on the radiation dose delivered to patients. When considering radiation protection dosimetry, the uncertainty may be greater than for therapy, but proper traceability of the measurements is no less important. To ensure harmonization and consistency in radiation measurements, the International Atomic Energy Agency (IAEA) and the World Health Organization (WHO) created a Network of Secondary Standards Dosimetry Laboratories (SSDLs) in 1976. An SSDL is a laboratory that has been designated by the competent national authorities to undertake the duty of providing the necessary link in the traceability chain of radiation dosimetry to the international measurement system (SI, for Systeme International) for radiation metrology users. The role of the SSDLs is crucial in providing traceable calibrations; they disseminate calibrations at specific radiation qualities appropriate for the use of radiation measuring instruments. Historically, although the first SSDLs were established mainly to provide radiotherapy level calibrations, the scope of their work has expanded over the years. Today, many SSDLs provide traceability for radiation protection measurements and diagnostic radiology in addition to radiotherapy. Some SSDLs, with the appropriate facilities and expertise, also conduct quality audits of the clinical use of the calibrated dosimeters - for example, by providing postal dosimeters for dose comparisons for medical institutions or on-site dosimetry audits with an ion chamber and other appropriate equipment. The requirements for traceable and reliable calibrations are becoming more important. For example, for international trade where radiation products are manufactured within strict quality control systems, it is

  1. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop

  2. The influence of perceived uncertainty on entrepreneurial action in emerging renewable energy technology; biomass gasification projects in the Netherlands

    International Nuclear Information System (INIS)

    Meijer, Ineke S.M.; Hekkert, Marko P.; Koppenjan, Joop F.M.

    2007-01-01

    Emerging renewable energy technologies cannot break through without the involvement of entrepreneurs who dare to take action amidst uncertainty. The uncertainties that the entrepreneurs involved perceive will greatly affect their innovation decisions and can prevent them from engaging in innovation projects aimed at developing and implementing emerging renewable energy technologies. This article analyzes how perceived uncertainties and motivation influence an entrepreneur's decision to act, using empirical data on biomass gasification projects in the Netherlands. Our empirical results show that technological, political and resource uncertainty are the most dominant sources of perceived uncertainty influencing entrepreneurial decision-making. By performing a dynamic analysis, we furthermore demonstrate that perceived uncertainties and motivation are not stable, but evolve over time. We identify critical factors in the project's internal and external environment which influence these changes in perceived uncertainties and motivation, and describe how various interactions between the different variables in the conceptual model (internal and external factors, perceived uncertainty, motivation and previous actions of the entrepreneurs) positively or negatively influence the decision of entrepreneurs to continue entrepreneurial action. We discuss how policymakers can use these insights for stimulating the development and diffusion of emerging renewable energy technologies

  3. PC analysis of stochastic differential equations driven by Wiener noise

    KAUST Repository

    Le Maitre, Olivier

    2015-03-01

    A polynomial chaos (PC) analysis with stochastic expansion coefficients is proposed for stochastic differential equations driven by additive or multiplicative Wiener noise. It is shown that for this setting, a Galerkin formalism naturally leads to the definition of a hierarchy of stochastic differential equations governing the evolution of the PC modes. Under the mild assumption that the Wiener and uncertain parameters can be treated as independent random variables, it is also shown that the Galerkin formalism naturally separates parametric uncertainty and stochastic forcing dependences. This enables us to perform an orthogonal decomposition of the process variance, and consequently identify contributions arising from the uncertainty in parameters, the stochastic forcing, and a coupled term. Insight gained from this decomposition is illustrated in light of implementation to simplified linear and non-linear problems; the case of a stochastic bifurcation is also considered.

  4. Update on international uranium and enrichment supply

    International Nuclear Information System (INIS)

    Cleveland, J.M.

    1987-01-01

    Commercial nuclear power generation came upon us in the late 1950s and should have been relatively uneventful due to its similarities to fossil-powered electrical generation. Procurement of nuclear fuel appears to have been treated totally differently from the procurement of fossil fuel, however, and only recently have these practices started to change. The degree of utility reliance on US-mined uranium and US Dept. of Energy (DOE)-produced enrichment services has changed since the 1970s as federal government uncertainty, international fuel market opportunity, and public service commission scrutiny have increased. Accordingly, the uranium and enrichment market has recognized that it is international just like the fossil fuel market. There is now oversupply-driven competition in the international nuclear fuel market. Competition is increasing daily, as third-world countries develop their own nuclear resources. American utilities are now diversifying their fuel supply arrangements, as they do with their oil, coal, and gas supply. The degree of foreign fuel arrangements depends on each utility's risk posture and commitment to long-term contracts. In an era of rising capital, retrofit, operating, and maintenance costs, economical nuclear fuel supply is even more important. This economic advantage, however, may be nullified by congressional and judicial actions limiting uranium importation and access to foreign enrichment. Such artificial trade barriers will only defeat US nuclear generation and the US nuclear fuel industry in the long term

  5. Assessment of the uncertainties in the Radiological Protection Institute of Ireland (RPII) radon measurements service

    Energy Technology Data Exchange (ETDEWEB)

    Hanley, O. [Radiological Protection Institute of Ireland, 3 Clonskeagh Square, Clonskeagh Road, Dublin 14 (Ireland)], E-mail: ohanley@rpii.ie; Gutierrez-Villanueva, J.L. [Laboratorio LIBRA, Edificio I-D, Paseo Belen 3, 47011 Valladolid (Spain); Departamento de Fisica Teorica, Atomica y Optica, Facultad de Ciencias, Paseo Prado de la Magdalena, s/n. 47005 Valladolid (Spain)], E-mail: joselg@libra.uva.es; Currivan, L. [Radiological Protection Institute of Ireland, 3 Clonskeagh Square, Clonskeagh Road, Dublin 14 (Ireland)], E-mail: lcurrivan@rpii.ie; Pollard, D. [Radiological Protection Institute of Ireland, 3 Clonskeagh Square, Clonskeagh Road, Dublin 14 (Ireland)], E-mail: dpollard@rpii.ie

    2008-10-15

    The RPII radon (Rn) laboratory holds accreditation for the International Standard ISO/IEC 17025. A requirement of this standard is an estimate of the uncertainty of measurement. This work shows two approaches to estimate the uncertainty. The bottom-up approach involved identifying the components that were found to contribute to the uncertainty. Estimates were made for each of these components, which were combined to give a combined uncertainty of 13.5% at a Rn concentration of approximately 2500 Bq m^-3 at the 68% confidence level. By applying a coverage factor of k = 2, the expanded uncertainty is ±27% at the 95% confidence level. The top-down approach used information previously gathered from intercomparison exercises to estimate the uncertainty. This investigation found an expanded uncertainty of ±22% at approximately 95% confidence level. This is good agreement for such independent estimates.

  6. Assessment of the uncertainties in the Radiological Protection Institute of Ireland (RPII) radon measurements service.

    Science.gov (United States)

    Hanley, O; Gutiérrez-Villanueva, J L; Currivan, L; Pollard, D

    2008-10-01

    The RPII radon (Rn) laboratory holds accreditation for the International Standard ISO/IEC 17025. A requirement of this standard is an estimate of the uncertainty of measurement. This work shows two approaches to estimate the uncertainty. The bottom-up approach involved identifying the components that were found to contribute to the uncertainty. Estimates were made for each of these components, which were combined to give a combined uncertainty of 13.5% at a Rn concentration of approximately 2500 Bq m(-3) at the 68% confidence level. By applying a coverage factor of k=2, the expanded uncertainty is +/-27% at the 95% confidence level. The top-down approach used information previously gathered from intercomparison exercises to estimate the uncertainty. This investigation found an expanded uncertainty of +/-22% at approximately 95% confidence level. This is good agreement for such independent estimates.

  7. Uncertainty and the de Finetti tables

    OpenAIRE

    Baratgin, Jean; Over, David; Politzer, Guy

    2013-01-01

    International audience; The new paradigm in the psychology of reasoning adopts a Bayesian, or probabilistic, model for studying human reasoning. Contrary to the traditional binary approach based on truth functional logic, with its binary values of truth and falsity, a third value that represents uncertainty can be introduced in the new paradigm. A variety of three-valued truth table systems are available in the formal literature, including one proposed by de Finetti. We examine the descripti...

  8. Mass Spectrometric Calibration of Controlled Fluoroform Leak Rate Devices Technique and Uncertainty Analysis

    CERN Document Server

    Balsley, S D; Laduca, C A

    2003-01-01

    Controlled leak rate devices of fluoroform on the order of 10^-8 atm·cc·sec^-1 at 25 °C are used to calibrate QC-1 War Reserve neutron tube exhaust stations for leak detection sensitivity. Close-out calibration of these tritium-contaminated devices is provided by the Gas Dynamics and Mass Spectrometry Laboratory, Organization 14406, which is a tritium analytical facility. The mass spectrometric technique used for the measurement is discussed, as is the first-principles calculation (pressure, volume, temperature and time). The uncertainty of the measurement is largely driven by contributing factors in the determination of P, V and T. The expanded uncertainty of the leak rate measurement is shown to be 4.42%, with a coverage factor of 3 (k=3).
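
    An illustrative first-principles sketch of a pressure-volume-temperature-time leak-rate calculation of the general kind described, with a simple quadrature combination of the dominant relative uncertainties; the volumes, pressures and uncertainty components are hypothetical and do not reproduce the 4.42% figure reported above.

        import math

        V = 50.0        # accumulation volume, cc (hypothetical)
        dP = 2.0e-6     # pressure rise over the accumulation period, atm (hypothetical)
        dt = 1.0e4      # accumulation time, s
        T = 296.0       # measured gas temperature, K
        T_ref = 298.15  # reference temperature (25 C), K

        Q = (V * dP / dt) * (T_ref / T)   # leak rate referred to 25 C, atm*cc/s (~1e-8)

        # Hypothetical relative standard uncertainties of the dominant inputs
        u_rel = {"pressure": 0.012, "volume": 0.006, "temperature": 0.002}
        u_c = math.sqrt(sum(u ** 2 for u in u_rel.values()))
        print(f"Q = {Q:.2e} atm*cc/s, expanded uncertainty {3 * 100 * u_c:.1f}% (k = 3)")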

  9. Challenges for sustainable resource use : Uncertainty, trade and climate policies

    NARCIS (Netherlands)

    Bretschger, L.; Smulders, Sjak A.

    2012-01-01

    We integrate new challenges to thinking about resource markets and sustainable resource use policies in a general framework. The challenges, emerging from six papers that JEEM publishes in a special issue, are (i) demand uncertainty and stockpiling, (ii) international trade and resource dependence,

  10. Transport barriers in bootstrap-driven tokamaks

    Science.gov (United States)

    Staebler, G. M.; Garofalo, A. M.; Pan, C.; McClenaghan, J.; Van Zeeland, M. A.; Lao, L. L.

    2018-05-01

    Experiments have demonstrated improved energy confinement due to the spontaneous formation of an internal transport barrier in high bootstrap fraction discharges. Gyrokinetic analysis, and quasilinear predictive modeling, demonstrates that the observed transport barrier is caused by the suppression of turbulence primarily from the large Shafranov shift. It is shown that the Shafranov shift can produce a bifurcation to improved confinement in regions of positive magnetic shear or a continuous reduction in transport for weak or negative magnetic shear. Operation at high safety factor lowers the pressure gradient threshold for the Shafranov shift-driven barrier formation. Two self-organized states of the internal and edge transport barrier are observed. It is shown that these two states are controlled by the interaction of the bootstrap current with magnetic shear, and the kinetic ballooning mode instability boundary. Electron-scale energy transport is predicted to be dominant in the inner 60% of the profile. Evidence is presented that energetic particle-driven instabilities could be playing a role in the thermal energy transport in this region.

  11. Dealing with uncertainties in the safety of geological disposal of radioactive waste

    International Nuclear Information System (INIS)

    Devillers, Ch.

    2002-01-01

    Confidence in the safety assessment of a possible project of radioactive waste geological repository will only be obtained if the development of the project is closely guided by transparent safety strategies, acknowledging uncertainties and striving for limiting their effects. This paper highlights some sources of uncertainties, external or internal to the project, which are of particular importance for safety. It suggests safety strategies adapted to the uncertainties considered. The case of a possible repository project in the Callovo-Oxfordian clay layer of the French Bure site is examined from that point of view. The German project at Gorleben and the Swedish KBS-3 project are also briefly examined. (author)

  12. Exploring the implication of climate process uncertainties within the Earth System Framework

    Science.gov (United States)

    Booth, B.; Lambert, F. H.; McNeal, D.; Harris, G.; Sexton, D.; Boulton, C.; Murphy, J.

    2011-12-01

    Uncertainties in the magnitude of future climate change have been a focus of a great deal of research. Much of the work with General Circulation Models has focused on the atmospheric response to changes in atmospheric composition, while other processes remain outside these frameworks. Here we introduce an ensemble of new simulations, based on an Earth System configuration of HadCM3C, designed to explore uncertainties in both physical (atmospheric, oceanic and aerosol physics) and carbon cycle processes, using perturbed parameter approaches previously used to explore atmospheric uncertainty. Framed in the context of the climate response to future changes in emissions, the resultant future projections represent significantly broader uncertainty than existing concentration-driven GCM assessments. The systematic nature of the ensemble design enables interactions between components to be explored. For example, we show how metrics of physical processes (such as climate sensitivity) are also influenced by carbon cycle parameters. The suggestion from this work is that carbon cycle processes represent a contribution to uncertainty in future climate projections comparable to the contributions from the atmospheric feedbacks more conventionally explored. The broad range of climate responses explored within these ensembles, rather than representing a reason for inaction, provides information on lower-likelihood but high-impact changes. For example, while the majority of these simulations suggest that future Amazon forest extent is resilient to the projected climate changes, a small number simulate dramatic forest dieback. This ensemble represents a framework to examine these risks, breaking them down into physical processes (such as ocean temperature drivers of rainfall change) and vegetation processes (where uncertainties point towards requirements for new observational constraints).

  13. Validation and uncertainty quantification of Fuego simulations of calorimeter heating in a wind-driven hydrocarbon pool fire.

    Energy Technology Data Exchange (ETDEWEB)

    Domino, Stefan Paul; Figueroa, Victor G.; Romero, Vicente Jose; Glaze, David Jason; Sherman, Martin P.; Luketa-Hanlin, Anay Josephine

    2009-12-01

    The objective of this work is to perform an uncertainty quantification (UQ) and model validation analysis of simulations of tests in the cross-wind test facility (XTF) at Sandia National Laboratories. In these tests, a calorimeter was subjected to a fire and the thermal response was measured via thermocouples. The UQ and validation analysis pertains to the experimental and predicted thermal response of the calorimeter. The calculations were performed using Sierra/Fuego/Syrinx/Calore, an Advanced Simulation and Computing (ASC) code capable of predicting object thermal response to a fire environment. Based on the validation results at eight diversely representative TC locations on the calorimeter, the predicted calorimeter temperatures effectively bound the experimental temperatures. This post-validates Sandia's first integrated use of fire modeling with thermal response modeling and associated uncertainty estimates in an abnormal-thermal QMU analysis.

  14. Uncertainty Estimation of Neutron Activation Analysis in Zinc Elemental Determination in Food Samples

    International Nuclear Information System (INIS)

    Endah Damastuti; Muhayatun; Diah Dwiana L

    2009-01-01

    Besides fulfilling the requirements of the international standard ISO/IEC 17025:2005, uncertainty estimation should be carried out to increase the quality of and confidence in analysis results, and to establish the traceability of the analysis results to SI units. Neutron activation analysis is a major technique used by the Radiometry Technique Analysis Laboratory and is included in the scope of accreditation under ISO/IEC 17025:2005; therefore, uncertainty estimation of neutron activation analysis needs to be carried out. Sample and standard preparation, as well as irradiation and measurement using gamma spectrometry, were the main activities that could contribute to the uncertainty. The components of the uncertainty sources are explained in detail. The expanded uncertainty was 4.0 mg/kg at a 95% level of confidence (coverage factor k=2), for a Zn concentration of 25.1 mg/kg. The counting statistics of the sample and the standard were the major contributions to the combined uncertainty. The uncertainty estimation is expected to increase the quality of the analysis results and could be applied further to other kinds of samples. (author)
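
    As a back-of-the-envelope companion to the figures reported above, the sketch below shows how an expanded uncertainty of this size is typically assembled: relative standard uncertainties are combined in quadrature and multiplied by the coverage factor. The component values are hypothetical, not the laboratory's actual budget.

    ```python
    import math

    # Hypothetical relative standard uncertainties (fractions); the record states
    # that the counting statistics of the sample and of the standard dominate.
    components = {
        "counting_sample":      0.055,
        "counting_standard":    0.045,
        "sample_preparation":   0.020,
        "standard_preparation": 0.020,
        "gamma_efficiency":     0.015,
    }

    u_c_rel = math.sqrt(sum(u ** 2 for u in components.values()))  # combined relative uncertainty
    concentration = 25.1   # mg/kg, from the record
    k = 2                  # coverage factor for ~95 % confidence

    print(f"combined relative uncertainty ~ {u_c_rel * 100:.1f} %")
    print(f"expanded uncertainty (k=2)    ~ {k * u_c_rel * concentration:.1f} mg/kg")  # ~4 mg/kg here
    ```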

  15. Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety

    Science.gov (United States)

    Hirsh, Jacob B.; Mar, Raymond A.; Peterson, Jordan B.

    2012-01-01

    Entropy, a concept derived from thermodynamics and information theory, describes the amount of uncertainty and disorder within a system. Self-organizing systems engage in a continual dialogue with the environment and must adapt themselves to changing circumstances to keep internal entropy at a manageable level. We propose the entropy model of…

  16. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty, and which factors have been considered in the vendor's assignment of uncertainty, is critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
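
    The factors listed above can be tied together with the simple measurement equation for a gravimetrically prepared solution, c = P·m/V. The sketch below, with invented values, propagates the purity, mass and solvent-addition uncertainties into the relative uncertainty of the certified concentration; it is illustrative only and not any vendor's actual calculation.

    ```python
    import math

    # Invented example: neat material of purity P (mass fraction) and mass m
    # dissolved to a final volume V (derived from solvent mass and density).
    purity, u_purity = 0.998, 0.0015      # mass fraction; water/solvent/inorganic content
    mass_mg, u_mass = 10.00, 0.02         # mg; balance calibration and weighing technique
    volume_mL, u_volume = 10.00, 0.012    # mL; solvent addition via solution density

    conc = purity * mass_mg / volume_mL   # certified concentration, mg/mL

    # For a product/quotient model, relative standard uncertainties add in quadrature.
    rel_u = math.sqrt((u_purity / purity) ** 2 +
                      (u_mass / mass_mg) ** 2 +
                      (u_volume / volume_mL) ** 2)

    print(f"certified concentration ~ {conc:.3f} mg/mL")
    print(f"relative uncertainty ~ {rel_u * 100:.2f} %, expanded (k=2) ~ {2 * rel_u * 100:.2f} %")
    ```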

  17. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous and different interpretations are used in the literature. Recently a renewed interest has appeared in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs

  18. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose and will be described briefly in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.
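
    The last point, taking uncertainty into account in compliance assessment, can be made concrete with a simple guard-band rule; the thresholds below are invented and the rule is only one of the possibilities the chapter alludes to.

    ```python
    def compliance_decision(result, expanded_uncertainty, limit):
        """Illustrative guard-banded rule: 'compliant' only if the whole expanded-
        uncertainty interval lies below the limit, 'non-compliant' only if it lies
        above, otherwise the result is inconclusive at the stated confidence level."""
        if result + expanded_uncertainty <= limit:
            return "compliant"
        if result - expanded_uncertainty > limit:
            return "non-compliant"
        return "inconclusive (limit falls inside the uncertainty interval)"

    # Hypothetical example: a contaminant measured against a 50 mg/L specification limit.
    for value in (46.0, 49.0, 54.0):
        print(value, "->", compliance_decision(value, expanded_uncertainty=3.0, limit=50.0))
    ```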

  19. Studying the effect of clinical uncertainty on physicians' decision-making using ILIAD.

    Science.gov (United States)

    Anderson, J D; Jay, S J; Weng, H C; Anderson, M M

    1995-01-01

    The influence of uncertainty on physicians' practice behavior is not well understood. In this research, ILIAD, a diagnostic expert system, has been used to study physicians' responses to uncertainty and how their responses affected clinical performance. The simulation mode of ILIAD was used to standardize the presentation and scoring of two cases to 46 residents in emergency medicine, internal medicine, family practice and transitional medicine at Methodist Hospital of Indiana. A questionnaire was used to collect additional data on how physicians respond to clinical uncertainty. A structural equation model was developed, estimated, and tested. The results indicate that stress that physicians experience in dealing with clinical uncertainty has a negative effect on their clinical performance. Moreover, the way that physicians respond to uncertainty has positive and negative effects on their performance. Open discussions with patients about clinical decisions and the use of practice guidelines improves performance. However, when the physician's clinical decisions are influenced by patient demands or their peers, their performance scores decline.

  20. Can agent based models effectively reduce fisheries management implementation uncertainty?

    Science.gov (United States)

    Drexler, M.

    2016-02-01

    Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single-species and ecosystem-based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent-based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro-scale behaviors of the US West Coast groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent-based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations to properly consider implementation uncertainty in fisheries management.
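
    The toy sketch below is not the groundfish model described above; it only illustrates the basic ABM idea of heterogeneous vessel agents whose individual cost structures produce a fleet-level response to a policy lever (here, a quota lease price).

    ```python
    import random

    random.seed(1)

    class Vessel:
        """Toy agent: fishes on a given day only if expected revenue exceeds
        its own trip cost plus the cost of leasing quota for the expected catch."""
        def __init__(self, trip_cost, catch_rate):
            self.trip_cost = trip_cost      # $ per trip, heterogeneous across the fleet
            self.catch_rate = catch_rate    # tonnes per trip

        def fishes(self, fish_price, quota_price):
            expected_revenue = self.catch_rate * fish_price
            expected_cost = self.trip_cost + self.catch_rate * quota_price
            return expected_revenue > expected_cost

    fleet = [Vessel(trip_cost=random.uniform(500, 2000),
                    catch_rate=random.uniform(1.0, 4.0)) for _ in range(100)]

    for quota_price in (0, 200, 400, 600):  # $/tonne, a simple policy lever
        active = sum(v.fishes(fish_price=800, quota_price=quota_price) for v in fleet)
        print(f"quota price {quota_price:>3} $/t -> {active} of {len(fleet)} vessels fish")
    ```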

  1. Uncertainty in prostate cancer. Ethnic and family patterns.

    Science.gov (United States)

    Germino, B B; Mishel, M H; Belyea, M; Harris, L; Ware, A; Mohler, J

    1998-01-01

    Prostate cancer occurs 37% more often in African-American men than in white men. Patients and their family care providers (FCPs) may have different experiences of cancer and its treatment. This report addresses two questions: 1) What is the relationship of uncertainty to family coping, psychological adjustment to illness, and spiritual factors? and 2) Are these patterns of relationship similar for patients and their family care givers and for whites and African-Americans? A sample of white and African-American men and their family care givers (N = 403) was drawn from an ongoing study, testing the efficacy of an uncertainty management intervention with men with stage B prostate cancer. Data were collected at study entry, either 1 week after post-surgical catheter removal or at the beginning of primary radiation treatment. Measures of uncertainty, adult role behavior, problem solving, social support, importance of God in one's life, family coping, psychological adjustment to illness, and perceptions of health and illness met standard criteria for internal consistency. Analyses of baseline data using Pearson's product moment correlations were conducted to examine the relationships of person, disease, and contextual factors to uncertainty. For family coping, uncertainty was significantly and positively related to two domains in white family care providers only. In African-American and white family care providers, the more uncertainty experienced, the less positive they felt about treatment. Uncertainty for all care givers was related inversely to positive feelings about the patient recovering from the illness. For all patients and for white family members, uncertainty was related inversely to the quality of the domestic environment. For everyone, uncertainty was related inversely to psychological distress. Higher levels of uncertainty were related to a poorer social environment for African-American patients and for white family members. For white patients and their

  2. Expert system driven fuzzy control application to power reactors

    International Nuclear Information System (INIS)

    Tsoukalas, L.H.; Berkan, R.C.; Upadhyaya, B.R.; Uhrig, R.E.

    1990-01-01

    For the purpose of nonlinear control and uncertainty/imprecision handling, fuzzy controllers have recently reached acclaim and increasing commercial application. The fuzzy control algorithms often require a 'supervisory' routine that provides the necessary heuristics for interface, adaptation, mode selection and other implementation issues. Performance characteristics of an on-line fuzzy controller depend strictly on the ability of such supervisory routines to manipulate the fuzzy control algorithm and enhance its control capabilities. This paper describes an expert system driven fuzzy control design application to nuclear reactor control, for the automated start-up control of the Experimental Breeder Reactor-II. The methodology is verified through computer simulations using a valid nonlinear model. The heuristic decisions that are vitally important for the implementation of fuzzy control in the actual plant are identified. An expert system structure incorporating the necessary supervisory routines is discussed. The discussion also includes the possibility of synthesizing fuzzy, exact and combined reasoning to include both inexact concepts, uncertainty and fuzziness, within the same environment.
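
    For readers unfamiliar with the fuzzy control algorithms the record refers to, the sketch below shows a minimal rule-based fuzzy controller with triangular membership functions and weighted-average defuzzification; it is generic and unrelated to the EBR-II design or its supervisory expert system.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function that peaks at b and is zero outside [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_controller(error):
        """Three illustrative rules combined by weighted-average (Sugeno-style)
        defuzzification; the rule consequents are normalised actuator commands."""
        rules = [
            (tri(error, -10.0, -5.0, 0.0), -1.0),  # error strongly negative -> decrease output
            (tri(error,  -5.0,  0.0, 5.0),  0.0),  # error near zero         -> hold
            (tri(error,   0.0,  5.0, 10.0), 1.0),  # error strongly positive -> increase output
        ]
        total_weight = sum(w for w, _ in rules)
        return sum(w * out for w, out in rules) / total_weight if total_weight else 0.0

    for e in (-6.0, -2.0, 0.0, 3.0, 7.0):
        print(f"error {e:+.1f} -> command {fuzzy_controller(e):+.2f}")
    ```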

  3. The MDE Diploma: First International Postgraduate Specialization in Model-Driven Engineering

    Science.gov (United States)

    Cabot, Jordi; Tisi, Massimo

    2011-01-01

    Model-Driven Engineering (MDE) is changing the way we build, operate, and maintain our software-intensive systems. Several projects using MDE practices are reporting significant improvements in quality and performance but, to be able to handle these projects, software engineers need a set of technical and interpersonal skills that are currently…

  4. Optimizing integrated airport surface and terminal airspace operations under uncertainty

    Science.gov (United States)

    Bosson, Christabelle S.

    In airports and surrounding terminal airspaces, the integration of surface, arrival and departure scheduling and routing has the potential to improve operational efficiency. Moreover, because both the airport surface and the terminal airspace are often altered by random perturbations, the consideration of uncertainty in flight schedules is crucial to improve the design of robust flight schedules. Previous research mainly focused on independently solving arrival scheduling problems, departure scheduling problems and surface management scheduling problems and most of the developed models are deterministic. This dissertation presents an alternate method to model the integrated operations by using a machine job-shop scheduling formulation. A multistage stochastic programming approach is chosen to formulate the problem in the presence of uncertainty and candidate solutions are obtained by solving sample average approximation problems with finite sample size. The developed mixed-integer-linear-programming algorithm-based scheduler is capable of computing optimal aircraft schedules and routings that reflect the integration of air and ground operations. The assembled methodology is applied to a Los Angeles case study. To show the benefits of integrated operations over First-Come-First-Served, a preliminary proof-of-concept is conducted for a set of fourteen aircraft evolving under deterministic conditions in a model of the Los Angeles International Airport surface and surrounding terminal areas. Using historical data, a representative 30-minute traffic schedule and aircraft mix scenario is constructed. The results of the Los Angeles application show that the integration of air and ground operations and the use of a time-based separation strategy enable both significant surface and air time savings. The solution computed by the optimization provides a more efficient routing and scheduling than the First-Come-First-Served solution. Additionally, a data driven analysis is

  5. Risk, unexpected uncertainty, and estimation uncertainty: Bayesian learning in unstable settings.

    Directory of Open Access Journals (Sweden)

    Elise Payzan-LeNestour

    Recently, evidence has emerged that humans approach learning using Bayesian updating rather than (model-free) reinforcement algorithms in a six-arm restless bandit problem. Here, we investigate what this implies for human appreciation of uncertainty. In our task, a Bayesian learner distinguishes three equally salient levels of uncertainty. First, the Bayesian perceives irreducible uncertainty or risk: even knowing the payoff probabilities of a given arm, the outcome remains uncertain. Second, there is (parameter) estimation uncertainty or ambiguity: payoff probabilities are unknown and need to be estimated. Third, the outcome probabilities of the arms change: the sudden jumps are referred to as unexpected uncertainty. We document how the three levels of uncertainty evolved during the course of our experiment and how they affected the learning rate. We then zoom in on estimation uncertainty, which has been suggested to be a driving force in exploration, in spite of evidence of widespread aversion to ambiguity. Our data corroborate the latter. We discuss neural evidence that foreshadowed the ability of humans to distinguish between the three levels of uncertainty. Finally, we investigate the boundaries of human capacity to implement Bayesian learning. We repeat the experiment with different instructions, reflecting varying levels of structural uncertainty. Under this fourth notion of uncertainty, choices were no better explained by Bayesian updating than by (model-free) reinforcement learning. Exit questionnaires revealed that participants remained unaware of the presence of unexpected uncertainty and failed to acquire the right model with which to implement Bayesian updating.
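
    As a hedged illustration of the distinction drawn above, the sketch contrasts a Bayesian Beta-Bernoulli learner (whose posterior width expresses estimation uncertainty, with exponential forgetting standing in for an allowance for unexpected uncertainty) against a model-free learner that tracks only a point estimate. It is not the authors' six-arm task or their model.

    ```python
    import random

    random.seed(0)

    alpha, beta = 1.0, 1.0      # Beta prior: its width captures estimation uncertainty
    q = 0.5                     # model-free point estimate, no uncertainty representation
    forgetting, lr = 0.97, 0.1  # forgetting ~ crude allowance for unexpected uncertainty

    p_true = 0.8
    for t in range(400):
        if t == 200:
            p_true = 0.2                      # unexpected jump in the arm's payoff probability
        reward = 1 if random.random() < p_true else 0

        # Bayesian update with forgetting: old evidence decays, so the posterior can re-widen.
        alpha = forgetting * alpha + reward
        beta = forgetting * beta + (1 - reward)

        # Model-free update with a fixed learning rate.
        q += lr * (reward - q)

    mean = alpha / (alpha + beta)
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    print(f"Bayesian estimate {mean:.2f} (posterior sd {var ** 0.5:.2f}), model-free estimate {q:.2f}")
    ```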

  6. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    Science.gov (United States)

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty, with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were below the midpoints. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  7. Understanding Climate Uncertainty with an Ocean Focus

    Science.gov (United States)

    Tokmakian, R. T.

    2009-12-01

    Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation, and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and methods to quantify the reliability of a simulation. If quantitative uncertainty questions of the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007]. The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in

  8. An ensemble approach to assess hydrological models' contribution to uncertainties in the analysis of climate change impact on water resources

    Science.gov (United States)

    Velázquez, J. A.; Schmid, J.; Ricard, S.; Muerth, M. J.; Gauvin St-Denis, B.; Minville, M.; Chaumont, D.; Caya, D.; Ludwig, R.; Turcotte, R.

    2012-06-01

    Over recent years, several research efforts have investigated the impact of climate change on water resources for different regions of the world. The projection of future river flows is affected by different sources of uncertainty in the hydro-climatic modelling chain. One of the aims of the QBic3 project (Québec-Bavarian International Collaboration on Climate Change) is to assess the contribution to uncertainty of hydrological models by using an ensemble of hydrological models presenting a diversity of structural complexity (i.e. lumped, semi-distributed and distributed models). The study investigates two humid, mid-latitude catchments with natural flow conditions; one located in Southern Québec (Canada) and one in Southern Bavaria (Germany). Daily flow is simulated with four different hydrological models, forced by outputs from regional climate models driven by a given number of GCM members over a reference (1971-2000) and a future (2041-2070) period. The results show that the choice of the hydrological model does strongly affect the climate change response of selected hydrological indicators, especially those related to low flows. Indicators related to high flows seem less sensitive to the choice of the hydrological model. Therefore, the computationally less demanding models (usually simple, lumped and conceptual) give a significant level of trust for high and overall mean flows.

  9. MECCA coordinated research program: analysis of climate models uncertainties used for climatic changes study

    International Nuclear Information System (INIS)

    Caneill, J.Y.; Hakkarinen, C.

    1992-01-01

    An international consortium called MECCA (Model Evaluation Consortium for Climate Assessment) was created in 1991 by different partners, including electric utilities, government and academic groups, to make available to the international scientific community a super-computer facility for climate evolution studies. The first phase of the program consists of assessing the uncertainties of climate model simulations in the framework of global climate change studies. Fourteen scientific projects have been accepted on an international basis in this first phase. The second phase of the program will consist in the evaluation of a set of long climate simulations realized with coupled ocean/atmosphere models, in order to study the transient aspects of climate changes and the associated uncertainties. Particular attention will be devoted to the consequences of these assessments for climate impact studies, and to the regional aspects of climate changes.

  10. MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy

    International Nuclear Information System (INIS)

    Van Dyk, J; Palta, J; Bortfeld, T; Mijnheer, B

    2014-01-01

    Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “how do we implement this in clinical practice”? AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display

  11. MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Van Dyk, J; Palta, J; Bortfeld, T; Mijnheer, B [Western University, London, ON (United Kingdom)

    2014-06-15

    Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “how do we implement this in clinical practice”? AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display.

  12. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Emery, Keith [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.

  13. Deep uncertainty and broad heterogeneity in country-level social cost of carbon

    Science.gov (United States)

    Ricke, K.; Drouet, L.; Caldeira, K.; Tavoni, M.

    2017-12-01

    The social cost of carbon (SCC) is a commonly employed metric of the expected economic damages from carbon dioxide (CO2) emissions. Recent estimates of the SCC range from approximately $10 per tonne of CO2 to as much as $1000/tCO2, but these have been computed at the global level. While useful in an optimal policy context, a world-level approach obscures the heterogeneous geography of climate damages and vast differences in country-level contributions to the global SCC, as well as climate and socio-economic uncertainties, which are much larger at the regional level. For the first time, we estimate country-level contributions to the SCC using recent climate and carbon-cycle model projections, empirical climate-driven economic damage estimations, and information from the Shared Socio-economic Pathways. Central specifications show high global SCC values (median: $417/tCO2, 66% confidence interval: $168-793/tCO2), with country-level contributions ranging from -$11 (-$8 to -$14)/tCO2 to $86 ($50 to $158)/tCO2. We quantify climate-, scenario- and economic damage-driven uncertainties associated with the calculated values of the SCC. We find that while the magnitude of the country-level social cost of carbon is highly uncertain, the relative positioning among countries is consistent. Countries incurring large fractions of the global cost include India, China, and the United States. The share of the SCC distributed among countries is robust, indicating climate change winners and losers from a geopolitical perspective.

  14. Climate change scenarios of heat waves in Central Europe and their uncertainties

    Science.gov (United States)

    Lhotka, Ondřej; Kyselý, Jan; Farda, Aleš

    2018-02-01

    The study examines climate change scenarios of Central European heat waves with a focus on related uncertainties in a large ensemble of regional climate model (RCM) simulations from the EURO-CORDEX and ENSEMBLES projects. Historical runs (1970-1999) driven by global climate models (GCMs) are evaluated against the E-OBS gridded data set in the first step. Although the RCMs are found to reproduce the frequency of heat waves quite well, those RCMs with the coarser grid (25 and 50 km) considerably overestimate the frequency of severe heat waves. This deficiency is improved in higher-resolution (12.5 km) EURO-CORDEX RCMs. In the near future (2020-2049), heat waves are projected to be nearly twice as frequent in comparison to the modelled historical period, and the increase is even larger for severe heat waves. Uncertainty originates mainly from the selection of RCMs and GCMs because the increase is similar for all concentration scenarios. For the late twenty-first century (2070-2099), a substantial increase in heat wave frequencies is projected, the magnitude of which depends mainly upon concentration scenario. Three to four heat waves per summer are projected in this period (compared to less than one in the recent climate), and severe heat waves are likely to become a regular phenomenon. This increment is primarily driven by a positive shift of temperature distribution, but changes in its scale and enhanced temporal autocorrelation of temperature also contribute to the projected increase in heat wave frequencies.

  15. Information system design for demand-driven supply networks

    OpenAIRE

    Selk, Bernhard

    2004-01-01

    Information system design for demand-driven supply networks : integrating CRM & SCM / B. Selk, K. Turowski, C. Winnewisser. - In: EIS : Fourth International ICSC Symposium on Engineering of Intelligent Systems, EIS 2004. [Electronic resource]. - Millet, Alberta : ICSC Interdisciplinary Research Canada, 2004. - 8 pp. on CD-ROM

  16. The Relationship of Cultural Similarity, Communication Effectiveness and Uncertainty Reduction.

    Science.gov (United States)

    Koester, Jolene; Olebe, Margaret

    To investigate the relationship of cultural similarity/dissimilarity, communication effectiveness, and communication variables associated with uncertainty reduction theory, a study examined two groups of students--a multinational group living on an "international floor" in a dormitory at a state university and an unrelated group of U.S.…

  17. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty

  18. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was carried out for two data-rich catchments, the 50 km² Mahurangi catchment in New Zealand and the 135 km² Brue catchment in the UK. For rainfall data, the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data, the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
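
    A minimal sketch of the Monte Carlo idea described above (with invented error magnitudes rather than the Mahurangi or Brue values): data errors are sampled repeatedly and the signature, here the runoff ratio, is recomputed for each realisation to yield an uncertainty distribution.

    ```python
    import random
    import statistics

    random.seed(42)

    # Synthetic daily series standing in for observed records (mm/day over the catchment).
    days = 365
    rain = [max(0.0, random.gauss(4.0, 6.0)) for _ in range(days)]
    flow = [0.45 * r + random.uniform(0.0, 0.5) for r in rain]

    def runoff_ratio(rain_series, flow_series):
        return sum(flow_series) / sum(rain_series)

    samples = []
    for _ in range(2000):
        # Hypothetical multiplicative data errors: +/-5 % on areal rainfall,
        # +/-8 % on discharge from the rating-curve approximation.
        rain_mult = random.gauss(1.0, 0.05)
        flow_mult = random.gauss(1.0, 0.08)
        samples.append(runoff_ratio([r * rain_mult for r in rain],
                                    [q * flow_mult for q in flow]))

    samples.sort()
    lo, hi = samples[int(0.025 * len(samples))], samples[int(0.975 * len(samples))]
    print(f"runoff ratio {statistics.median(samples):.2f}, 95 % interval [{lo:.2f}, {hi:.2f}]")
    ```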

  19. Effect of activation cross section uncertainties in transmutation analysis of realistic low-activation steels for IFMIF

    Energy Technology Data Exchange (ETDEWEB)

    Cabellos, O.; García-Herranz, N.; Sanz, J. [Institute of Nuclear Fusion, UPM, Madrid (Spain); Cabellos, O.; García-Herranz, N.; Fernandez, P.; Fernandez, B. [Dept. of Nuclear Engineering, UPM, Madrid (Spain); Sanz, J. [Dept. of Power Engineering, UNED, Madrid (Spain); Reyes, S. [Safety, Environment and Health Group, ITER Joint Work Site, Cadarache Center (France)

    2008-07-01

    We address an uncertainty analysis to draw conclusions on the reliability of the activation calculations for the International Fusion Materials Irradiation Facility (IFMIF) under the potential impact of activation cross section uncertainties. The Monte Carlo methodology implemented in the ACAB code gives uncertainty estimates due to the synergetic/global effect of the complete set of cross section uncertainties. An element-by-element analysis has been demonstrated to be a helpful tool for easily analysing the transmutation performance of irradiated materials. The uncertainty analysis results showed that for times over about 24 h the relative error in the contact dose rate can be as large as 23 per cent. We have calculated the effect of cross section uncertainties in the IFMIF activation of all the different elements. For EUROFER, the uncertainties in H and He are 7.3% and 5.6%, respectively. We have found significant uncertainties in the transmutation response for C, P and Nb.

  20. Uncertainty and sensitivity analysis on probabilistic safety assessment of an experimental facility

    International Nuclear Information System (INIS)

    Burgazzi, L.

    2000-01-01

    The aim of this work is to perform an uncertainty and sensitivity analysis on the probabilistic safety assessment of the International Fusion Materials Irradiation Facility (IFMIF), in order to assess the effect on the final risk values of the uncertainties associated with the generic data used for the initiating events and component reliability, and to identify the key quantities contributing to this uncertainty. The analysis is conducted on the expected frequency calculated for the accident sequences, defined through the event tree (ET) modeling. This is done to add credibility to the ET model quantification, to calculate frequency distributions for the occurrence of events and, consequently, to assess whether sequences have been correctly selected from the probability standpoint, and finally to verify the fulfillment of the safety conditions. Uncertainty and sensitivity analyses are performed using Monte Carlo sampling and an importance parameter technique, respectively. (author)
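
    The sketch below illustrates, with invented data, how such a Monte Carlo uncertainty propagation is commonly set up for one accident sequence: lognormal distributions for the initiating-event frequency and two mitigating-system failure probabilities are sampled and multiplied. It does not reproduce the IFMIF event trees or data.

    ```python
    import math
    import random
    import statistics

    random.seed(7)

    def lognormal(median, error_factor):
        """Sample from a lognormal given its median and a 95th/50th percentile
        error factor, a parameterisation commonly used for PSA data (illustrative)."""
        sigma = math.log(error_factor) / 1.645
        return random.lognormvariate(math.log(median), sigma)

    N = 10000
    sequence_freq = []
    for _ in range(N):
        ie = lognormal(1e-2, 3.0)    # initiating event frequency, per year (invented)
        f1 = lognormal(1e-2, 5.0)    # failure probability of the first mitigating system
        f2 = lognormal(5e-3, 10.0)   # failure probability of the second mitigating system
        sequence_freq.append(ie * f1 * f2)

    sequence_freq.sort()
    print(f"median {statistics.median(sequence_freq):.2e} /yr, "
          f"95th percentile {sequence_freq[int(0.95 * N)]:.2e} /yr")
    ```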

  1. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  2. Economic Uncertainty, Parental Selection and the Criminal Activity of the "Children of the Wall." CEP Discussion Paper No. 1256

    Science.gov (United States)

    Chevalier, Arnaud; Marie, Olivier

    2014-01-01

    We study the link between parental selection and children's criminality in a new context. After the fall of the Berlin Wall, East Germany experienced an unprecedented temporary drop in fertility driven by economic uncertainty. We exploit this natural experiment to estimate that the children from these (smaller) cohorts are 40 percent more likely to…

  3. Impulsive control for permanent magnet synchronous motors with uncertainties: LMI approach

    International Nuclear Information System (INIS)

    Dong, Li; Shi-Long, Wang; Xiao-Hong, Zhang; Dan, Yang

    2010-01-01

    A permanent magnet synchronous motor (PMSM) may have chaotic behaviours under certain working conditions, especially for uncertain values of parameters, which threatens the security and stability of motor-driven operation. Hence, it is important to study methods of controlling or suppressing chaos in PMSMs. In this paper, the stability of a PMSM with parameter uncertainties is investigated. After uncertain matrices which represent the variable system parameters are formulated through matrix analysis, a novel asymptotic stability criterion is established by employing the method of Lyapunov functions and linear matrix inequality technology. An example is also given to illustrate the effectiveness of our results. (general)
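
    The record does not reproduce its stability criterion; for orientation only, the block below sketches the generic LMI-type sufficient condition for a linear system with norm-bounded parameter uncertainty, which is the standard form such criteria take. It is not the paper's exact result.

    ```latex
    % Uncertain system: \dot{x} = (A + D F(t) E)\,x with F(t)^{\mathsf T} F(t) \preceq I.
    % A sufficient condition for asymptotic stability is the existence of P \succ 0
    % and \varepsilon > 0 such that
    \[
    A^{\mathsf T}P + PA + \varepsilon\, P D D^{\mathsf T} P
      + \varepsilon^{-1} E^{\mathsf T} E \prec 0 ,
    \]
    % which, by a Schur complement (with \mu = \varepsilon^{-1}), is the LMI
    \[
    \begin{bmatrix}
    A^{\mathsf T}P + PA + \mu\, E^{\mathsf T} E & P D \\
    D^{\mathsf T} P & -\mu I
    \end{bmatrix} \prec 0 .
    \]
    ```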

  4. Delphi-RAND consensus of the Spanish Society of Internal Medicine on the controversies in anticoagulant therapy and prophylaxis in medical diseases. INTROMBIN Project (Uncertainty in thromboprophylaxis in internal medicine).

    Science.gov (United States)

    Ruiz-Ruiz, F; Medrano, F J; Navarro-Puerto, M A; Rodríguez-Torres, P; Romero-Alonso, A; Santos-Lozano, J M; Alonso-Ortiz Del Rio, C; Varela-Aguilar, J M; Calderón, E J; Marín-León, I

    2018-05-21

    The aim of this study was to determine the opinion of internists on the management of anticoagulation and thromboembolism prophylaxis in complex clinical scenarios in which the risk-benefit ratio of surgery is narrow, and to develop a consensus document on the use of anticoagulant drugs in this patient group. To this end, we identified by consensus the clinical areas of greatest uncertainty, created a survey with 20 scenarios laid out in 40 clinical questions, and reviewed the specific literature. The survey was distributed among the internists of the Spanish Society of Internal Medicine (SEMI) and was completed by 290 of its members. The consensus process was implemented by adapting the Delphi-RAND appropriateness method in an anonymous, double-round process that enabled an expert panel to identify the areas of agreement and uncertainty. In our case, we also added the survey results to the panel, a methodological innovation that helps provide additional information on standard clinical practice. The result of the process is a set of 19 recommendations formulated by SEMI experts, which helps establish guidelines for action on anticoagulant therapy in complex scenarios (high risk or active haemorrhage, short life expectancy, coexistence of antiplatelet therapy or comorbidities such as kidney disease and liver disease), which are not uncommon in standard clinical practice. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  5. Uncertainty in the spatial distribution of tropical forest biomass: a comparison of pan-tropical maps

    OpenAIRE

    Mitchard, Edward TA; Saatchi, Sassan S; Baccini, Alessandro; Asner, Gregory P; Goetz, Scott J; Harris, Nancy L; Brown, Sandra

    2013-01-01

    Background: Mapping the aboveground biomass of tropical forests is essential both for implementing conservation policy and reducing uncertainties in the global carbon cycle. Two medium resolution (500 m – 1000 m) pantropical maps of vegetation biomass have been recently published, and have been widely used by sub-national and national-level activities in relation to Reducing Emissions from Deforestation and forest Degradation (REDD+). Both maps use similar input data layers, and are driven by t...

  6. Application of probabilistic modelling for the uncertainty evaluation of alignment measurements of large accelerator magnets assemblies

    Science.gov (United States)

    Doytchinov, I.; Tonnellier, X.; Shore, P.; Nicquevert, B.; Modena, M.; Mainaud Durand, H.

    2018-05-01

    Micrometric assembly and alignment requirements for future particle accelerators, and especially large assemblies, create the need for accurate uncertainty budgeting of alignment measurements. Measurements and uncertainties have to be accurately stated and traceable to international standards, for metre-long assemblies, in the range of tens of µm. Indeed, these hundreds of assemblies will be produced and measured by several suppliers around the world, and will have to be integrated into a single machine. As part of the PACMAN project at CERN, we proposed and studied a practical application of probabilistic modelling of task-specific alignment uncertainty by applying a simulation by constraints calibration method. Using this method, we calibrated our measurement model using available data from ISO standardised tests (10360 series) for the metrology equipment. We combined this model with reference measurements and analysis of the measured data to quantify the actual specific uncertainty of each alignment measurement procedure. Our methodology was successfully validated against a calibrated and traceable 3D artefact as part of an international inter-laboratory study. The validated models were used to study the expected alignment uncertainty and important sensitivity factors in measuring the shortest and longest of the compact linear collider study assemblies, 0.54 m and 2.1 m respectively. In both cases, the laboratory alignment uncertainty was within the targeted uncertainty budget of 12 µm (68% confidence level). It was found that the remaining uncertainty budget for any additional alignment error compensations, such as the thermal drift error due to variation in machine operation heat load conditions, must be within 8.9 µm and 9.8 µm (68% confidence level) respectively.

  7. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  9. Dataset-driven research for improving recommender systems for learning

    NARCIS (Netherlands)

    Verbert, Katrien; Drachsler, Hendrik; Manouselis, Nikos; Wolpers, Martin; Vuorikari, Riina; Duval, Erik

    2011-01-01

    Verbert, K., Drachsler, H., Manouselis, N., Wolpers, M., Vuorikari, R., & Duval, E. (2011). Dataset-driven research for improving recommender systems for learning. In Ph. Long, & G. Siemens (Eds.), Proceedings of 1st International Conference Learning Analytics & Knowledge (pp. 44-53). February,

  10. Learning together, growing apart. Global warming, energy policy and international trust

    International Nuclear Information System (INIS)

    Kydd, Andrew H.

    2010-01-01

    Standard models of uncertainty in economics imply that sharing information can reduce uncertainty and help identify welfare improving policies. In international relations, 'epistemic communities' of scientists are thought to help provide information for these purposes. However, conflicting preferences can frustrate the transmission of information and prevent effective information sharing. In addition, opportunities for information sharing can deepen distrust as actors observe each other's reaction to what to them is credible information. A model that assumes uncertainty both about the state of the world and the parties' motivations is applied to international climate change negotiations. (author)

  11. Isotopic effects in the neon fixed point: uncertainty of the calibration data correction

    Science.gov (United States)

    Steur, Peter P. M.; Pavese, Franco; Fellmuth, Bernd; Hermier, Yves; Hill, Kenneth D.; Seog Kim, Jin; Lipinski, Leszek; Nagao, Keisuke; Nakano, Tohru; Peruzzi, Andrea; Sparasci, Fernando; Szmyrka-Grzebyk, Anna; Tamura, Osamu; Tew, Weston L.; Valkiers, Staf; van Geel, Jan

    2015-02-01

    The neon triple point is one of the defining fixed points of the International Temperature Scale of 1990 (ITS-90). Although recognizing that natural neon is a mixture of isotopes, the ITS-90 definition only states that the neon should be of ‘natural isotopic composition’, without any further requirements. A preliminary study in 2005 indicated that most of the observed variability in the realized neon triple point temperatures within a range of about 0.5 mK can be attributed to the variability in isotopic composition among different samples of ‘natural’ neon. Based on the results of an International Project (EUROMET Project No. 770), the Consultative Committee for Thermometry decided to improve the realization of the neon fixed point by assigning the ITS-90 temperature value 24.5561 K to neon with the isotopic composition recommended by IUPAC, accompanied by a quadratic equation to take the deviations from the reference composition into account. In this paper, the uncertainties of the equation are discussed and an uncertainty budget is presented. The resulting standard uncertainty due to the isotopic effect (k = 1) after correction of the calibration data is reduced to (4 to 40) μK when using neon of ‘natural’ isotopic composition or to 30 μK when using 20Ne. For comparison, an uncertainty component of 0.15 mK should be included in the uncertainty budget for the neon triple point if the isotopic composition is unknown, i.e. whenever the correction cannot be applied.

  12. Principal results of uncertainty and sensibility analysis for generic spanish AGP-granite

    International Nuclear Information System (INIS)

    Bolado, R.; Moya, J.A.

    1998-01-01

    Recently, ENRESA published its Performance Assessment of a deep geologic repository in granite. This paper summarises the main results of the uncertainty and sensitivity analysis performed on the data generated for the main scenario in the ENRESA Performance Assessment. The uncertainty analysis allowed us to determine the most important radionuclides, which were ¹²⁹I, ³⁶Cl, ⁷⁹Se and ¹²⁶Sn, and to estimate upper bounds for the risk due to each one of them and for the global risk. Since ¹²⁹I was the most important radionuclide, the main efforts in the sensitivity study were devoted to studying the most influential parameters on the maximum dose due to that radionuclide. The analysis shows that the order of magnitude of the maximum dose is essentially related to geosphere transport parameters. Nevertheless, the most influential parameters, when considering only the highest values of the maximum doses, are those that control the total amount of contaminant that can be driven into the main path to the biosphere. (Author) 3 refs

  13. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Science.gov (United States)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
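
    Constraint (a) follows from the convexity of typical damage functions: for a convex damage curve, a wider spread of possible warming raises the expected damage even when the central estimate is unchanged (Jensen's inequality). The sketch below is a minimal Monte Carlo illustration under assumed, purely illustrative choices (a cubic damage function and a lognormal warming distribution); it is not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(42)

def damage(warming_c):
    """Illustrative convex damage function: damage grows with the cube of warming."""
    return warming_c ** 3

def expected_damage(median_warming_c, sigma_log, n=200_000):
    """Mean damage over a lognormal distribution of warming (deg C)."""
    warming = rng.lognormal(mean=np.log(median_warming_c), sigma=sigma_log, size=n)
    return damage(warming).mean()

# Same median warming, increasing uncertainty (spread of the distribution):
for sigma in (0.1, 0.3, 0.5):
    print(f"sigma = {sigma:.1f}  ->  E[damage] = {expected_damage(3.0, sigma):,.1f}")
# Expected damage rises with sigma, illustrating constraint (a).
```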

  14. Psychometric properties of the Parents' Perception of Uncertainty in Illness Scale, Spanish version.

    Science.gov (United States)

    Suarez-Acuña, C E; Carvajal-Carrascal, G; Serrano-Gómez, M E

    2018-03-27

    To analyze the psychometric properties of the Parents' Perception of Uncertainty in Illness Scale, parents/children, adapted to Spanish. A descriptive methodological study involving the translation into Spanish of the Parents' Perception of Uncertainty in Illness Scale, parents/children, and analysis of its face validity, content validity, construct validity and internal consistency. The original version of the scale in English was translated into Spanish and approved by its author. In the face validity assessment, six items with comprehension difficulties were reported; these were reviewed and adapted while keeping the scale's structure. The global content validity index based on expert appraisal was 0.94. In the exploratory factor analysis, 3 dimensions were identified: ambiguity and lack of information, unpredictability, and lack of clarity, with a KMO=0.846, which together accounted for 91.5% of the explained variance. The internal consistency of the scale yielded a Cronbach's alpha of 0.86, demonstrating a good level of correlation between items. The Spanish version of the "Parents' Perception of Uncertainty in Illness Scale" is a valid and reliable tool that can be used to determine the level of uncertainty of parents facing the illness of their children. Copyright © 2018 Sociedad Española de Enfermería Intensiva y Unidades Coronarias (SEEIUC). Publicado por Elsevier España, S.L.U. All rights reserved.
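
    The internal-consistency figure reported above (a Cronbach's alpha of 0.86) can be reproduced from a raw respondent-by-item score matrix with the standard formula. The sketch below uses synthetic Likert-type data purely for illustration; only the formula reflects common practice, not the authors' data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                              # number of items
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Synthetic example: 200 respondents answering 5 correlated 1-5 Likert items.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(200, 5))), 1, 5)
print(f"alpha = {cronbach_alpha(items):.2f}")
```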

  15. After the Hague, Bonn and Marrakech: uncertainties on the future international market of emission permits

    International Nuclear Information System (INIS)

    Kitous, A.; Criqui, P.; Blanchard, O.

    2002-01-01

    The purpose of this article is to present a step-by-step economic assessment of the successive developments of the climate change negotiations, from the Kyoto Protocol in 1997 until the agreement reached in Marrakech during the seventh Conference of the Parties (COP 7) in November 2001. The analysis covers the international market of emission rights, a key mechanism of the Protocol, whose purpose is to facilitate the Parties' compliance with their commitments by introducing flexibility to improve the economic efficiency of emission reduction. However, it now appears that despite the Marrakech agreement of November 2001, the system is weakened by the withdrawal of the USA, decided by President G.W. Bush in March 2001 following COP 6 in The Hague, and by a potential excess of permits due to the economic recession of the transition countries since the early nineties ('hot air'). As things stand, the establishment of the market between the countries taking part in the process will undoubtedly require some management of this hot air between the transition countries (Eastern Europe and the former USSR) and the other Annex B Parties still involved in the process. The uncertainties weighing on the future market of emission permits strengthen the strategic significance of the implementation of effective reduction policies within those regions, and particularly within Europe. (authors)

  16. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  17. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  18. Enterprise strategic development under conditions of uncertainty

    Directory of Open Access Journals (Sweden)

    O.L. Truhan

    2016-09-01

    Full Text Available The author points out the need for research on enterprise strategic development under conditions of increased dynamism and uncertainty in the external environment. It is argued that under conditions of external uncertainty it is reasonable to conduct the strategic planning of entities using organizational life cycle models and planning on the basis of disclosure. Any organization has to respond flexibly to external challenges, applying cognitive knowledge about its own business model of development and the ability to mobilize internal working reserves. The article notes that in long-term business activity planning managers use traditional approaches, based on familiar facts and on the assumption that present tendencies will not change essentially in the future. When planning a new, risky business, however, one has to act when prerequisites and assumptions predominate over knowledge. The author argues that under such conditions a powerful tool of enterprise strategic development may be the well-known approach of "planning on the basis of disclosure". The suggested approach helps take into account numerous factors of uncertainty in the external environment, making the strategic planning process maximally adaptable to the conditions of venture business development.

  19. Efficient uncertainty quantification of a fully nonlinear and dispersive water wave model with random inputs

    DEFF Research Database (Denmark)

    Bigoni, Daniele; Engsig-Karup, Allan Peter; Eskilsson, Claes

    2016-01-01

    A major challenge in next-generation industrial applications is to improve numerical analysis by quantifying uncertainties in predictions. In this work we present a formulation of a fully nonlinear and dispersive potential flow water wave model with random inputs for the probabilistic description...... at different points in the parameter space, allowing for the reuse of existing simulation software. The choice of the applied methods is driven by the number of uncertain input parameters and by the fact that finding the solution of the considered model is computationally intensive. We revisit experimental...... benchmarks often used for validation of deterministic water wave models. Based on numerical experiments and assumed uncertainties in boundary data, our analysis reveals that some of the known discrepancies from deterministic simulation in comparison with experimental measurements could be partially explained...

  20. Beyond the International Linear Collider Driven by FEL with Energy Recovery at 5-10TeV

    CERN Document Server

    Hajima, R

    2005-01-01

    The international linear collider (ILC) at the extreme high-energy frontier provides the best hope for scientists to probe the finest structure of matter and its origin, and perhaps even the origin of the Universe. The technology it employs is based on superconducting RF. This technology may usher in a new era for the development of superconducting accelerator technology. On the other hand, the gradient that is allowed in such an accelerator is limited. If one wishes to go beyond this after learning the physics at such high energies (~0.5 TeV) with such technology, one may need a new way to employ superconducting technology to provide high-gradient compact accelerators. Inspired by a former work on 5-TeV colliders based on solid-state terawatt lasers [1], we explore 5-10 TeV linear colliders driven by free-electron lasers equipped with an energy-recovery system. A preliminary design study suggests that a 5-10 TeV collider with a luminosity of 10^34 can be realized by multi-s...

  1. Determination of radionuclides in environmental test items at CPHR: traceability and uncertainty calculation.

    Science.gov (United States)

    Carrazana González, J; Fernández, I M; Capote Ferrera, E; Rodríguez Castro, G

    2008-11-01

    Information about how the laboratory of Centro de Protección e Higiene de las Radiaciones (CPHR), Cuba establishes its traceability to the International System of Units for the measurement of radionuclides in environmental test items is presented. A comparison among different methodologies of uncertainty calculation, including an analysis of the feasibility of using the Kragten-spreadsheet approach, is shown. In the specific case of the gamma spectrometric assay, the influence of each parameter, and the identification of the major contributor, in the relative difference between the methods of uncertainty calculation (Kragten and partial derivative) is described. The reliability of the uncertainty calculation results reported by the commercial software Gamma 2000 from Silena is analyzed.
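
    The Kragten spreadsheet approach mentioned here approximates each sensitivity coefficient numerically, by perturbing one input quantity at a time by its standard uncertainty and observing the change in the result, whereas the classical approach uses analytical partial derivatives. The sketch below compares the two for a generic activity-per-mass measurement model; the model and all numbers are hypothetical and are not the CPHR procedures.

```python
import numpy as np

def activity(net_counts, efficiency, emission_prob, mass_kg):
    """Illustrative measurement model: massic activity (Bq/kg)."""
    return net_counts / (efficiency * emission_prob * mass_kg)

# Hypothetical input estimates and their standard uncertainties.
x = {"net_counts": 1500.0, "efficiency": 0.045, "emission_prob": 0.85, "mass_kg": 0.50}
u = {"net_counts": 40.0,   "efficiency": 0.002, "emission_prob": 0.01, "mass_kg": 0.001}

y0 = activity(**x)

# Kragten: change each input by +u_i, record the resulting change in y.
contrib = {}
for name in x:
    perturbed = dict(x, **{name: x[name] + u[name]})
    contrib[name] = activity(**perturbed) - y0
u_kragten = np.sqrt(sum(c ** 2 for c in contrib.values()))

# Partial derivatives: for this purely multiplicative model the relative
# standard uncertainties combine in quadrature.
u_partial = y0 * np.sqrt(sum((u[name] / x[name]) ** 2 for name in x))

print(f"y = {y0:.0f} Bq/kg   u(Kragten) = {u_kragten:.0f}   u(partial) = {u_partial:.0f}")
for name, c in sorted(contrib.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name:<14} contribution: {c:+.1f} Bq/kg")
```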

  2. Determination of radionuclides in environmental test items at CPHR: Traceability and uncertainty calculation

    International Nuclear Information System (INIS)

    Carrazana Gonzalez, J.; Fernandez, I.M.; Capote Ferrera, E.; Rodriguez Castro, G.

    2008-01-01

    Information about how the laboratory of Centro de Proteccion e Higiene de las Radiaciones (CPHR), Cuba establishes its traceability to the International System of Units for the measurement of radionuclides in environmental test items is presented. A comparison among different methodologies of uncertainty calculation, including an analysis of the feasibility of using the Kragten-spreadsheet approach, is shown. In the specific case of the gamma spectrometric assay, the influence of each parameter, and the identification of the major contributor, in the relative difference between the methods of uncertainty calculation (Kragten and partial derivative) is described. The reliability of the uncertainty calculation results reported by the commercial software Gamma 2000 from Silena is analyzed

  3. Sources of uncertainty in future changes in local precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Rowell, David P. [Met Office Hadley Centre, Exeter (United Kingdom)

    2012-10-15

    This study considers the large uncertainty in projected changes in local precipitation. It aims to map, and begin to understand, the relative roles of uncertain modelling and natural variability, using 20-year mean data from four perturbed physics or multi-model ensembles. The largest - 280-member - ensemble illustrates a rich pattern in the varying contribution of modelling uncertainty, with similar features found using a CMIP3 ensemble (despite its limited sample size, which restricts its value in this context). The contribution of modelling uncertainty to the total uncertainty in local precipitation change is found to be highest in the deep tropics, particularly over South America, Africa, the east and central Pacific, and the Atlantic. In the moist maritime tropics, the highly uncertain modelling of sea-surface temperature changes is transmitted to a large modelling uncertainty in local rainfall changes. Over tropical land and summer mid-latitude continents (and to a lesser extent, the tropical oceans), uncertain modelling of atmospheric processes, land surface processes and the terrestrial carbon cycle all appear to play an additional substantial role in driving the uncertainty of local rainfall changes. In polar regions, inter-model variability of anomalous sea ice drives an uncertain precipitation response, particularly in winter. In all these regions, there is therefore the potential to reduce the uncertainty of local precipitation changes through targeted model improvements and observational constraints. In contrast, over much of the arid subtropical and mid-latitude oceans, over Australia, and over the Sahara in winter, internal atmospheric variability dominates the uncertainty in projected precipitation changes. Here, model improvements and observational constraints will have little impact on the uncertainty of time means shorter than at least 20 years. Last, a supplementary application of the metric developed here is that it can be interpreted as a measure
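
    The partitioning described above, how much of the spread in projected local precipitation change comes from modelling differences versus internal (natural) variability, can be estimated from an ensemble in which each model version contributes several initial-condition members. The sketch below performs a simple one-way variance decomposition on synthetic data; the ensemble sizes and numbers are hypothetical and do not reproduce the study's ensembles.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic ensemble: 20 model variants x 10 initial-condition members each,
# every entry a 20-year-mean local precipitation change (%).
n_models, n_members = 20, 10
model_signal = rng.normal(loc=5.0, scale=4.0, size=(n_models, 1))       # modelling spread
internal = rng.normal(loc=0.0, scale=2.0, size=(n_models, n_members))   # natural variability
dprecip = model_signal + internal

# One-way decomposition of total variance into modelling and internal parts.
var_internal = dprecip.var(axis=1, ddof=1).mean()                 # mean within-model variance
var_model = dprecip.mean(axis=1).var(ddof=1) - var_internal / n_members
var_total = var_model + var_internal

print(f"fraction of uncertainty from modelling:        {var_model / var_total:.2f}")
print(f"fraction from internal (natural) variability:  {var_internal / var_total:.2f}")
```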

  4. A practical approach for the assessment and illustration of uncertainty in emissions modelling: a case study using GAINS Ireland

    International Nuclear Information System (INIS)

    King, Fearghal; Fu, Miao; Kelly, J. Andrew

    2011-01-01

    National outlooks of emission levels are important components of international environmental policymaking and associated national policy development. This is the case for both greenhouse gas emissions and transboundary air pollutants. However, there is uncertainty inherent in the production of forecasts. In the climate context, IPCC guidelines have been established to support national teams in quantifying uncertainty within national inventory reporting of historic emissions. These are presented to indicate the potential range of deviation from reported values and to offer added evidence for policy decisions. However, the method and practice of accounting for uncertainty amongst emission forecasts is both less clear and less common. This paper posits that the role of forecasts in setting international targets and planning policy action renders the management of ‘forecast’ uncertainty as important as addressing uncertainty in the context of inventory and compliance work. Failure to explicitly present uncertainty in forecasting delivers an implicit and misplaced confidence in a given future scenario, irrespective of parallel work on other scenarios and sensitivities. However, it is acknowledged that approaches to uncertainty analyses within the literature are often highly technical and the models used are both computationally demanding and time-intensive. This can limit broader adoption where national capacities are limited and scenario development is frequent. This paper describes an approach to presenting uncertainty, where the aim is to balance the technical and temporal demands of uncertainty estimation against a means of delivering regular and practical estimation and presentation of uncertainty for any given scenario. In turn this methodology should help formalise the recognition of the uncertainty dimension in emissions forecasts, for all stakeholders engaged.

  5. A Business Ecosystem Driven Market Analysis

    DEFF Research Database (Denmark)

    Ma, Zheng; Billanes, Joy Dalmacio; Jørgensen, Bo Nørregaard

    2017-01-01

    Due to the huge globally emerging market for bright green buildings, this paper aims to develop a business-ecosystem driven market analysis approach for the investigation of the bright green building market. This paper develops a five-step business-ecosystem driven market analysis (definition...... of the business domain, stakeholder listing, integration of the value chain, relationship mapping, and ego innovation ecosystem mapping). This paper finds that global-local dynamics influence the market structure: building energy technologies are developed and employed globally......, while market demand is comparatively localized. The market players can be both local and international stakeholders who engage and collaborate in building projects. This paper also finds that building extensibility should be considered in the building design due to the gap between current market...

  6. Interactive Information Service Technology of Tea Industry Based on Demand-Driven

    OpenAIRE

    Shi , Xiaohui; Chen , Tian’en

    2013-01-01

    International audience; Information service technology is a bridge between users and information resources, and it is also the critical factor in weighing the quality of an information service. Focusing on the information service features of the tea industry, this paper emphasizes demand-driven and interactive information services. Taking users and the market as the major criteria for testing the quality of an information service, an interactive information service mode based on demand-driven principles is proposed...

  7. Analyzing climate change impacts on water resources under uncertainty using an integrated simulation-optimization approach

    Science.gov (United States)

    Zhuang, X. W.; Li, Y. P.; Nie, S.; Fan, Y. R.; Huang, G. H.

    2018-01-01

    An integrated simulation-optimization (ISO) approach is developed for assessing climate change impacts on water resources. In the ISO, uncertainties presented as both interval numbers and probability distributions can be reflected. Moreover, ISO permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised water-allocation targets are violated. A snowmelt-precipitation-driven watershed (Kaidu watershed) in northwest China is selected as the study case for demonstrating the applicability of the proposed method. Results of meteorological projections disclose that increasing trends in both temperature (e.g., minimum and maximum values) and precipitation exist. Results also reveal that (i) the system uncertainties would significantly affect the water resources allocation pattern (including targets and shortages); (ii) water shortage would increase from 2016 to 2070; and (iii) the more the inflow amount decreases, the higher the estimated water shortage rates are. The ISO method is useful for evaluating climate change impacts within a watershed system with complicated uncertainties and for helping identify appropriate water resources management strategies hedging against drought.

  8. Uncertainty estimation in nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Guarro, S.B.; Cummings, G.E.

    1989-01-01

    Probabilistic Risk Assessment (PRA) was introduced in the nuclear industry and the nuclear regulatory process in 1975 with the publication of the Reactor Safety Study by the U.S. Nuclear Regulatory Commission. Almost fifteen years later, the state-of-the-art in this field has been expanded and sharpened in many areas, and about thirty-five plant-specific PRAs (Probabilistic Risk Assessments) have been performed by the nuclear utility companies or by the U.S. Nuclear Regulatory Commission. Among the areas where the most evident progress has been made in PRA and PSA (Probabilistic Safety Assessment, as these studies are more commonly referred to in the international community outside the U.S.) is the development of a consistent framework for the identification of sources of uncertainty and the estimation of their magnitude as it impacts various risk measures. Techniques to propagate uncertainty in reliability data through the risk models and display its effect on the top level risk estimates were developed in the early PRAs. The Seismic Safety Margin Research Program (SSMRP) study was the first major risk study to develop an approach to deal explicitly with uncertainty in risk estimates introduced not only by uncertainty in component reliability data, but by the incomplete state of knowledge of the assessor(s) with regard to basic phenomena that may trigger and drive a severe accident. More recently NUREG-1150, another major study of reactor risk sponsored by the NRC, has expanded risk uncertainty estimation and analysis into the realm of model uncertainty related to the relatively poorly known post-core-melt phenomena which determine the behavior of the molten core and of the reactor containment structures
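
    Propagating reliability-data uncertainty through a risk model, as the early PRAs did, usually amounts to sampling the basic-event probabilities from their (typically lognormal) uncertainty distributions and re-evaluating the system model for each sample. The sketch below does this for a hypothetical two-train system with a common-cause contribution; the structure and parameters are illustrative and are not taken from any plant study.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Epistemic uncertainty on basic-event probabilities, represented as lognormals.
p_train = rng.lognormal(mean=np.log(1e-2), sigma=0.7, size=n)  # one train fails on demand
p_ccf = rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=n)    # common-cause failure

# System model (rare-event approximation): both independent trains fail, or CCF occurs.
p_system = p_train ** 2 + p_ccf

p5, p50, p95 = np.percentile(p_system, [5, 50, 95])
print(f"mean   = {p_system.mean():.2e}")
print(f"median = {p50:.2e}   90% interval = ({p5:.2e}, {p95:.2e})")
```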

  9. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  10. COMMUNICATING THE PARAMETER UNCERTAINTY IN THE IQWIG EFFICIENCY FRONTIER TO DECISION-MAKERS

    Science.gov (United States)

    Stollenwerk, Björn; Lhachimi, Stefan K; Briggs, Andrew; Fenwick, Elisabeth; Caro, Jaime J; Siebert, Uwe; Danner, Marion; Gerber-Grote, Andreas

    2015-01-01

    The Institute for Quality and Efficiency in Health Care (IQWiG) developed—in a consultation process with an international expert panel—the efficiency frontier (EF) approach to satisfy a range of legal requirements for economic evaluation in Germany's statutory health insurance system. The EF approach is distinctly different from other health economic approaches. Here, we evaluate established tools for assessing and communicating parameter uncertainty in terms of their applicability to the EF approach. Among these are tools that perform the following: (i) graphically display overall uncertainty within the IQWiG EF (scatter plots, confidence bands, and contour plots) and (ii) communicate the uncertainty around the reimbursable price. We found that, within the EF approach, most established plots were not always easy to interpret. Hence, we propose the use of price reimbursement acceptability curves—a modification of the well-known cost-effectiveness acceptability curves. Furthermore, it emerges that the net monetary benefit allows an intuitive interpretation of parameter uncertainty within the EF approach. This research closes a gap for handling uncertainty in the economic evaluation approach of the IQWiG methods when using the EF. However, the precise consequences of uncertainty when determining prices are yet to be defined. © 2014 The Authors. Health Economics published by John Wiley & Sons Ltd. PMID:24590819
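
    A cost-effectiveness acceptability curve, which the proposed price reimbursement acceptability curve adapts, is built by computing, for each candidate willingness-to-pay threshold, the share of probabilistic sensitivity analysis samples in which the net monetary benefit is positive. The sketch below uses hypothetical incremental cost and effect samples; it illustrates the generic construction only, not the IQWiG efficiency-frontier calculation.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000

# Hypothetical PSA samples: incremental cost (EUR) and incremental effect (QALYs).
d_cost = rng.normal(8_000, 2_500, n)
d_effect = rng.normal(0.25, 0.10, n)

# For each threshold, the probability that the intervention is acceptable.
for wtp in (0, 25_000, 50_000, 75_000, 100_000):
    prob_acceptable = np.mean(wtp * d_effect - d_cost > 0)  # net monetary benefit > 0
    print(f"threshold {wtp:>7,d} EUR/QALY -> P(acceptable) = {prob_acceptable:.2f}")
```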

  11. Studies on battery storage requirement of PV fed wind-driven induction generators

    International Nuclear Information System (INIS)

    Rajan Singaravel, M.M.; Arul Daniel, S.

    2013-01-01

    Highlights: ► Sizing of battery storage for PV fed wind-driven IG system is taken up. ► Battery storage is also used to supply reactive power for wind-driven IG. ► Computation of LPSP by incorporating uncertainties of irradiation and wind speed. ► Sizing of hybrid power system components to ensure zero LPSP. ► Calculated storage size satisfied the constraints and improves battery life. - Abstract: Hybrid stand-alone renewable energy systems based on wind–solar resources are considered to be economically better and reliable than stand-alone systems with a single source. An isolated hybrid wind–solar system has been considered in this work, where the storage (battery bank) is necessary to supply the required reactive power for a wind-driven induction generator (IG) during the absence of power from a photovoltaic (PV) array. In such a scheme, to ensure zero Loss of Power Supply Probability (LPSP) and to improve battery bank life, a sizing procedure has been proposed with the incorporation of uncertainties in wind-speed and solar-irradiation level at the site of erection of the plant. Based on the proposed procedure, the size of hybrid power system components and storage capacity are determined. Storage capacity has been calculated for two different requirements. The first requirement of storage capacity is common to any hybrid scheme, which is; to supply both real and reactive power in the absence of wind and solar sources. The second requirement is to supply reactive power alone for the IG during the absence of photovoltaic power, which is unique to the hybrid scheme considered in this work. Storage capacity calculations for different conditions using the proposed approach, satisfies the constraints of maintaining zero LPSP and also improved cycle life of the battery bank
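
    Loss of Power Supply Probability (LPSP) is the fraction of time steps in which generation plus storage cannot meet the load; the sizing procedure then searches for the smallest battery capacity that brings LPSP to zero over the simulated series. The sketch below shows a minimal hourly energy-balance LPSP calculation with hypothetical profiles and efficiencies; the reactive-power support of the induction generator, which is central to the paper's scheme, is not modelled here.

```python
import numpy as np

def lpsp(pv_kw, wind_kw, load_kw, battery_kwh, eff=0.9, dt_h=1.0):
    """Loss of Power Supply Probability over an hourly generation/load series."""
    soc = battery_kwh          # start with a full battery (kWh)
    unmet_steps = 0
    for pv, wind, load in zip(pv_kw, wind_kw, load_kw):
        balance = (pv + wind - load) * dt_h
        if balance >= 0:
            soc = min(battery_kwh, soc + balance * eff)  # store the surplus
        else:
            need = -balance / eff                        # energy to draw from storage
            if soc >= need:
                soc -= need
            else:
                unmet_steps += 1                         # load not fully supplied
                soc = 0.0
    return unmet_steps / len(load_kw)

# Hypothetical one-week hourly profiles (kW).
rng = np.random.default_rng(7)
hours = 24 * 7
pv = np.clip(np.sin(np.linspace(0, 14 * np.pi, hours)), 0, None) * 3 * rng.uniform(0.5, 1.0, hours)
wind = rng.weibull(2.0, hours) * 1.5
load = 2.0 + rng.uniform(0.0, 1.0, hours)

for capacity_kwh in (5, 10, 20, 40):
    print(f"battery {capacity_kwh:>3} kWh -> LPSP = {lpsp(pv, wind, load, capacity_kwh):.3f}")
```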

  12. Traceability and uncertainty estimation in coordinate metrology

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Savio, Enrico; De Chiffre, Leonardo

    2001-01-01

    National and international standards have defined performance verification procedures for coordinate measuring machines (CMMs) that typically involve their ability to measure calibrated lengths and to a certain extent form. It is recognised that, without further analysis or testing, these results...... are required. Depending on the requirements for uncertainty level, different approaches may be adopted to achieve traceability. Especially in the case of complex measurement situations and workpieces the procedures are not trivial. This paper discusses the establishment of traceability in coordinate metrology...

  13. Psychometric Properties of the Intolerance of Uncertainty Scale (IUS) in a Lithuanian-speaking population

    Directory of Open Access Journals (Sweden)

    Augustinas Rotomskis

    2014-03-01

    Full Text Available Research suggests that intolerance of uncertainty may be important in understanding worry and may play a key role in the etiology and maintenance of worry. Intolerance of uncertainty is measured using the Intolerance of Uncertainty Scale (IUS), which has been shown to be reliable and valid in many studies. The aim of the present study was to develop a Lithuanian version of this instrument. 228 university students completed the scale. The Lithuanian version of the IUS was found to have good psychometric properties. The IUS showed high internal consistency and good test-retest reliability over a five-week period, and good convergent and divergent validity when assessed with measures of trait anxiety, situational anxiety, and depression. Factor analysis indicated that the IUS has a two-factor structure that represents the beliefs that “uncertainty about the future is unfair” and that “uncertainty has negative behavioral and self-referent implications”. In conclusion, it was found that the Lithuanian version of the IUS is a sound scale for assessing intolerance of uncertainty.

  14. Hydrodynamic analysis of laser-driven cylindrical implosions

    Energy Technology Data Exchange (ETDEWEB)

    Ramis, R. [E.T.S.I. Aeronáuticos, Universidad Politécnica de Madrid (Spain)

    2013-08-15

    Three-dimensional hydrodynamic simulations are performed to study laser-driven cylindrical implosions in the context of experiments (F. Perez et al., Plasma Phys. Controlled Fusion 51, 124035 (2009)) carried out at the Rutherford Appleton Laboratory in the framework of the HiPER project. The analysis is carried out by using the 3D version of the hydrocode MULTI (R. Ramis et al., Comput. Phys. Commun. 49, 475-505 (1988)). The influence of the main laser parameters on implosion performance and symmetry is consistently studied and compared with the results of 2D analysis. Furthermore, the effects of uncertainties in laser irradiation (pointing, focusing, power balance, and time jitter) on implosion performance (average peak density and temperature) are studied by means of statistical analysis.

  15. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.

  16. Negotiated risks. International talks on hazardous issues

    International Nuclear Information System (INIS)

    Avenhaus, Rudolf; Sjoestedt, Gunnar

    2009-01-01

    This book fills a major gap in the risk literature, as it brings together two research strands: risks, to which IIASA's research programs have contributed significantly over the years, culminating in the Risk and Vulnerability Program, and international negotiations, on which there is an abundance of published work, much of it resulting from the work of IIASA's Processes of International Negotiations Program. Throughout the book, it is pointed out that there are actor-driven risks, namely those posed by international negotiations themselves, and issue-driven risks which are caused by large-scale human activities. In fact, negotiated risks deal with some of the most serious risks facing humanity: climate change, nuclear activities, and weapons of mass destruction. The book contains both scientific analyses on the nature of internationally negotiated risks and analyses of concrete risks, both of which are of immense practical relevance in the larger context of international negotiations. (orig.)

  17. Kofi Annan, Syria and the Uses of Uncertainty in Mediation

    Directory of Open Access Journals (Sweden)

    Richard Gowan

    2013-03-01

    Full Text Available One year after Kofi Annan presented his six-point plan for ending the Syrian civil war, it can only be called a failure. But it is necessary to recall the situation facing the UN-Arab League envoy and his team in early 2012. The Syrian conflict had created serious tensions between the major powers. A Western military intervention appeared unlikely but could not be ruled out with absolute certainty. This commentary contends that Annan’s initial priority was to reduce the level of uncertainty inside and outside Syria, thereby creating a framework for political talks.  However, in lowering the level of uncertainty, Annan reduced his own leverage as the Syrian government correctly concluded that it would not be punished for failing to cooperate in good faith.  The commentary concludes that there are occasions where it is advisable for international mediators to maintain and exploit a degree of uncertainty about how a conflict may develop.

  18. Testing methodologies for quantifying physical models uncertainties. A comparative exercise using CIRCE and IPREM (FFTBM)

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, Jordi, E-mail: jordi.freixa-terradas@upc.edu; Alfonso, Elsa de, E-mail: elsa.de.alfonso@upc.edu; Reventós, Francesc, E-mail: francesc.reventos@upc.edu

    2016-08-15

    Highlights: • Uncertainty of physical models is a key issue in best estimate plus uncertainty analysis. • Estimation of uncertainties of physical models of thermal hydraulics system codes. • Comparison of CIRCÉ and FFTBM methodologies. • Simulation of reflood experiments in order to evaluate uncertainty of physical models related to the reflood scenario. - Abstract: The increasing importance of Best-Estimate Plus Uncertainty (BEPU) analyses in nuclear safety and licensing processes has led to several international activities. The latest findings highlighted the uncertainties of physical models as one of the most controversial aspects of BEPU. This type of uncertainty is an important contributor to the total uncertainty of NPP BE calculations. Due to the complexity of estimating this uncertainty, it is often assessed solely by engineering judgment. The present study comprises a comparison of two different state-of-the-art methodologies, CIRCÉ and IPREM (FFTBM), capable of quantifying the uncertainty of physical models. Similarities and differences of their results are discussed through the observation of probability distribution functions and envelope calculations. In particular, the analyzed scenario is core reflood. Experimental data from the FEBA and PERICLES test facilities are employed, while the thermal hydraulic simulations are carried out with RELAP5/mod3.3. This work is undertaken under the framework of the PREMIUM (Post-BEMUSE Reflood Model Input Uncertainty Methods) benchmark.

  19. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using....... The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column......, the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce most critical uncertainties....

  20. Observational uncertainty and regional climate model evaluation: A pan-European perspective

    Science.gov (United States)

    Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark. A.; Lussana, Cristian; Szepszo, Gabriella

    2017-04-01

    Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs) observational data can influence model development itself and, later on, model evaluation, parameter calibration and added value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results in a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational reference grids, namely (1) the well-established EOBS dataset (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that recently became available. In terms of climate models five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region and the specific performance metric considered. Over most parts of the continent, the influence of the choice of the reference dataset for temperature is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty. For

  1. Human amplification of drought-driven fire in tropical regions

    Science.gov (United States)

    Tosca, Michael

    2015-04-01

    The change in globally-measured radiative forcing from the pre-industrial to the present due to interactions between aerosol particles and cloud cover has the largest uncertainty of all anthropogenic factors. Uncertainties are largest in the tropics, where total cloud amount and incoming solar radiation are highest, and where 50% of all aerosol emissions originate from anthropogenic fire. It is well understood that interactions between smoke particles and cloud droplets modify cloud cover, which in turn affects climate; however, few studies have observed the temporal nature of aerosol-cloud interactions without the use of a model. Here we apply a novel approach to measure the effect of fire aerosols on convective clouds in tropical regions (Brazil, Africa and Indonesia) through a combination of remote sensing and meteorological data. We attribute a reduction in cloud fraction during periods of high aerosol optical depths to a smoke-driven inhibition of convection. We find that higher smoke burdens limit vertical updrafts, increase surface pressure, and increase low-level divergence, all meteorological indicators of convective suppression. These results are corroborated by climate model simulations that show a smoke-driven increase in regionally averaged shortwave tropospheric heating and boundary layer stratification, and a decrease in vertical velocity and precipitation during the fire season (December-February). We then quantify the human response to decreased cloud cover using a combination of socioeconomic and climate data. Our results suggest that, in tropical regions, anthropogenic fire initiates a positive feedback loop where increased aerosol emissions limit convection, dry the surface and enable increased fire activity via human ignition. This result has far-reaching implications for fire management and climate policy in emerging countries along the equator that utilize fire.

  2. Accelerator-driven molten-salt blankets: Physics issues

    International Nuclear Information System (INIS)

    Houts, M.G.; Beard, C.A.; Buksa, J.J.; Davidson, J.W.; Durkee, J.W.; Perry, R.T.; Poston, D.I.

    1994-01-01

    A number of nuclear physics issues concerning the Los Alamos molten-salt accelerator-driven plutonium converter are discussed. General descriptions of several concepts using internal and external moderation are presented. Burnup and salt processing requirement calculations are presented for four concepts, indicating that both the high power density externally moderated concept and an internally moderated concept achieve total plutonium burnups approaching 90% at salt processing rates of less than 2 m³ per year. Beginning-of-life reactivity temperature coefficients and system kinetic response are also discussed. Future research should investigate the effect of changing blanket composition on operational and safety characteristics

  3. Accelerator-driven molten-salt blankets: Physics issues

    International Nuclear Information System (INIS)

    Houts, M.G.; Beard, C.A.; Buksa, J.J.; Davidson, J.W.; Durkee, J.W.; Perry, R.T.; Poston, D.I.

    1994-01-01

    A number of nuclear physics issues concerning the Los Alamos molten-salt accelerator-driven plutonium converter are discussed. General descriptions of several concepts using internal and external moderation are presented. Burnup and salt processing requirement calculations are presented for four concepts, indicating that both the high power density externally moderated concept and an internally moderated concept achieve total plutonium burnups approaching 90% at salt processing rates of less than 2 m³ per year. Beginning-of-life reactivity temperature coefficients and system kinetic response are also discussed. Future research should investigate the effect of changing blanket composition on operational and safety characteristics

  4. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bostelmann, F. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties) extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on
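
    A common sampling-based way to rank sources of uncertainty, in the spirit of the SA discussed here, is to perturb the uncertain inputs jointly, evaluate the model for each draw, and rank inputs by a correlation-based sensitivity measure. The sketch below uses a toy linear response standing in for a coupled HTGR calculation; the input names, uncertainties and coefficients are hypothetical.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(9)
n = 5_000

# Toy uncertain inputs: relative perturbations around nominal values (1-sigma spreads).
inputs = {
    "capture_xs": rng.normal(0.0, 0.02, n),
    "fission_xs": rng.normal(0.0, 0.01, n),
    "graphite_density": rng.normal(0.0, 0.005, n),
}

# Toy response standing in for k-effective from a coupled calculation.
k_eff = (1.02
         - 0.8 * inputs["capture_xs"]
         + 1.2 * inputs["fission_xs"]
         + 0.3 * inputs["graphite_density"]
         + rng.normal(0.0, 0.001, n))

# Rank inputs by the magnitude of their Spearman correlation with the response.
ranking = sorted(((abs(spearmanr(values, k_eff)[0]), name) for name, values in inputs.items()),
                 reverse=True)
for rho, name in ranking:
    print(f"{name:<18} |rho| = {rho:.2f}")
```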

  5. Insurance Applications of Active Fault Maps Showing Epistemic Uncertainty

    Science.gov (United States)

    Woo, G.

    2005-12-01

    Insurance loss modeling for earthquakes utilizes available maps of active faulting produced by geoscientists. All such maps are subject to uncertainty, arising from lack of knowledge of fault geometry and rupture history. Field work to undertake geological fault investigations drains human and monetary resources, and this inevitably limits the resolution of fault parameters. Some areas are more accessible than others; some may be of greater social or economic importance than others; some areas may be investigated more rapidly or diligently than others; or funding restrictions may have curtailed the extent of the fault mapping program. In contrast with the aleatory uncertainty associated with the inherent variability in the dynamics of earthquake fault rupture, uncertainty associated with lack of knowledge of fault geometry and rupture history is epistemic. The extent of this epistemic uncertainty may vary substantially from one regional or national fault map to another. However aware the local cartographer may be, this uncertainty is generally not conveyed in detail to the international map user. For example, an area may be left blank for a variety of reasons, ranging from lack of sufficient investigation of a fault to lack of convincing evidence of activity. Epistemic uncertainty in fault parameters is of concern in any probabilistic assessment of seismic hazard, not least in insurance earthquake risk applications. A logic-tree framework is appropriate for incorporating epistemic uncertainty. Some insurance contracts cover specific high-value properties or transport infrastructure, and therefore are extremely sensitive to the geometry of active faulting. Alternative Risk Transfer (ART) to the capital markets may also be considered. In order for such insurance or ART contracts to be properly priced, uncertainty should be taken into account. Accordingly, an estimate is needed for the likelihood of surface rupture capable of causing severe damage. Especially where a
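
    A logic-tree treats epistemic uncertainty by enumerating alternative, mutually exclusive interpretations of the fault (for example its slip rate and maximum magnitude), assigning each branch a weight, and combining the branch results into a weighted distribution of hazard or loss. The sketch below is a minimal illustration; the branch values, weights and recurrence model are purely hypothetical.

```python
from itertools import product

# Alternative interpretations of a poorly known fault, each with an expert weight.
slip_rate_mm_yr = [(0.5, 0.3), (1.0, 0.5), (2.0, 0.2)]   # (value, weight)
max_magnitude = [(6.5, 0.4), (7.0, 0.6)]

def annual_rupture_rate(slip_rate, mmax):
    """Toy recurrence model: rate of damaging surface rupture (per year)."""
    return slip_rate * 1e-3 / (mmax - 5.0)   # illustrative only

# Enumerate all logic-tree branches and accumulate the weighted mean and spread.
branches = []
for (slip, w1), (mmax, w2) in product(slip_rate_mm_yr, max_magnitude):
    branches.append((annual_rupture_rate(slip, mmax), w1 * w2))

mean_rate = sum(rate * w for rate, w in branches)
low = min(rate for rate, _ in branches)
high = max(rate for rate, _ in branches)
print(f"weighted mean rate: {mean_rate:.2e}/yr  (epistemic range {low:.2e} to {high:.2e})")
```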

  6. Mechanics of Interrill Erosion with Wind-Driven Rain (WDR)

    Science.gov (United States)

    This article provides an evaluation analysis for the performance of the interrill component of the Water Erosion Prediction Project (WEPP) model for Wind-Driven Rain (WDR) events. The interrill delivery rates (Di) were collected in the wind tunnel rainfall simulator facility of the International Cen...

  7. The strategy of parallel approaches in projects with unforeseeable uncertainty: the Manhattan case in retrospect

    OpenAIRE

    Sylvain Lenfle

    2011-01-01

    International audience; This paper discusses the literature on the management of projects with unforeseeable uncertainty. Recent work demonstrates that, when confronted with unforeseeable uncertainties, managers can adopt either a learning, trial-and-error-based strategy, or a parallel approach. In the latter, different solutions are developed in parallel and the best one is chosen when enough information becomes available. Studying the case of the Manhattan Project, which historically exempl...

  8. Communicating the Uncertainty in Greenhouse Gas Emissions from Agriculture

    Science.gov (United States)

    Milne, Alice; Glendining, Margaret; Perryman, Sarah; Whitmore, Andy

    2014-05-01

    inventory. Box plots were favoured by a majority of our participants but this result was driven by those with a better understanding of maths. We concluded that the methods chosen to communicate uncertainty in greenhouse gas emissions should be influenced by professional and mathematical background of the end-user. We propose that boxplots annotated with summary statistics such as mean, median, 2.5th and 97.5th percentiles provide a sound method for communicating uncertainty to research scientists as these individuals tend to be familiar with these methods. End-users from other groups may not be so familiar with these methods and so a combination of intuitive methods such as calibrated phrases and shaded arrays with numerate methods would be better suited. Ideally these individuals should be presented with the intuitive qualitative methods with the option to consider a more quantitative description, perhaps presented in an appendix.
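
    A boxplot annotated with the summary statistics recommended above (mean, median, 2.5th and 97.5th percentiles) can be produced directly from a sample of emission estimates. The sketch below uses a hypothetical Monte Carlo sample of an agricultural emission source; the distribution and numbers are illustrative only.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical Monte Carlo sample of an emission estimate (kt CO2-eq).
rng = np.random.default_rng(3)
emissions = rng.lognormal(mean=np.log(120), sigma=0.35, size=5_000)

p2_5, median, p97_5 = np.percentile(emissions, [2.5, 50, 97.5])
mean = emissions.mean()

fig, ax = plt.subplots(figsize=(4, 5))
# Whiskers drawn at the 2.5th and 97.5th percentiles, mean marked explicitly.
ax.boxplot(emissions, whis=(2.5, 97.5), showmeans=True, labels=["Agricultural N2O"])
ax.set_ylabel("Emissions (kt CO2-eq)")
ax.set_title(f"mean = {mean:.0f}, median = {median:.0f},\n2.5% = {p2_5:.0f}, 97.5% = {p97_5:.0f}")
fig.tight_layout()
fig.savefig("emission_uncertainty_boxplot.png")
```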

  9. (Un)certainty in climate change impacts on global energy consumption

    Science.gov (United States)

    van Ruijven, B. J.; De Cian, E.; Sue Wing, I.

    2017-12-01

    Climate change is expected to have an influence on the energy sector, especially on energy demand. For many locations, this change in energy demand is a balance between an increase in demand for space cooling and a decrease in space heating demand. We perform a large-scale uncertainty analysis to characterize climate change risk on energy consumption as driven by climate and socioeconomic uncertainty. We combine a dynamic econometric model [1] with multiple realizations of temperature projections from all 21 CMIP5 models (from the NASA Earth Exchange Global Daily Downscaled Projections [2]) under moderate (RCP4.5) and vigorous (RCP8.5) warming. Global spatial population projections for five SSPs are combined with GDP projections to construct scenarios for future energy demand driven by socioeconomic change. Between the climate models, we find a median global increase in climate-related energy demand of around 24% by 2050 under RCP8.5 with an interquartile range of 18-38%. Most climate models agree on increases in energy demand of more than 25% or 50% in tropical regions, the Southern USA and Southern China (see Figure). With respect to socioeconomic scenarios, we find wide variations between the SSPs for the number of people in low-income countries who are exposed to increases in energy demand. Figure attached: Number of models that agree on total climate-related energy consumption to increase or decrease by more than 0, 10, 25 or 50% by 2050 under RCP8.5 and SSP5 as a result of the CMIP5 ensemble of temperature projections. References: 1. De Cian, E. & Sue Wing, I. Global Energy Demand in a Warming Climate. (FEEM, 2016). 2. Thrasher, B., Maurer, E. P., McKellar, C. & Duffy, P. B. Technical Note: Bias correcting climate model simulated daily temperature extremes with quantile mapping. Hydrol Earth Syst Sci 16, 3309-3314 (2012).

  10. Absolute frequency list of the ν3-band transitions of methane at a relative uncertainty level of 10⁻¹¹.

    Science.gov (United States)

    Okubo, Sho; Nakayama, Hirotaka; Iwakuni, Kana; Inaba, Hajime; Sasada, Hiroyuki

    2011-11-21

    We determine the absolute frequencies of 56 rotation-vibration transitions of the ν₃ band of CH₄ from 88.2 to 90.5 THz with a typical uncertainty of 2 kHz corresponding to a relative uncertainty of 2.2 × 10⁻¹¹ over an average time of a few hundred seconds. Saturated absorption lines are observed using a difference-frequency-generation source and a cavity-enhanced absorption cell, and the transition frequencies are measured with a fiber-laser-based optical frequency comb referenced to a rubidium atomic clock linked to the international atomic time. The determined value of the P(7) F₂⁽²⁾ line is consistent with the International Committee for Weights and Measures recommendation within the uncertainty. © 2011 Optical Society of America
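
    The relative uncertainty quoted above is simply the absolute frequency uncertainty divided by the transition frequency; a quick check of the abstract's numbers:

```python
absolute_uncertainty_hz = 2e3     # typical uncertainty of ~2 kHz per transition
transition_frequency_hz = 90e12   # the nu3-band lines lie near 90 THz

print(f"relative uncertainty ~ {absolute_uncertainty_hz / transition_frequency_hz:.1e}")
# ~2.2e-11, consistent with the value given in the abstract
```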

  11. Learning together, growing apart. Global warming, energy policy and international trust

    Energy Technology Data Exchange (ETDEWEB)

    Kydd, Andrew H. [Department of Political Science, University of Wisconsin, 110 North Hall, 1050 Bascom Mall, Madison, WI 53706 (United States)

    2010-06-15

    Standard models of uncertainty in economics imply that sharing information can reduce uncertainty and help identify welfare improving policies. In international relations, 'epistemic communities' of scientists are thought to help provide information for these purposes. However, conflicting preferences can frustrate the transmission of information and prevent effective information sharing. In addition, opportunities for information sharing can deepen distrust as actors observe each other's reaction to what to them is credible information. A model that assumes uncertainty both about the state of the world and the parties' motivations is applied to international climate change negotiations. (author)

  12. Learning together, growing apart: Global warming, energy policy and international trust

    Energy Technology Data Exchange (ETDEWEB)

    Kydd, Andrew H., E-mail: akydd@sas.upenn.ed [Department of Political Science, University of Wisconsin, 110 North Hall, 1050 Bascom Mall, Madison, WI 53706 (United States)

    2010-06-15

    Standard models of uncertainty in economics imply that sharing information can reduce uncertainty and help identify welfare improving policies. In international relations, 'epistemic communities' of scientists are thought to help provide information for these purposes. However, conflicting preferences can frustrate the transmission of information and prevent effective information sharing. In addition, opportunities for information sharing can deepen distrust as actors observe each other's reaction to what to them is credible information. A model that assumes uncertainty both about the state of the world and the parties' motivations is applied to international climate change negotiations.

  13. Uncertainties in Steric Sea Level Change Estimation During the Satellite Altimeter Era: Concepts and Practices

    Science.gov (United States)

    MacIntosh, C. R.; Merchant, C. J.; von Schuckmann, K.

    2017-01-01

    This article presents a review of current practice in estimating steric sea level change, focussed on the treatment of uncertainty. Steric sea level change is the contribution to the change in sea level arising from the dependence of density on temperature and salinity. It is a significant component of sea level rise and a reflection of changing ocean heat content. However, tracking these steric changes still remains a significant challenge for the scientific community. We review the importance of understanding the uncertainty in estimates of steric sea level change. Relevant concepts of uncertainty are discussed and illustrated with the example of observational uncertainty propagation from a single profile of temperature and salinity measurements to steric height. We summarise and discuss the recent literature on methodologies and techniques used to estimate steric sea level in the context of the treatment of uncertainty. Our conclusions are that progress in quantifying steric sea level uncertainty will benefit from: greater clarity and transparency in published discussions of uncertainty, including exploitation of international standards for quantifying and expressing uncertainty in measurement; and the development of community "recipes" for quantifying the error covariances in observations and from sparse sampling and for estimating and propagating uncertainty across spatio-temporal scales.
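
    As a simplified illustration of the single-profile propagation discussed above, the steric height anomaly can be written as a vertical integral of density anomalies driven by temperature and salinity; with constant, assumed expansion and contraction coefficients and independent level-by-level measurement errors, the uncertainties combine in quadrature. The sketch below is illustrative only: real calculations use a full equation of state and must account for error correlation between levels.

```python
import numpy as np

# Hypothetical constant expansion/contraction coefficients (per deg C, per psu).
alpha, beta = 2.0e-4, 7.6e-4

# Hypothetical profile: layer thicknesses (m), T and S anomalies, and 1-sigma uncertainties.
dz = np.array([10.0, 20.0, 50.0, 100.0, 200.0])
dT = np.array([0.50, 0.40, 0.20, 0.10, 0.05])    # deg C
dS = np.array([0.02, 0.02, 0.01, 0.01, 0.00])    # psu
u_T = np.full_like(dT, 0.01)                     # thermometer uncertainty (deg C)
u_S = np.full_like(dS, 0.005)                    # salinity uncertainty (psu)

# Steric height anomaly: integral of (alpha*dT - beta*dS) over depth.
steric_height = np.sum((alpha * dT - beta * dS) * dz)

# Propagate independent level-by-level uncertainties in quadrature.
u_steric = np.sqrt(np.sum(((alpha * u_T) ** 2 + (beta * u_S) ** 2) * dz ** 2))

print(f"steric height anomaly = {steric_height * 1000:.2f} mm +/- {u_steric * 1000:.2f} mm (k=1)")
```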

  14. Laser driven single shock compression of fluid deuterium from 45 to 220 GPa

    Energy Technology Data Exchange (ETDEWEB)

    Hicks, D; Boehly, T; Celliers, P; Eggert, J; Moon, S; Meyerhofer, D; Collins, G

    2008-03-23

    The compression η of liquid deuterium between 45 and 220 GPa under laser-driven shock loading has been measured using impedance matching to an aluminum (Al) standard. An Al impedance match model derived from a best fit to absolute Hugoniot data has been used to quantify and minimize the systematic errors caused by uncertainties in the high-pressure Al equation of state. In deuterium below 100 GPa results show that η ≈ 4.2, in agreement with previous impedance match data from magnetically-driven flyer and convergent-explosive shock wave experiments; between 100 and 220 GPa η reaches a maximum of ≈5.0, less than the 6-fold compression observed on the earliest laser-shock experiments but greater than expected from simple extrapolations of lower pressure data. Previous laser-driven double-shock results are found to be in good agreement with these single-shock measurements over the entire range under study. Both sets of laser-shock data indicate that deuterium undergoes an abrupt increase in compression at around 110 GPa.

  15. [Status Quo, Uncertainties and Trends Analysis of Environmental Risk Assessment for PFASs].

    Science.gov (United States)

    Hao, Xue-wen; Li, Li; Wang, Jie; Cao, Yan; Liu, Jian-guo

    2015-08-01

    This study systematically reviews the definitions, categories, and applications of perfluoroalkyl and polyfluoroalkyl substances (PFASs) in the international literature, focusing on the environmental risk and exposure assessment of PFASs, in order to analyze the current status, uncertainties and trends of PFAS environmental risk assessment. Overall, the risk assessment of PFASs faces a complicated situation involving complex substance pedigrees, numerous types, complex derivative relations, confidential business information and risk uncertainties. Although the environmental risk of long-chain PFASs has been widely recognized, many research gaps and uncertainties remain for the short-chain PFASs and short-chain fluorotelomers used as their alternatives, concerning environmental hazards, environmental fate and exposure risk. The appropriate scope of PFAS risk control in the international community is still open to discussion. Because of trade secrets and market competition, the chemical structures and risk information of PFAS alternatives generally lack openness and transparency, and the environmental risk of most fluorinated and non-fluorinated alternatives is not clear. In general, international research on PFAS risk assessment is gradually shifting from the long-chain perfluoroalkyl acids (PFAAs) represented by perfluorooctane sulfonic acid (PFOS) and perfluorooctanoic acid (PFOA) to short-chain PFAAs, and is extending further to other PFASs. The main problems requiring urgent and continued research are: the environmental hazard assessment indexes, such as bioaccumulation and environmental migration, and methods for their optimization; the environmental release and multimedia environmental fate of short-chain PFASs; the environmental fate of neutral PFASs and their transformation into, and contribution as precursors to, short-chain PFASs; and the risk identification and assessment of fluorinated and non-fluorinated alternatives to PFASs.

  16. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges.

  17. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    Science.gov (United States)

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search, and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  18. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
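
    The conditional majorization relation defined in the paper is measure-independent and involves memory; as a much simpler illustration of the underlying idea, the sketch below checks ordinary (unconditional) majorization between two probability vectors, the partial order in which a 'more certain' distribution majorizes a 'less certain' one. The function name and example values are our own.

    ```python
    import numpy as np

    def majorizes(p, q, tol=1e-12):
        """Return True if probability vector p majorizes q.

        p majorizes q when, after sorting both in decreasing order, every partial
        sum of p is at least the corresponding partial sum of q (total sums being
        equal). Intuitively, p is 'less uncertain' than q.
        """
        p = np.sort(np.asarray(p, dtype=float))[::-1]
        q = np.sort(np.asarray(q, dtype=float))[::-1]
        if abs(p.sum() - q.sum()) > tol:
            raise ValueError("inputs must have equal total probability")
        return bool(np.all(np.cumsum(p) >= np.cumsum(q) - tol))

    # A sharp distribution majorizes the uniform one (it is 'more certain').
    print(majorizes([0.7, 0.2, 0.1], [1/3, 1/3, 1/3]))  # True
    print(majorizes([1/3, 1/3, 1/3], [0.7, 0.2, 0.1]))  # False
    ```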

  19. Phenomenological uncertainty analysis of early containment failure at severe accident of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Su Won

    2011-02-15

    The severe accident involves inherently large uncertainty because of the wide range of conditions, and performing experiments, validation and practical application is extremely difficult because of the high temperatures and pressures involved. Although domestic and international research has been carried out, the references used in Korean nuclear plants have been foreign data from the 1980s, and safety analysis such as the probabilistic safety assessment has not applied the newest methodology. In addition, the containment pressure used to identify the probability of containment failure in level 2 PSA is taken as a point value from the thermal-hydraulic analysis. In this paper, uncertainty analysis methods for the severe-accident phenomena influencing early containment failure were developed, an uncertainty analysis of Korean nuclear plants was performed using the MELCOR code, and the distribution of containment pressure is presented as the result of the uncertainty analysis. Early containment failure was selected in this paper among the various modes of containment failure because it is an important contributor to the Large Early Release Frequency (LERF), which is used as a representative criterion for decision-making in nuclear power plants. Important phenomena of early containment failure during severe accidents were identified on the basis of previous research, and a seven-step methodology for evaluating uncertainty was developed. A MELCOR input for analysis of the severe accident reflecting natural circulation flow was developed, and station blackout, the representative initiating event for early containment failure, was chosen as the accident scenario. By reviewing the internal models and correlations of the MELCOR models relevant to the important phenomena of early containment failure, the factors that could affect the uncertainty were identified, and the major factors were finally selected through sensitivity analysis. In order to determine the total number of MELCOR calculations which can
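
    A widely used way to determine the number of code runs needed for a given percentile and confidence level, which may or may not be the approach taken in this thesis, is Wilks' distribution-free formula for one-sided tolerance limits; a minimal sketch:

    ```python
    from math import comb

    def wilks_sample_size(coverage=0.95, confidence=0.95, order=1):
        """Smallest number of code runs n such that the `order`-th largest result
        bounds the `coverage` quantile of the output with probability at least
        `confidence` (one-sided, distribution-free tolerance limit)."""
        n = order
        while True:
            # Probability that fewer than `order` of the n results exceed the
            # `coverage` quantile (i.e. the tolerance limit fails to cover it).
            miss = sum(comb(n, k) * (1 - coverage) ** k * coverage ** (n - k)
                       for k in range(order))
            if 1.0 - miss >= confidence:
                return n
            n += 1

    print(wilks_sample_size())          # 59 runs for a 95/95 first-order limit
    print(wilks_sample_size(order=2))   # 93 runs if the second-largest value is used
    ```

    With 59 runs the largest computed value (for example, containment pressure) bounds the 95th percentile with 95% confidence, which is why 59 and 93 appear so often in best-estimate-plus-uncertainty analyses.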

  20. Integration of QFD, AHP, and LPP methods in supplier development problems under uncertainty

    Science.gov (United States)

    Shad, Zahra; Roghanian, Emad; Mojibian, Fatemeh

    2014-04-01

    Quality function deployment (QFD) is a customer-driven approach widely used in developing new products or processes to maximize customer satisfaction. Recent research has used the linear physical programming (LPP) procedure to optimize QFD; however, QFD problems involve uncertainty, or fuzziness, which must be taken into account for a more realistic study. In this paper, a set of fuzzy data is used to address linguistic values parameterized by triangular fuzzy numbers. An integrated approach combining the analytic hierarchy process (AHP), QFD, and LPP is proposed to maximize overall customer satisfaction under uncertain conditions and is applied to the supplier development problem. The fuzzy AHP approach is adopted as a powerful method to obtain the relationship between the customer requirements and engineering characteristics (ECs) and to construct the house of quality in the QFD method. LPP is used to obtain the optimal achievement level of the ECs and subsequently the customer satisfaction level under different degrees of uncertainty. The effectiveness of the proposed method is illustrated by an example.
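
    As a small illustration of the fuzzy-data side of such an approach, the sketch below implements triangular fuzzy numbers with weighted aggregation and centroid defuzzification; the ratings, weights, and class design are invented and do not reproduce the paper's AHP/QFD/LPP formulation.

    ```python
    from dataclasses import dataclass

    @dataclass
    class TFN:
        """Triangular fuzzy number (l <= m <= u)."""
        l: float
        m: float
        u: float

        def __add__(self, other):
            return TFN(self.l + other.l, self.m + other.m, self.u + other.u)

        def scale(self, w):
            # Multiplication by a non-negative crisp weight.
            return TFN(self.l * w, self.m * w, self.u * w)

        def defuzzify(self):
            # Centroid of a triangular membership function.
            return (self.l + self.m + self.u) / 3.0

    # Illustrative linguistic ratings of one engineering characteristic by two experts.
    weak, strong = TFN(1, 3, 5), TFN(5, 7, 9)
    aggregate = weak.scale(0.4) + strong.scale(0.6)   # weighted aggregation
    print(aggregate, "-> crisp score", round(aggregate.defuzzify(), 2))
    ```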

  1. Uncertainties in environmental radiological assessment models and their implications

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible
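
    A minimal sketch of the recommended stochastic procedure: sample the uncertain parameters, propagate them through the assessment model to obtain a distribution of predicted values, and rank parameters by their contribution to the spread. The toy multiplicative dose model and all distribution parameters are invented for illustration.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    n = 10_000

    # Invented lognormal parameter distributions for a toy ingestion-dose model:
    # dose = deposition * transfer_factor * intake * dose_coefficient
    deposition = rng.lognormal(np.log(1.0), 0.3, n)
    transfer = rng.lognormal(np.log(0.02), 0.8, n)
    intake = rng.lognormal(np.log(200.0), 0.2, n)
    dose_coeff = rng.lognormal(np.log(1.3e-8), 0.4, n)

    dose = deposition * transfer * intake * dose_coeff

    # A distribution of predicted values rather than a single point estimate.
    print("median dose:", np.median(dose))
    print("95th percentile:", np.percentile(dose, 95))

    # Rank parameters by their contribution to the spread (rank correlation).
    for name, x in [("deposition", deposition), ("transfer", transfer),
                    ("intake", intake), ("dose_coeff", dose_coeff)]:
        print(name, round(spearmanr(x, dose).correlation, 2))
    ```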

  2. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    A growing body of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  3. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Science.gov (United States)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most previous studies have considered climate models and scenarios as the major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contributions from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to the overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Most impact assessment studies are carried out with hydrologic model parameters held unchanged in the future; it is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression-based methodology is presented to obtain the hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and GCMs, along with their interactions with emission scenarios, act as dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine the stationarity assumption of models before considering them
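
    A compact sketch of the ANOVA idea of segregating uncertainty contributions, using an invented 2 x 2 x 2 design of streamflow projections (two GCMs, two emission scenarios, two land-use scenarios); the paper's actual implementation with more factors, model nonstationarity, and internal variability is more involved.

    ```python
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)

    # Invented streamflow projections (mm) for a 2-GCM x 2-scenario x 2-land-use design.
    gcm_eff = {"GCM-A": 0.0, "GCM-B": 40.0}
    scen_eff = {"RCP4.5": 0.0, "RCP8.5": -25.0}
    lu_eff = {"LU-2030": 0.0, "LU-2050": -10.0}

    data = {}
    for g, s, l in itertools.product(gcm_eff, scen_eff, lu_eff):
        data[(g, s, l)] = 500.0 + gcm_eff[g] + scen_eff[s] + lu_eff[l] + rng.normal(0.0, 5.0)

    values = np.array(list(data.values()))
    grand = values.mean()
    total_ss = np.sum((values - grand) ** 2)

    def main_effect_ss(levels, index):
        """ANOVA main-effect sum of squares for the factor at position `index`."""
        ss = 0.0
        for level in levels:
            cell = [v for k, v in data.items() if k[index] == level]
            ss += len(cell) * (np.mean(cell) - grand) ** 2
        return ss

    for name, levels, idx in [("GCM", gcm_eff, 0), ("scenario", scen_eff, 1), ("land use", lu_eff, 2)]:
        share = main_effect_ss(list(levels), idx) / total_ss
        print(f"{name}: {100 * share:.1f}% of the total variance")
    ```

    The remainder of the total variance corresponds to factor interactions and internal variability.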

  4. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    Science.gov (United States)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  5. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of uncertainty is minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible as compared to the impact from sampling the epistemic uncertainties. Obviously, this process may cause high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples with a much smaller number of Monte Carlo histories each. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a large reduction of computing time by factors on the order of 100. (authors)
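
    One plausible reading of the two-series idea, sketched here with a toy response rather than XSUSA/KENO results: if every epistemic sample is run twice with independent Monte Carlo seeds, the aleatoric noise is independent between the two series, so the covariance of the paired results isolates the epistemic variance even when each run uses few histories.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_samples = 200           # epistemic samples (e.g. sampled nuclear data sets)
    aleatoric_sd = 0.004      # statistical noise from a short Monte Carlo run
    epistemic_sd = 0.006      # spread caused by the uncertain input data

    theta = rng.normal(1.0, epistemic_sd, n_samples)            # true response per sample
    series_a = theta + rng.normal(0, aleatoric_sd, n_samples)   # run 1, few histories
    series_b = theta + rng.normal(0, aleatoric_sd, n_samples)   # run 2, independent seed

    total_var = np.var(series_a, ddof=1)
    epistemic_var = np.cov(series_a, series_b)[0, 1]   # aleatoric noise cancels in the covariance
    aleatoric_var = total_var - epistemic_var

    print("estimated epistemic sd:", np.sqrt(max(epistemic_var, 0.0)))
    print("estimated aleatoric sd:", np.sqrt(max(aleatoric_var, 0.0)))
    ```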

  6. Uncertainty, probability and information-gaps

    International Nuclear Information System (INIS)

    Ben-Haim, Yakov

    2004-01-01

    This paper discusses two main ideas. First, we focus on info-gap uncertainty, as distinct from probability. Info-gap theory is especially suited for modelling and managing uncertainty in system models: we invest all our knowledge in formulating the best possible model; this leaves the modeller with very faulty and fragmentary information about the variation of reality around that optimal model. Second, we examine the interdependence between uncertainty modelling and decision-making. Good uncertainty modelling requires contact with the end-use, namely, with the decision-making application of the uncertainty model. The most important avenue of uncertainty-propagation is from initial data- and model-uncertainties into uncertainty in the decision-domain. Two questions arise. Is the decision robust to the initial uncertainties? Is the decision prone to opportune windfall success? We apply info-gap robustness and opportunity functions to the analysis of representation and propagation of uncertainty in several of the Sandia Challenge Problems
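
    A toy sketch of an info-gap robustness function (the decision problem, interval uncertainty model, and numbers are invented): robustness is the largest horizon of uncertainty at which the worst-case performance, over all realizations within that horizon of the nominal estimate, still meets the requirement.

    ```python
    import numpy as np

    def robustness(decision, nominal_demand, requirement, horizons):
        """Info-gap robustness: the largest uncertainty horizon alpha such that the
        worst-case payoff, over all demand realizations within +/- alpha of the
        nominal estimate, still satisfies the performance requirement."""
        best = 0.0
        for alpha in horizons:
            worst_demand = nominal_demand - alpha                          # worst case for a seller
            payoff = 10.0 * min(decision, worst_demand) - 4.0 * decision   # toy revenue - cost
            if payoff >= requirement:
                best = alpha
            else:
                break
        return best

    horizons = np.linspace(0.0, 60.0, 601)
    for decision in (60.0, 80.0, 100.0):
        print(f"produce {decision:.0f} units -> robustness {robustness(decision, 100.0, 300.0, horizons):.1f}")
    ```

    Running it shows the classic info-gap trade-off: the more ambitious decision yields lower robustness to demand uncertainty.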

  7. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an indirect proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.

  8. Proceedings of a workshop on dealing with uncertainties in the hydroelectric energy business. CD-ROM ed.

    International Nuclear Information System (INIS)

    2004-01-01

    This workshop was attended by experts in Canadian and international hydroelectric utilities to exchange information on current practices and opportunities for improvement or future cooperation. The discussions focused on reducing the uncertainties associated with hydroelectric power production. Although significant improvements have been made in the efficiency, reliability and safety of hydroelectric power production, the sector is still challenged by the uncertainty of water supply which depends greatly on weather conditions. Energy markets pose another challenge to power producers in terms of energy supply, energy demand and energy prices. The workshop focused on 3 themes: (1) weather and hydrologic uncertainty, (2) market uncertainty, and (3) decision making models using uncertainty principles surrounding water resource planning and operation. The workshop featured 22 presentations of which 11 have been indexed separately for inclusion in this database. refs., tabs., figs

  9. A generalized Kruskal-Wallis test incorporating group uncertainty with application to genetic association studies.

    Science.gov (United States)

    Acar, Elif F; Sun, Lei

    2013-06-01

    Motivated by genetic association studies of SNPs with genotype uncertainty, we propose a generalization of the Kruskal-Wallis test that incorporates group uncertainty when comparing k samples. The extended test statistic is based on probability-weighted rank-sums and follows an asymptotic chi-square distribution with k - 1 degrees of freedom under the null hypothesis. Simulation studies confirm the validity and robustness of the proposed test in finite samples. Application to a genome-wide association study of type 1 diabetic complications further demonstrates the utilities of this generalized Kruskal-Wallis test for studies with group uncertainty. The method has been implemented as an open-resource R program, GKW. © 2013, The International Biometric Society.
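
    The sketch below conveys the flavor of probability-weighted rank sums: expected group sizes and mean ranks are computed from soft group-membership probabilities and plugged into a Kruskal-Wallis-type statistic. With hard 0/1 probabilities it reduces to the classical test; with soft probabilities it is only an illustration, not the calibrated GKW statistic from the paper or its R implementation.

    ```python
    import numpy as np
    from scipy.stats import rankdata, chi2, kruskal

    def weighted_kw_plugin(y, probs):
        """Plug-in Kruskal-Wallis-type statistic using probability-weighted rank sums.

        y     : (N,) outcomes
        probs : (N, k) group-membership probabilities (rows sum to 1)
        """
        y = np.asarray(y, float)
        probs = np.asarray(probs, float)
        N, k = probs.shape
        r = rankdata(y)
        n_g = probs.sum(axis=0)                           # expected group sizes
        rbar_g = (probs * r[:, None]).sum(axis=0) / n_g   # weighted mean ranks
        H = 12.0 / (N * (N + 1)) * np.sum(n_g * (rbar_g - (N + 1) / 2.0) ** 2)
        return H, chi2.sf(H, df=k - 1)

    rng = np.random.default_rng(0)
    y = rng.normal(size=30) + np.repeat([0.0, 0.5, 1.0], 10)
    labels = np.repeat([0, 1, 2], 10)
    hard = np.zeros((30, 3)); hard[np.arange(30), labels] = 1.0

    print(weighted_kw_plugin(y, hard))            # matches the classical statistic
    print(kruskal(y[:10], y[10:20], y[20:]))      # scipy reference (no ties here)

    soft = 0.9 * hard + 0.1 / 3                   # mild group (genotype) uncertainty
    print(weighted_kw_plugin(y, soft))
    ```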

  10. Hydrocoin level 3 - Testing methods for sensitivity/uncertainty analysis

    International Nuclear Information System (INIS)

    Grundfelt, B.; Lindbom, B.; Larsson, A.; Andersson, K.

    1991-01-01

    The HYDROCOIN study is an international cooperative project for testing groundwater hydrology modelling strategies for performance assessment of nuclear waste disposal. The study was initiated in 1984 by the Swedish Nuclear Power Inspectorate and the technical work was finalised in 1987. The participating organisations are regulatory authorities as well as implementing organisations in 10 countries. The study has been performed at three levels aimed at studying computer code verification, model validation and sensitivity/uncertainty analysis respectively. The results from the first two levels, code verification and model validation, have been published in reports in 1988 and 1990 respectively. This paper focuses on some aspects of the results from Level 3, sensitivity/uncertainty analysis, for which a final report is planned to be published during 1990. For Level 3, seven test cases were defined. Some of these aimed at exploring the uncertainty associated with the modelling results by simply varying parameter values and conceptual assumptions. In other test cases statistical sampling methods were applied. One of the test cases dealt with particle tracking and the uncertainty introduced by this type of post processing. The amount of results available is substantial although unevenly spread over the test cases. It has not been possible to cover all aspects of the results in this paper. Instead, the different methods applied will be illustrated by some typical analyses. 4 figs., 9 refs

  11. Linear minimax estimation for random vectors with parametric uncertainty

    KAUST Repository

    Bitar, E

    2010-06-01

    In this paper, we take a minimax approach to the problem of computing a worst-case linear mean squared error (MSE) estimate of X given Y , where X and Y are jointly distributed random vectors with parametric uncertainty in their distribution. We consider two uncertainty models, PA and PB. Model PA represents X and Y as jointly Gaussian whose covariance matrix Λ belongs to the convex hull of a set of m known covariance matrices. Model PB characterizes X and Y as jointly distributed according to a Gaussian mixture model with m known zero-mean components, but unknown component weights. We show: (a) the linear minimax estimator computed under model PA is identical to that computed under model PB when the vertices of the uncertain covariance set in PA are the same as the component covariances in model PB, and (b) the problem of computing the linear minimax estimator under either model reduces to a semidefinite program (SDP). We also consider the dynamic situation where x(t) and y(t) evolve according to a discrete-time LTI state space model driven by white noise, the statistics of which is modeled by PA and PB as before. We derive a recursive linear minimax filter for x(t) given y(t).
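
    For a fixed linear estimator the MSE is affine in the joint covariance, so under model PA the worst case over the convex hull is attained at a vertex. The sketch below uses this fact to evaluate and minimize the worst-case MSE for a scalar toy problem by direct search rather than by solving the SDP derived in the paper; the two vertex covariances are invented.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def mse(K, cov, n=1):
        """MSE of the linear estimate x_hat = K y for zero-mean (x, y) with the
        given joint covariance, x occupying the first n coordinates."""
        Sxx, Sxy = cov[:n, :n], cov[:n, n:]
        Syx, Syy = cov[n:, :n], cov[n:, n:]
        return np.trace(Sxx - K @ Syx - Sxy @ K.T + K @ Syy @ K.T)

    # Vertex covariances of the uncertainty set (invented; note the ambiguous
    # sign of the cross-covariance between x and y).
    vertices = [np.array([[1.0, 0.9], [0.9, 1.0]]),
                np.array([[1.0, -0.5], [-0.5, 1.0]])]

    def worst_case(k):
        # MSE is affine in the covariance for fixed K, so the maximum over the
        # convex hull of the vertices is attained at one of the vertices.
        K = np.atleast_2d(k)
        return max(mse(K, cov) for cov in vertices)

    res = minimize(worst_case, x0=[0.5], method="Nelder-Mead")
    print("minimax gain:", round(res.x[0], 3), "worst-case MSE:", round(res.fun, 3))
    ```

    In this example the sign of the cross-covariance is ambiguous across the vertices, so the minimax gain is essentially zero: the robust estimator ignores the measurement.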

  12. GUM approach to uncertainty estimations for online 220Rn concentration measurements using Lucas scintillation cell

    International Nuclear Information System (INIS)

    Sathyabama, N.

    2014-01-01

    It is now widely recognized that, when all of the known or suspected components of errors have been evaluated and corrected, there still remains an uncertainty, that is, a doubt about how well the result of the measurement represents the value of the quantity being measured. Evaluation of measurement data - Guide to the expression of Uncertainty in Measurement (GUM) is a guidance document, the purpose of which is to promote full information on how uncertainty statements are arrived at and to provide a basis for the international comparison of measurement results. In this paper, uncertainty estimations following GUM guidelines have been made for the measured values of online thoron concentrations using Lucas scintillation cell to prove that the correction for disequilibrium between 220 Rn and 216 Po is significant in online 220 Rn measurements
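
    A generic GUM-style illustration for a multiplicative counting measurement, with all values invented: relative standard uncertainties of uncorrelated inputs combine in quadrature at first order, and the expanded uncertainty is reported with a coverage factor k = 2. The efficiency, volume and disequilibrium-correction values here are placeholders, not the quantities from the paper.

    ```python
    import numpy as np

    # Measurement model (illustrative): C = N / (eff * V * t * F)
    # N : net counts, eff : detection efficiency, V : cell volume (L),
    # t : counting time (s), F : disequilibrium correction factor.
    N, u_N = 1500.0, np.sqrt(1500.0)      # Poisson counting uncertainty
    eff, u_eff = 0.74, 0.02
    V, u_V = 0.15, 0.002
    t, u_t = 60.0, 0.1
    F, u_F = 0.85, 0.03                   # placeholder correction factor

    C = N / (eff * V * t * F)

    # GUM law of propagation for a purely multiplicative model: relative
    # standard uncertainties add in quadrature (first order, uncorrelated inputs).
    rel = np.sqrt((u_N / N) ** 2 + (u_eff / eff) ** 2 + (u_V / V) ** 2
                  + (u_t / t) ** 2 + (u_F / F) ** 2)
    u_C = C * rel
    print(f"C = {C:.1f} +/- {u_C:.1f} (k=1); expanded (k=2): +/- {2 * u_C:.1f}")
    ```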

  13. Integrated uncertainty analysis using RELAP/SCDAPSIM/MOD4.0

    International Nuclear Information System (INIS)

    Perez, M.; Reventos, F.; Wagner, R.; Allison, C.

    2009-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis package being developed jointly by the Technical University of Catalunya (UPC) and Innovative Systems Software (ISS). The integrated uncertainty analysis approach used in the package uses the following steps: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. The first four steps are performed by the user prior to the RELAP/SCDAPSIM/MOD4.0 analysis. The remaining steps are included with the MOD4.0 integrated uncertainty analysis (IUA) package. This paper briefly describes the integrated uncertainty analysis package including (a) the features of the package, (b) the implementation of the package into RELAP/SCDAPSIM/MOD4.0, and

  14. An approach of sensitivity and uncertainty analyses methods installation in a safety calculation

    International Nuclear Information System (INIS)

    Pepin, G.; Sallaberry, C.

    2003-01-01

    Simulation of migration in deep geological formations leads to solving convection-diffusion equations in porous media, associated with the computation of hydrogeologic flow. Different time scales (simulation over 1 million years), spatial scales, and contrasts of properties in the calculation domain are taken into account. This document deals more particularly with uncertainties in the input data of the model. These uncertainties are taken into account in the overall analysis through uncertainty and sensitivity analysis. ANDRA (French national agency for the management of radioactive wastes) carries out studies on the treatment of input data uncertainties and their propagation in the safety models, in order to quantify the influence of the models' input data uncertainties on the various safety indicators selected. The approach taken by ANDRA initially consists of 2 studies undertaken in parallel: - the first is an international review of the choices made by ANDRA's foreign counterparts to carry out their uncertainty and sensitivity analyses, - the second is a review of the various methods that can be used for sensitivity and uncertainty analysis in the context of ANDRA's safety calculations. These studies are then supplemented by a comparison of the principal methods on a test case which gathers all the specific constraints (physical, numerical and data-processing) of the problem studied by ANDRA

  15. An Integrated Approach for Characterization of Uncertainty in Complex Best Estimate Safety Assessment

    International Nuclear Information System (INIS)

    Pourgol-Mohamad, Mohammad; Modarres, Mohammad; Mosleh, Ali

    2013-01-01

    This paper discusses an approach called Integrated Methodology for Thermal-Hydraulics Uncertainty Analysis (IMTHUA) to characterize and integrate a wide range of uncertainties associated with the best estimate models and complex system codes used for nuclear power plant safety analyses. Examples of applications include complex thermal hydraulic and fire analysis codes. In identifying and assessing uncertainties, the proposed methodology treats the complex code as a 'white box', thus explicitly treating internal sub-model uncertainties in addition to the uncertainties related to the inputs to the code. The methodology accounts for uncertainties related to experimental data used to develop such sub-models, and efficiently propagates all uncertainties during best estimate calculations. Uncertainties are formally analyzed and probabilistically treated using a Bayesian inference framework. This comprehensive approach presents the results in a form usable in most other safety analyses such as the probabilistic safety assessment. The code output results are further updated through additional Bayesian inference using any available experimental data, for example from thermal hydraulic integral test facilities. The approach includes provisions to account for uncertainties associated with user-specified options, for example for choices among alternative sub-models, or among several different correlations. Complex time-dependent best-estimate calculations are computationally intense. The paper presents approaches to minimize computational intensity during the uncertainty propagation. Finally, the paper will report effectiveness and practicality of the methodology with two applications to a complex thermal-hydraulics system code as well as a complex fire simulation code. In case of multiple alternative models, several techniques, including dynamic model switching, user-controlled model selection, and model mixing, are discussed. (authors)

  16. A Variation on Uncertainty Principle and Logarithmic Uncertainty Principle for Continuous Quaternion Wavelet Transforms

    Directory of Open Access Journals (Sweden)

    Mawardi Bahri

    2017-01-01

    The continuous quaternion wavelet transform (CQWT) is a generalization of the classical continuous wavelet transform within the context of quaternion algebra. First of all, we show that the directional quaternion Fourier transform (QFT) uncertainty principle can be obtained using the component-wise QFT uncertainty principle. Based on this method, the directional QFT uncertainty principle using the representation in polar coordinate form is easily derived. We derive a variation on the uncertainty principle related to the QFT. We state that the CQWT of a quaternion function can be written in terms of the QFT and obtain a variation on the uncertainty principle related to the CQWT. Finally, we apply the extended uncertainty principles and properties of the CQWT to establish logarithmic uncertainty principles related to the generalized transform.

  17. On the relationship between aerosol model uncertainty and radiative forcing uncertainty.

    Science.gov (United States)

    Lee, Lindsay A; Reddington, Carly L; Carslaw, Kenneth S

    2016-05-24

    The largest uncertainty in the historical radiative forcing of climate is caused by the interaction of aerosols with clouds. Historical forcing is not a directly measurable quantity, so reliable assessments depend on the development of global models of aerosols and clouds that are well constrained by observations. However, there has been no systematic assessment of how reduction in the uncertainty of global aerosol models will feed through to the uncertainty in the predicted forcing. We use a global model perturbed parameter ensemble to show that tight observational constraint of aerosol concentrations in the model has a relatively small effect on the aerosol-related uncertainty in the calculated forcing between preindustrial and present-day periods. One factor is the low sensitivity of present-day aerosol to natural emissions that determine the preindustrial aerosol state. However, the major cause of the weak constraint is that the full uncertainty space of the model generates a large number of model variants that are equally acceptable compared to present-day aerosol observations. The narrow range of aerosol concentrations in the observationally constrained model gives the impression of low aerosol model uncertainty. However, these multiple "equifinal" models predict a wide range of forcings. To make progress, we need to develop a much deeper understanding of model uncertainty and ways to use observations to constrain it. Equifinality in the aerosol model means that tuning of a small number of model processes to achieve model-observation agreement could give a misleading impression of model robustness.

  18. Relevance of the law of international organisations in resolving ...

    African Journals Online (AJOL)

    structures to resolve disputes between states. Uncertainty remains, however, on the availability of effective structures within the system to resolve disputes between international organisations. It is important to note that international organisations were, prior to 1945, not considered subjects of international law so as to be ...

  19. Proceedings of 14th international workshop on Asian network for accelerator-driven system and nuclear transmutation technology (ADS-NTT 2016)

    International Nuclear Information System (INIS)

    Pyeon, Cheol Ho

    2016-09-01

    The proceedings describe the current status of research and development (R and D) on accelerator-driven systems (ADS) and nuclear transmutation techniques (NTT), including nuclear data, accelerator techniques, Pb-Bi targets, fuel technologies and reactor physics, in the East Asian countries China, Korea and Japan. The proceedings also include all presentation materials presented at 'the 14th International Workshop on Asian Network for ADS and NTT (ADS-NTT2016)' held at Mito, Japan on 5th September, 2016. The objective of this workshop is to make concrete progress in ADS R and D, especially in East Asian countries as well as in European countries, by sharing mutual interests and exchanging information with each other. The report is composed of the following items: Presentation materials: ADS-NTT 2016. (author)

  20. Development and application of methods to characterize code uncertainty

    International Nuclear Information System (INIS)

    Wilson, G.E.; Burtt, J.D.; Case, G.S.; Einerson, J.J.; Hanson, R.G.

    1985-01-01

    The United States Nuclear Regulatory Commission sponsors both international and domestic studies to assess its safety analysis codes. The Commission staff intends to use the results of these studies to quantify the uncertainty of the codes with a statistically based analysis method. Development of the methodology is underway. The Idaho National Engineering Laboratory contributions to the early development effort, and testing of two candidate methods are the subjects of this paper

  1. Decision-making under great uncertainty

    International Nuclear Information System (INIS)

    Hansson, S.O.

    1992-01-01

    Five types of decision-uncertainty are distinguished: uncertainty of consequences, of values, of demarcation, of reliance, and of co-ordination. Strategies are proposed for each type of uncertainty. The general conclusion is that it is meaningful for decision theory to treat cases with greater uncertainty than the textbook case of 'decision-making under uncertainty'. (au)

  2. Risk in International Business

    OpenAIRE

    Canavan, Deirdre; Sharkey Scott, Pamela

    2012-01-01

    Risk in international business can prompt risk-averse behaviour to counteract foreign market uncertainty, or individual entrepreneurial risk-taking behaviour, depending on the characteristics of both the business sector and the individual. International business theory would suggest that the perception of risk may differ in situations including where new market entry is incremental, is taken in larger or earlier stages, or indeed whether it may be experienced in a continually fluctuating manne...

  3. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  4. Who am I? The relationship between self-concept uncertainty and materialism.

    Science.gov (United States)

    Noguti, Valeria; Bokeyar, Alexandra L

    2014-10-01

    It is well accepted that materialism may result in a number of negative consequences, hence the importance of improving its understanding. In this paper, we propose that materialism negatively relates to self-concept uncertainty. Uncertainty about oneself is aversive and those feeling uncertain may use the possession of material objects as a way to reduce the uncertainty. Inasmuch as material objects can serve as concrete signs of self-worth, self-concept uncertainty can therefore relate to more materialism. Over two studies, one in Australia and the other in the US, with a total of 390 participants, our research demonstrates that lower clarity about one's self-concept associates with higher levels of materialism. While this result holds for both genders, this relationship is considerably stronger for women compared to men. We also find that lower self-concept clarity relates to higher compulsive buying. We further demonstrate that materialism relates to higher positive moods during shopping, and also relates to higher negative moods after shopping, more notably negative moods towards what was purchased. This effect is significant even when controlling for general affective states. © 2014 International Union of Psychological Science.

  5. INTERNAL AUDIT AND RISK MANAGEMENT

    OpenAIRE

    Elena RUSE; Georgiana SUSMANSCHI (BADEA); Daniel DĂNECI-PĂTRĂU

    2014-01-01

    The existence of risk in economic activity cannot be denied. In fact, risk is a concept present in every activity, the term risk being identified with uncertainty, that is, the chance that an undesirable event may occur. Internal audit and risk management aim at the same goal, namely the control of risks. Internal audit performs several roles in the risk management plan. The objectives of the internal audit function vary from company to company, but in all economic entities int...

  6. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. The mean core damage frequency is 4.5E-6 with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station blackout type accidents (loss of all ac power) contributed about 46% of the core damage frequency with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs

  7. Application of a Novel Dose-Uncertainty Model for Dose-Uncertainty Analysis in Prostate Intensity-Modulated Radiotherapy

    International Nuclear Information System (INIS)

    Jin Hosang; Palta, Jatinder R.; Kim, You-Hyun; Kim, Siyong

    2010-01-01

    Purpose: To analyze dose uncertainty using a previously published dose-uncertainty model, and to assess potential dosimetric risks existing in prostate intensity-modulated radiotherapy (IMRT). Methods and Materials: The dose-uncertainty model provides a three-dimensional (3D) dose-uncertainty distribution at a given confidence level. For 8 retrospectively selected patients, dose-uncertainty maps were constructed using the dose-uncertainty model at the 95% CL. In addition to uncertainties inherent to the radiation treatment planning system, four scenarios of spatial errors were considered: machine only (S1), S1 + intrafraction, S1 + interfraction, and S1 + both intrafraction and interfraction errors. To evaluate the potential risks of the IMRT plans, three dose-uncertainty-based plan evaluation tools were introduced: confidence-weighted dose-volume histogram, confidence-weighted dose distribution, and dose-uncertainty-volume histogram. Results: Dose uncertainty caused by interfraction setup error was more significant than that caused by intrafraction motion error. The maximum dose uncertainty (95% confidence) of the clinical target volume (CTV) was smaller than 5% of the prescribed dose in all but two cases (13.9% and 10.2%). The dose uncertainty for 95% of the CTV volume ranged from 1.3% to 2.9% of the prescribed dose. Conclusions: The dose uncertainty in prostate IMRT could be evaluated using the dose-uncertainty model. Prostate IMRT plans satisfying the same plan objectives could generate significantly different dose uncertainties because of a complex interplay of many uncertainty sources. The uncertainty-based plan evaluation contributes to generating reliable and error-resistant treatment plans.

  8. Margins for uncertainties in Hydro-Quebec's short-term operations planning

    International Nuclear Information System (INIS)

    Beaumont, M.; Raymond, M.P.

    1995-01-01

    A method developed by Hydro-Quebec for establishing the short-term capacity margin requirements for dealing with uncertainties from 1 to 24 hours in advance, was presented. Hydro-Quebec's generating system and characterization of the problems associated with meeting load requirements were discussed. Factors accounted for included those concerning internal load forecast, unit forced outages, risks of not meeting firm load, risks of not meeting real-time reserves requirements, costs, time delays, and operating constraints of non-hydraulic resources. Each of these were described in detail, and methods for combining mathematical uncertainties were presented. Procedures used for selecting an appropriate risk level and building a margin policy were described. Improvements for more accurate modelling were discussed. 5 refs., 2 tabs., 5 figs

  9. On the ultimate uncertainty of the top quark pole mass

    Science.gov (United States)

    Beneke, M.; Marquard, P.; Nason, P.; Steinhauser, M.

    2017-12-01

    We combine the known asymptotic behaviour of the QCD perturbation series expansion, which relates the pole mass of a heavy quark to the MS-bar mass, with the exact series coefficients up to the four-loop order to determine the ultimate uncertainty of the top-quark pole mass due to the renormalon divergence. We perform extensive tests of our procedure by varying the number of colours and flavours, as well as the scale of the strong coupling and the MS-bar mass. Including an estimate of the internal bottom and charm quark mass effect, we conclude that this uncertainty is around 110 MeV. We further estimate the additional contribution to the mass relation from the five-loop correction and beyond to be around 300 MeV.

  10. Impact of geometric uncertainties on dose calculations for intensity modulated radiation therapy of prostate cancer

    Science.gov (United States)

    Jiang, Runqing

    Intensity-modulated radiation therapy (IMRT) uses non-uniform beam intensities within a radiation field to provide patient-specific dose shaping, resulting in a dose distribution that conforms tightly to the planning target volume (PTV). Unavoidable geometric uncertainty arising from patient repositioning and internal organ motion can lead to a lower conformality index (CI) during treatment delivery, a decrease in tumor control probability (TCP) and an increase in normal tissue complication probability (NTCP). The CI of the IMRT plan depends heavily on steep dose gradients between the PTV and organ at risk (OAR). Geometric uncertainties reduce the planned dose gradients and result in a less steep or "blurred" dose gradient. The blurred dose gradients can be maximized by constraining the dose objective function in the static IMRT plan or by reducing geometric uncertainty during treatment with corrective verification imaging. Internal organ motion and setup error were evaluated simultaneously for 118 individual patients with implanted fiducials and MV electronic portal imaging (EPI). A Gaussian probability density function (PDF) is reasonable for modeling geometric uncertainties, as indicated by the group of 118 patients. The Gaussian PDF is patient specific, and group standard deviation (SD) should not be used for accurate treatment planning for individual patients. In addition, individual SD should not be determined or predicted from small imaging samples because of the random nature of the fluctuations. Frequent verification imaging should be employed in situations where geometric uncertainties are expected. Cumulative PDF data can be used for re-planning to assess the accuracy of delivered dose. Group data are useful for determining the worst case discrepancy between planned and delivered dose. The margins for the PTV should ideally represent true geometric uncertainties. The measured geometric uncertainties were used in this thesis to assess PTV coverage, dose to OAR, equivalent
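
    The "blurring" of a planned dose gradient by geometric uncertainty can be pictured as a convolution of the static dose profile with the patient-specific Gaussian PDF of displacements. The one-dimensional profile and sigma below are invented and ignore clinical detail; the point is only that the delivered gradient is shallower than the planned one.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    # Planned 1-D dose profile (% of prescription): 100% inside a 60 mm target.
    x = np.arange(-60.0, 60.0, 1.0)                      # position (mm), 1 mm grid
    planned = np.where(np.abs(x) <= 30.0, 100.0, 0.0)

    # Patient-specific geometric uncertainty (setup + internal motion), 1 sigma in mm.
    # The grid spacing is 1 mm, so sigma in samples equals sigma in mm.
    sigma_mm = 3.5
    delivered = gaussian_filter1d(planned, sigma=sigma_mm)   # dose blurred by the PDF

    print("max planned dose gradient  :",
          round(np.max(np.abs(np.gradient(planned, x))), 1), "%/mm")
    print("max delivered dose gradient:",
          round(np.max(np.abs(np.gradient(delivered, x))), 1), "%/mm")
    ```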

  11. Ruminations On NDA Measurement Uncertainty Compared TO DA Uncertainty

    International Nuclear Information System (INIS)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-01-01

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing and understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in non destructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  12. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing and understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in non destructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  13. Data-driven modeling, control and tools for cyber-physical energy systems

    Science.gov (United States)

    Behl, Madhur

    Energy systems are experiencing a gradual but substantial change in moving away from being non-interactive and manually-controlled systems to utilizing tight integration of both cyber (computation, communications, and control) and physical representations guided by first principles based models, at all scales and levels. Furthermore, peak power reduction programs like demand response (DR) are becoming increasingly important as the volatility on the grid continues to increase due to regulation, integration of renewables and extreme weather conditions. In order to shield themselves from the risk of price volatility, end-user electricity consumers must monitor electricity prices and be flexible in the ways they choose to use electricity. This requires the use of control-oriented predictive models of an energy system's dynamics and energy consumption. Such models are needed for understanding and improving the overall energy efficiency and operating costs. However, learning dynamical models using grey/white box approaches is very cost and time prohibitive since it often requires significant financial investments in retrofitting the system with several sensors and hiring domain experts for building the model. We present the use of data-driven methods for making model capture easy and efficient for cyber-physical energy systems. We develop Model-IQ, a methodology for analysis of uncertainty propagation for building inverse modeling and controls. Given a grey-box model structure and real input data from a temporary set of sensors, Model-IQ evaluates the effect of the uncertainty propagation from sensor data to model accuracy and to closed-loop control performance. We also developed a statistical method to quantify the bias in the sensor measurement and to determine near optimal sensor placement and density for accurate data collection for model training and control. Using a real building test-bed, we show how performing an uncertainty analysis can reveal trends about
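
    To illustrate how sensor uncertainty propagates into the accuracy of an inverse (grey-box) model, the sketch below fits a first-order zone-temperature model to synthetic data under increasing sensor noise and bias; the model, signals, and numbers are invented and this is not the Model-IQ tool itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic "true" zone dynamics: T[k+1] = a*T[k] + b*u[k]
    a_true, b_true = 0.90, 0.08
    u = rng.uniform(15.0, 25.0, 500)                 # e.g. supply-air temperature (degC)
    T = np.empty(501); T[0] = 20.0
    for k in range(500):
        T[k + 1] = a_true * T[k] + b_true * u[k]

    def fit(noise_sd, bias):
        """Least-squares fit of (a, b) from sensor data corrupted by noise and bias."""
        Tm = T + rng.normal(0.0, noise_sd, T.size) + bias
        A = np.column_stack([Tm[:-1], u])
        coef, *_ = np.linalg.lstsq(A, Tm[1:], rcond=None)
        return coef

    for noise_sd, bias in [(0.0, 0.0), (0.1, 0.0), (0.5, 0.0), (0.1, 0.5)]:
        a_hat, b_hat = fit(noise_sd, bias)
        print(f"noise={noise_sd}, bias={bias}: a={a_hat:.3f} (true {a_true}), "
              f"b={b_hat:.3f} (true {b_true})")
    ```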

  14. Uncertainty and sensitivity studies supporting the interpretation of the results of TVO I/II PRA

    International Nuclear Information System (INIS)

    Holmberg, J.

    1992-01-01

    A comprehensive Level 1 probabilistic risk assessment (PRA) has been performed for the TVO I/II nuclear power units. As a part of the PRA project, uncertainties of the risk models and methods were systematically studied in order to describe them and to demonstrate their impact on the results. The uncertainty study was divided into two phases: a qualitative and a quantitative study. The qualitative study contained identification of uncertainties and qualitative assessments of their importance. The PRA was introduced, and the assumptions and uncertainties identified behind the models were documented. The most significant uncertainties were selected by importance measures or other judgements for further quantitative study. The quantitative study included sensitivity studies and propagation of uncertainty ranges. In the sensitivity studies, uncertain assumptions or parameters were varied in order to illustrate the sensitivity of the models. The propagation of the uncertainty ranges demonstrated the impact of the statistical uncertainties of the parameter values. The Monte Carlo method was used as the propagation method. The most significant uncertainties were those involved in modelling human interactions, dependences and common cause failures (CCFs), loss of coolant accident (LOCA) frequencies, and pressure suppression. The qualitative mapping out of the uncertainty factors turned out to be useful in planning the quantitative studies. It also served as an internal review of the assumptions made in the PRA. The sensitivity studies were perhaps the most advantageous part of the quantitative study because they allowed individual analyses of the significance of identified uncertainty sources. The uncertainty study was found to be a reasonable way of systematically and critically assessing uncertainties in a risk analysis. The usefulness of this study depends on the decision maker (power company) since uncertainty studies are primarily carried out to support decision making when uncertainties are

  15. Comparison between bottom-up and top-down approaches in the estimation of measurement uncertainty.

    Science.gov (United States)

    Lee, Jun Hyung; Choi, Jee-Hye; Youn, Jae Saeng; Cha, Young Joo; Song, Woonheung; Park, Ae Ja

    2015-06-01

    Measurement uncertainty is a metrological concept to quantify the variability of measurement results. There are two approaches to estimate measurement uncertainty. In this study, we sought to provide practical and detailed examples of the two approaches and compare the bottom-up and top-down approaches to estimating measurement uncertainty. We estimated measurement uncertainty of the concentration of glucose according to CLSI EP29-A guideline. Two different approaches were used. First, we performed a bottom-up approach. We identified the sources of uncertainty and made an uncertainty budget and assessed the measurement functions. We determined the uncertainties of each element and combined them. Second, we performed a top-down approach using internal quality control (IQC) data for 6 months. Then, we estimated and corrected systematic bias using certified reference material of glucose (NIST SRM 965b). The expanded uncertainties at the low glucose concentration (5.57 mmol/L) by the bottom-up approach and top-down approaches were ±0.18 mmol/L and ±0.17 mmol/L, respectively (all k=2). Those at the high glucose concentration (12.77 mmol/L) by the bottom-up and top-down approaches were ±0.34 mmol/L and ±0.36 mmol/L, respectively (all k=2). We presented practical and detailed examples for estimating measurement uncertainty by the two approaches. The uncertainties by the bottom-up approach were quite similar to those by the top-down approach. Thus, we demonstrated that the two approaches were approximately equivalent and interchangeable and concluded that clinical laboratories could determine measurement uncertainty by the simpler top-down approach.
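
    A minimal sketch of the top-down calculation described above, assuming the laboratory has an IQC standard deviation and a bias estimated against a certified reference material; the numbers below are placeholders, not the study's data. Following the usual top-down recipe, the IQC imprecision and the uncertainty of the bias correction are combined in quadrature and expanded with k = 2.

      import math

      # Placeholder inputs (not the values from the study):
      s_iqc = 0.08        # long-term IQC standard deviation, mmol/L
      u_crm = 0.03        # standard uncertainty of the CRM value, mmol/L
      n_crm = 10          # replicate measurements of the CRM
      s_crm_runs = 0.07   # SD of those replicate measurements, mmol/L

      # Uncertainty of the bias correction: CRM uncertainty plus the standard
      # error of the mean of the replicate CRM measurements.
      u_bias = math.sqrt(u_crm**2 + (s_crm_runs / math.sqrt(n_crm))**2)

      # Combined and expanded uncertainty (k = 2) after correcting for bias.
      u_c = math.sqrt(s_iqc**2 + u_bias**2)
      U = 2 * u_c
      print(f"combined u = {u_c:.3f} mmol/L, expanded U (k=2) = ±{U:.3f} mmol/L")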

  16. Uncertainty analysis of a one-dimensional constitutive model for shape memory alloy thermomechanical description

    DEFF Research Database (Denmark)

    Oliveira, Sergio A.; Savi, Marcelo A.; Santos, Ilmar F.

    2014-01-01

    The use of shape memory alloys (SMAs) in engineering applications has increased interest in the accuracy of their thermomechanical description. This work presents an uncertainty analysis related to experimental tensile tests conducted with shape memory alloy wires. Experimental data are compared with numerical simulations obtained from a constitutive model with internal constraints employed to describe the thermomechanical behavior of SMAs. The idea is to evaluate whether the numerical simulations are within the uncertainty range of the experimental data. Parametric analysis is also developed...

  17. Embracing uncertainty in applied ecology.

    Science.gov (United States)

    Milner-Gulland, E J; Shea, K

    2017-12-01

    Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications. Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.

  18. Decision-Making under Criteria Uncertainty

    Science.gov (United States)

    Kureychik, V. M.; Safronenkova, I. B.

    2018-05-01

    Uncertainty is an essential part of a decision-making procedure. The paper deals with the problem of decision-making under criteria uncertainty. In this context, decision-making under uncertainty, and the types and conditions of uncertainty, were examined. The decision-making problem under uncertainty was formalized. A modification of a mathematical decision-support method under uncertainty via ontologies was proposed. A distinguishing feature of the developed method is its use of ontologies as base elements. The goal of this work is the development of a decision-making method under criteria uncertainty, using ontologies, in the area of multilayer board design. The method is oriented towards improving the technical and economic characteristics of the examined domain.

  19. Preliminary uncertainty analysis for the doses estimated using the Techa River dosimetry system - 2000

    International Nuclear Information System (INIS)

    Napier, Bruce A.; Shagina, N B.; Degteva, M O.; Tolstykh, E I.; Vorobiova, M I.; Anspaugh, L R.

    2000-01-01

    The Mayak Production Association (MPA) was the first facility in the former Soviet Union for the production of plutonium. As a result of failures in the technological processes in the late 1940s and early 1950s, members of the public were exposed via the discharge of about 10^17 Bq of liquid wastes into the Techa River (1949-1956). Residents of many villages downstream on the Techa River were exposed via a variety of pathways; the more significant included drinking of water from the river and external gamma exposure due to proximity to sediments and shoreline. The specific aim of this project is to enhance the reconstruction of external and internal radiation doses for individuals in the Extended Techa River Cohort. The purpose of this paper is to present the approaches being used to evaluate the uncertainty in the calculated individual doses and to provide example and representative results of the uncertainty analyses. The magnitude of the uncertainties varies depending on the location and time of individual exposure, but the results from reference-individual calculations indicate that for external doses the range of uncertainty is about a factor of four to five. For internal doses, the range of uncertainty depends on the village of residence, which is actually a surrogate for the source of drinking water. For villages with a single source of drinking water (river or well), the ratio of the 97.5th-percentile to 2.5th-percentile estimates can be a factor of 20 to 30. For villages with mixed sources of drinking water (river and well), this ratio can be over two orders of magnitude

  20. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  1. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  2. Transportable Optical Lattice Clock with 7×10^{-17} Uncertainty.

    Science.gov (United States)

    Koller, S B; Grotti, J; Vogt, St; Al-Masoudi, A; Dörscher, S; Häfner, S; Sterr, U; Lisdat, Ch

    2017-02-17

    We present a transportable optical clock (TOC) with ^{87}Sr. Its complete characterization against a stationary lattice clock resulted in a systematic uncertainty of 7.4×10^{-17}, which is currently limited by the statistics of the determination of the residual lattice light shift, and an instability of 1.3×10^{-15}/sqrt[τ] with an averaging time τ in seconds. Measurements confirm that the systematic uncertainty can be reduced to below the design goal of 1×10^{-17}. To our knowledge, these are the best uncertainties and instabilities reported for any transportable clock to date. For autonomous operation, the TOC has been installed in an air-conditioned car trailer. It is suitable for chronometric leveling with submeter resolution as well as for intercontinental cross-linking of optical clocks, which is essential for a redefinition of the International System of Units (SI) second. In addition, the TOC will be used for high precision experiments for fundamental science that are commonly tied to precise frequency measurements and its development is an important step to space-borne optical clocks.

  3. Transportable Optical Lattice Clock with 7×10^{-17} Uncertainty

    Science.gov (United States)

    Koller, S. B.; Grotti, J.; Vogt, St.; Al-Masoudi, A.; Dörscher, S.; Häfner, S.; Sterr, U.; Lisdat, Ch.

    2017-02-01

    We present a transportable optical clock (TOC) with ^{87}Sr. Its complete characterization against a stationary lattice clock resulted in a systematic uncertainty of 7.4×10^{-17}, which is currently limited by the statistics of the determination of the residual lattice light shift, and an instability of 1.3×10^{-15}/sqrt[τ] with an averaging time τ in seconds. Measurements confirm that the systematic uncertainty can be reduced to below the design goal of 1×10^{-17}. To our knowledge, these are the best uncertainties and instabilities reported for any transportable clock to date. For autonomous operation, the TOC has been installed in an air-conditioned car trailer. It is suitable for chronometric leveling with submeter resolution as well as for intercontinental cross-linking of optical clocks, which is essential for a redefinition of the International System of Units (SI) second. In addition, the TOC will be used for high precision experiments for fundamental science that are commonly tied to precise frequency measurements and its development is an important step to space-borne optical clocks.

  4. The Role of Type and Source of Uncertainty on the Processing of Climate Models Projections.

    Science.gov (United States)

    Benjamin, Daniel M; Budescu, David V

    2018-01-01

    Scientists agree that the climate is changing due to human activities, but there is less agreement about the specific consequences and their timeline. Disagreement among climate projections is attributable to the complexity of climate models that differ in their structure, parameters, initial conditions, etc. We examine how different sources of uncertainty affect people's interpretation of, and reaction to, information about climate change by presenting participants with forecasts from multiple experts. Participants viewed three types of sets of sea-level rise projections: (1) precise but conflicting; (2) imprecise but agreeing; and (3) hybrid sets that were both conflicting and imprecise. They estimated the most likely sea-level rise, provided a range of possible values, and rated the sets on several features - ambiguity, credibility, completeness, etc. In Study 1, everyone saw the same hybrid set. We found that participants were sensitive to uncertainty between sources, but not to uncertainty about which model was used. The impacts of conflict and imprecision were combined for estimation tasks and compromised for feature ratings. Estimates were closer to the experts' original projections, and sets were rated more favorably, under imprecision. Estimates were least consistent with (narrower than) the experts in the hybrid condition, but participants rated the conflicting set least favorably. In Study 2, we investigated the hybrid case in more detail by creating several distinct interval sets that combine conflict and imprecision. Two factors drive perceptual differences: overlap - the structure of the forecast set (whether intersecting, nested, tangent, or disjoint) - and asymmetry - the balance of the set. Estimates were primarily driven by asymmetry, and preferences were primarily driven by overlap. Asymmetric sets were least consistent with the experts: estimated ranges were narrower, and estimates of the most likely value were shifted further below the set mean

  5. Uncertainties of exposure-related quantities in mammographic x-ray unit quality control

    International Nuclear Information System (INIS)

    Gregory, Kent J.; Pattison, John E.; Bibbo, Giovanni

    2006-01-01

    Breast screening programs operate in many countries with mammographic x-ray units subject to stringent quality control tests. These tests include the evaluation of quantities based on exposure measurements, such as half value layer, automatic exposure control reproducibility, average glandular dose, and radiation output rate. There are numerous error sources that contribute to the uncertainty of these exposure-related quantities, some of which are unique to the low energy x-ray spectrum produced by mammographic x-ray units. For each of these exposure-related quantities, the applicable error sources and their magnitudes vary, depending on the test equipment used to make the measurement and whether or not relevant corrections have been applied. This study has identified and quantified a range of error sources that may be used to estimate the combined uncertainty of these exposure-related quantities, given the test equipment used and corrections applied. The uncertainty analysis uses methods described by the International Organization for Standardization's Guide to the Expression of Uncertainty in Measurement. Examples of how these error sources combine to give the uncertainty of the exposure-related quantities are presented. Using the best test equipment evaluated in this study, uncertainties of the four exposure-related quantities at the 95% confidence interval were found to be ±1.6% (half value layer), ±0.0008 (automatic exposure control reproducibility), ±2.3% (average glandular dose), and ±2.1% (radiation output rate). In some cases, using less precise test equipment or failing to apply corrections resulted in uncertainties more than double in magnitude

  6. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
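
    The abstract mentions the python uncertainties package; the short sketch below shows the general pattern of propagating correlated fit-parameter uncertainties through a derived quantity with that package. The tanh-like profile, parameter values, and covariance matrix are illustrative assumptions, not the OMFIT implementation.

      import numpy as np
      from uncertainties import correlated_values
      from uncertainties import umath

      # Hypothetical fit of a modified-tanh pedestal profile: parameters with a
      # covariance matrix, e.g. as returned by a least-squares fit.
      nominal = [1.0, 0.95, 0.02]                 # height, symmetry point, width
      cov = np.array([[4e-4, 1e-5, 0.0],
                      [1e-5, 1e-4, 0.0],
                      [0.0,  0.0,  1e-6]])
      height, x0, width = correlated_values(nominal, cov)

      # Profile value and gradient at one radius, with the covariant
      # uncertainties propagated automatically.
      x = 0.97
      profile = 0.5 * height * (1 - umath.tanh((x - x0) / width))
      gradient = -0.5 * height / width / umath.cosh((x - x0) / width) ** 2

      print(f"T(x)  = {profile}")
      print(f"dT/dx = {gradient}")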

  7. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from 2014 PIV challenge. Thorough sensitivity analysis was performed to assess the relative impact of the various parameters to the overall uncertainty. The results suggest that in absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric

  8. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principle and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He distinguished two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example which he then generalised, treating the variability uncertainty with probability theory and the lack-of-knowledge uncertainty with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably and illegitimately precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability and lack of knowledge increased as the problem became more complex in terms of the number of parameters or time steps, and that it was necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory.
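
    To make the distinction concrete, here is a small illustrative sketch (not from the original talk) that propagates the same toy model two ways: a Monte Carlo sample for a parameter whose variability is well characterized, and an interval sweep for a parameter known only within bounds, standing in for the fuzzy/possibilistic treatment.

      import numpy as np

      rng = np.random.default_rng(0)

      def model(a, b):
          # Toy response combining the two uncertain inputs.
          return a * np.exp(-b)

      # Input 1: variability, well characterized -> probability distribution.
      a_samples = rng.normal(loc=2.0, scale=0.2, size=10_000)

      # Input 2: lack of knowledge -> only an interval [0.5, 1.5] is defensible.
      b_bounds = np.linspace(0.5, 1.5, 101)

      # For each admissible value of b, propagate the variability of a; report
      # the envelope of the resulting percentiles instead of a single distribution.
      p05 = [np.percentile(model(a_samples, b), 5) for b in b_bounds]
      p95 = [np.percentile(model(a_samples, b), 95) for b in b_bounds]

      print(f"5th percentile envelope : [{min(p05):.3f}, {max(p05):.3f}]")
      print(f"95th percentile envelope: [{min(p95):.3f}, {max(p95):.3f}]")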

  9. Meeting the measurement uncertainty and traceability requirements of ISO/IEC standard 17025 in chemical analysis.

    Science.gov (United States)

    King, B

    2001-11-01

    The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply with the new requirements can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty and reporting the information.

  10. Information-Driven Inspections

    International Nuclear Information System (INIS)

    Laughter, Mark D.; Whitaker, J. Michael; Lockwood, Dunbar

    2010-01-01

    New uranium enrichment capacity is being built worldwide in response to perceived shortfalls in future supply. To meet increasing safeguards responsibilities with limited resources, the nonproliferation community is exploring next-generation concepts to increase the effectiveness and efficiency of safeguards, such as advanced technologies to enable unattended monitoring of nuclear material. These include attribute measurement technologies, data authentication tools, and transmission and security methods. However, there are several conceptual issues with how such data would be used to improve the ability of a safeguards inspectorate such as the International Atomic Energy Agency (IAEA) to reach better safeguards conclusions regarding the activities of a State. The IAEA is pursuing the implementation of information-driven safeguards, whereby all available sources of information are used to make the application of safeguards more effective and efficient. Data from continuous, unattended monitoring systems can be used to optimize on-site inspection scheduling and activities at declared facilities, resulting in fewer, better inspections. Such information-driven inspections are the logical evolution of inspection planning - making use of all available information to enhance scheduled and randomized inspections. Data collection and analysis approaches for unattended monitoring systems can be designed to protect sensitive information while enabling information-driven inspections. A number of such inspections within a predetermined range could reduce inspection frequency while providing an equal or greater level of deterrence against illicit activity, all while meeting operator and technology holder requirements and reducing inspector and operator burden. Three options for using unattended monitoring data to determine an information-driven inspection schedule are to (1) send all unattended monitoring data off-site, which will require advances in data analysis techniques to

  11. How Do Science and Technology Affect International Affairs?

    Science.gov (United States)

    Weiss, Charles

    2015-01-01

    Science and technology influence international affairs by many different mechanisms. Both create new issues, risks and uncertainties. Advances in science alert the international community to new issues and risks. New technological capabilities transform war, diplomacy, commerce, intelligence, and investment. This paper identifies six basic…

  12. The state of the art of the impact of sampling uncertainty on measurement uncertainty

    Science.gov (United States)

    Leite, V. J.; Oliveira, E. C.

    2018-03-01

    Measurement uncertainty is a parameter that characterizes the reliability of a result and can be divided into two large groups: sampling and analytical variations. Analytical uncertainty is a controlled process, performed in the laboratory. The same does not occur with sampling uncertainty, which has been neglected because it faces several obstacles and there is no clarity on how to perform the procedures, although it is admittedly indispensable to the measurement process. This paper aims at describing the state of the art of sampling uncertainty and at assessing its relevance to measurement uncertainty.
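
    Assuming the sampling and analytical contributions are independent, the standard quadrature relation used in, for example, the Eurachem/CITAC treatment of sampling links them to the overall measurement uncertainty (written here in LaTeX notation; k = 2 gives roughly 95 % confidence):

      u_{meas} = \sqrt{\,u_{sampling}^{2} + u_{analytical}^{2}\,}, \qquad U = k\,u_{meas}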

  13. Visualizing uncertainties in a storm surge ensemble data assimilation and forecasting system

    KAUST Repository

    Hollt, Thomas

    2015-01-15

    We present a novel integrated visualization system that enables the interactive visual analysis of ensemble simulations and estimates of the sea surface height and other model variables that are used for storm surge prediction. Coastal inundation, caused by hurricanes and tropical storms, poses large risks for today's societies. High-fidelity numerical models of water levels driven by hurricane-force winds are required to predict these events, posing a challenging computational problem, and even though computational models continue to improve, uncertainties in storm surge forecasts are inevitable. Today, this uncertainty is often exposed to the user by running the simulation many times with different parameters or inputs following a Monte-Carlo framework in which uncertainties are represented as stochastic quantities. This results in multidimensional, multivariate and multivalued data, so-called ensemble data. While the resulting datasets are very comprehensive, they are also huge in size and thus hard to visualize and interpret. In this paper, we tackle this problem by means of an interactive and integrated visual analysis system. By harnessing the power of modern graphics processing units for visualization as well as computation, our system allows the user to browse through the simulation ensembles in real time, view specific parameter settings or simulation models and move between different spatial and temporal regions without delay. In addition, our system provides advanced visualizations to highlight the uncertainty or show the complete distribution of the simulations at user-defined positions over the complete time series of the prediction. We highlight the benefits of our system by presenting its application in a real-world scenario using a simulation of Hurricane Ike.

  14. Measurement of pzz of the laser-driven polarized deuterium target

    International Nuclear Information System (INIS)

    Jones, C.E.; Coulter, K.P.; Holt, R.J.; Poelker, M.; Potterveld, D.P.; Kowalczyk, R.S.; Buchholz, M.; Neal, J.; van den Brand, J.F.J.

    1993-01-01

    The question of whether nuclei are polarized as a result of H-H (D-D) spin-exchange collisions within the relatively dense gas of a laser-driven source of polarized hydrogen (deuterium) can be addressed directly by measuring the nuclear polarization of atoms from the source. The feasibility of using a polarimeter based on the D + T → n + 4He reaction to measure the tensor polarization of deuterium in an internal target fed by the laser-driven source has been tested. The device and the measurements necessary to test the spin-exchange polarization theory are described

  15. Impact of Climate Change. Policy Uncertainty in Power Investment

    International Nuclear Information System (INIS)

    Blyth, W.; Yang, M.

    2006-10-01

    Climate change policies are being introduced or actively considered in all IEA member countries, changing the investment conditions and technology choices in the energy sector. Many of these policies are at a formative stage, and policy uncertainty is currently high. The objective of this paper is to quantify the impacts of climate change policy on power investment. We use a real options analysis approach and model uncertain carbon and fuel prices as stochastic variables. The analysis compares the effects of climate policy uncertainty with fuel price uncertainty, showing the relative importance of these sources of risk for different technologies. This paper considers views on the importance of climate policy risk, how it is managed, and how it might affect investment behaviour. The implications for policymakers are analyzed, allowing the key messages to be transferred into policy design decisions. We found that in many cases, the dominant risks facing base-load generation investment decisions will be market risks associated with electricity and fuel prices. However, under certain conditions and for some technologies, climate policy uncertainty can be an important risk factor, creating an incentive to delay investment and raising investment thresholds. This paper concludes that government climate change policies to promote investment in low-carbon technologies should aim to overcome this incentive to delay by sending long-term investment signals backed up by strengthened international policy action to enhance domestic policy credibility
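
    As a rough illustration of how a stochastic carbon price can be represented in such a real-options setting (the process choice and parameter values are assumptions for this sketch, not those of the paper), the following simulates geometric Brownian motion price paths from which investment thresholds could subsequently be evaluated.

      import numpy as np

      rng = np.random.default_rng(1)

      # Assumed carbon-price process parameters (illustrative only).
      p0, mu, sigma = 20.0, 0.03, 0.25      # EUR/tCO2, drift/yr, volatility/yr
      years, steps, n_paths = 20, 240, 5_000
      dt = years / steps

      # Geometric Brownian motion: dP = mu*P*dt + sigma*P*dW.
      z = rng.standard_normal((n_paths, steps))
      log_increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
      paths = p0 * np.exp(np.cumsum(log_increments, axis=1))

      final = paths[:, -1]
      print(f"carbon price after {years} yr: median {np.median(final):.1f}, "
            f"5th-95th percentile [{np.percentile(final, 5):.1f}, "
            f"{np.percentile(final, 95):.1f}] EUR/tCO2")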

  16. Proceedings of 11th international workshop on Asian network for accelerator-driven system and nuclear transmutation technology (ADS+NTT 2013)

    International Nuclear Information System (INIS)

    Pyeon, Cheol Ho

    2014-01-01

    The proceedings describe the current status of research and development (R and D) on accelerator-driven systems (ADS) and nuclear transmutation technology (NTT), including nuclear data, accelerator techniques, Pb-Bi targets, fuel technologies and reactor physics, in the East Asian countries Korea, China and Japan. The proceedings also include all presentation materials presented at 'the 11th International Workshop on Asian Network for ADS and NTT (ADS+NTT 2013)', held at Seoul National University, Seoul, Korea, on 12th and 13th December 2013. The objective of this workshop is to make actual progress in ADS R and D, especially in East Asian countries as well as in European countries, through sharing mutual interests and exchanging information with each other. The report is composed of the following items: Presentation materials: ADS+NTT 2013. (author)

  17. Estimate of correlated and uncorrelated uncertainties associated with performance tests of activity meters

    International Nuclear Information System (INIS)

    Sousa, C.H.S.; Teixeira, G.J.; Peixoto, J.G.P.

    2014-01-01

    Activity meters should undergo performance tests to verify their functionality, as recommended in technical standards. This study estimated the correlated and uncorrelated expanded uncertainties associated with the results of tests conducted on three instruments: two detectors with ionization chambers and one with Geiger-Mueller tubes. For this we used a standard reference source, screened and certified by the National Institute of Technology and Standardization. The methodology of this research was based on the protocols listed in the technical document of the International Atomic Energy Agency. Two quantities were subsequently found to be correlated, presenting a real correlation and improving the expanded uncertainty (3.7%). (author)

  18. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  19. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

  20. How Evolution May Work Through Curiosity-Driven Developmental Process.

    Science.gov (United States)

    Oudeyer, Pierre-Yves; Smith, Linda B

    2016-04-01

    Infants' own activities create and actively select their learning experiences. Here we review recent models of embodied information seeking and curiosity-driven learning and show that these mechanisms have deep implications for development and evolution. We discuss how these mechanisms yield self-organized epigenesis with emergent ordered behavioral and cognitive developmental stages. We describe a robotic experiment that explored the hypothesis that progress in learning, in and for itself, generates intrinsic rewards: The robot learners probabilistically selected experiences according to their potential for reducing uncertainty. In these experiments, curiosity-driven learning led the robot learner to successively discover object affordances and vocal interaction with its peers. We explain how a learning curriculum adapted to the current constraints of the learning system automatically formed, constraining learning and shaping the developmental trajectory. The observed trajectories in the robot experiment share many properties with those in infant development, including a mixture of regularities and diversities in the developmental patterns. Finally, we argue that such emergent developmental structures can guide and constrain evolution, in particular with regard to the origins of language. Copyright © 2016 Cognitive Science Society, Inc.
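
    A toy sketch of the selection rule described above: the learner tracks its prediction error on several activities and probabilistically prefers the one whose error is currently decreasing fastest (learning progress). This is an illustrative simulation, not the robotic experiment's code; the activity learning curves are invented.

      import numpy as np

      rng = np.random.default_rng(2)

      # Invented "activities": each has an error that decays at its own rate the
      # more it is practised (the last one is unlearnable, i.e. stays noisy).
      decay = np.array([0.03, 0.01, 0.0])        # per practice step
      error = np.array([1.0, 1.0, 1.0])
      practice = np.zeros(3)
      recent_errors = [[e] for e in error]

      for t in range(300):
          # Learning progress = recent decrease of error for each activity.
          progress = np.array([max(h[-10] - h[-1], 0.0) if len(h) > 10 else 0.1
                               for h in recent_errors])
          # Softmax selection: prefer, but do not always pick, the best progress.
          p = np.exp(progress / 0.02)
          choice = rng.choice(3, p=p / p.sum())

          practice[choice] += 1
          error[choice] = np.exp(-decay[choice] * practice[choice]) + 0.05 * rng.random()
          recent_errors[choice].append(error[choice])

      print("times each activity was chosen:", practice)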

  1. Uncertainty quantification of CO2 emission reduction for maritime shipping

    International Nuclear Information System (INIS)

    Yuan, Jun; Ng, Szu Hui; Sou, Weng Sut

    2016-01-01

    The International Maritime Organization (IMO) has recently proposed several operational and technical measures to improve shipping efficiency and reduce the greenhouse gases (GHG) emissions. The abatement potentials estimated for these measures have been further used by many organizations to project future GHG emission reductions and plot Marginal Abatement Cost Curves (MACC). However, the abatement potentials estimated for many of these measures can be highly uncertain as many of these measures are new, with limited sea trial information. Furthermore, the abatements obtained are highly dependent on ocean conditions, trading routes and sailing patterns. When the estimated abatement potentials are used for projections, these ‘input’ uncertainties are often not clearly displayed or accounted for, which can lead to overly optimistic or pessimistic outlooks. In this paper, we propose a methodology to systematically quantify and account for these input uncertainties on the overall abatement potential forecasts. We further propose improvements to MACCs to better reflect the uncertainties in marginal abatement costs and total emissions. This approach provides a fuller and more accurate picture of abatement forecasts and potential reductions achievable, and will be useful to policy makers and decision makers in the shipping industry to better assess the cost effective measures for CO 2 emission reduction. - Highlights: • We propose a systematic method to quantify uncertainty in emission reduction. • Marginal abatement cost curves are improved to better reflect the uncertainties. • Percentage reduction probability is given to determine emission reduction target. • The methodology is applied to a case study on maritime shipping.
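
    The following sketch illustrates the kind of Monte Carlo aggregation the abstract describes: each abatement measure's uncertain reduction potential is sampled, summed, and summarized as a distribution rather than a single point. The measure names, distribution shapes, and numbers are assumptions for illustration only.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 20_000

      # Assumed abatement potentials per measure, in Mt CO2/yr (triangular
      # distributions: low, mode, high). Values are purely illustrative.
      measures = {
          "speed reduction":     (10, 30, 60),
          "weather routing":     (1, 4, 8),
          "hull/propeller care": (2, 6, 12),
          "waste-heat recovery": (1, 3, 7),
      }

      total = np.zeros(n)
      for low, mode, high in measures.values():
          total += rng.triangular(low, mode, high, size=n)

      print(f"total abatement: mean {total.mean():.0f} Mt CO2/yr, "
            f"90 % interval [{np.percentile(total, 5):.0f}, "
            f"{np.percentile(total, 95):.0f}] Mt CO2/yr")
      print(f"probability of exceeding 50 Mt CO2/yr: {(total > 50).mean():.2f}")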

  2. Licensing in BE system code calculations. Applications and uncertainty evaluation by CIAU method

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco

    2007-01-01

    The evaluation of uncertainty constitutes the necessary supplement of Best Estimate (BE) calculations performed to understand accident scenarios in water cooled nuclear reactors. The need arises from the imperfection of computational tools on the one hand and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. In the present paper the approaches to uncertainty are outlined and the CIAU (Code with capability of Internal Assessment of Uncertainty) method proposed by the University of Pisa is described, including the ideas at its basis and results from applications. Two approaches are distinguished, characterized as 'propagation of code input uncertainty' and 'propagation of code output errors'. For both methods, the thermal-hydraulic code is at the centre of the process of uncertainty evaluation: in the former case the code itself is adopted to compute the error bands and to propagate the input errors, in the latter case the errors in code application to relevant measurements are used to derive the error bands. The CIAU method exploits the idea of the 'status approach' for identifying the thermal-hydraulic conditions of an accident in any Nuclear Power Plant (NPP). Errors in predicting such a status are derived from the comparison between predicted and measured quantities and, in the stage of the application of the method, are used to compute the uncertainty. (author)

  3. Convection flows driven by laser heating of a liquid layer

    OpenAIRE

    Rivière, David; Selva, Bertrand; Chraibi, Hamza; Delabre, Ulysse; Delville, Jean-Pierre

    2016-01-01

    When a fluid is heated by the absorption of a continuous laser wave, the fluid density decreases in the heated area. This induces a pressure gradient that generates internal motion of the fluid. Due to mass conservation, convection eddies emerge in the sample. To investigate these laser-driven bulk flows at the microscopic scale, we built a setup to perform temperature measurements with a fluorescent-sensitive dye on the one hand, and measured the flow pattern at diffe...

  4. Expanding Uncertainty Principle to Certainty-Uncertainty Principles with Neutrosophy and Quad-stage Method

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-03-01

    The most famous contribution of Heisenberg is the uncertainty principle, but the original uncertainty principle is improper. Considering all the possible situations (including the case that people can create laws) and applying Neutrosophy and the Quad-stage Method, this paper presents "certainty-uncertainty principles" in a general form and a variable-dimension fractal form. According to the classification of Neutrosophy, the "certainty-uncertainty principles" can be divided into three principles under different conditions: the "certainty principle", namely that a particle's position and momentum can be known simultaneously; the "uncertainty principle", namely that a particle's position and momentum cannot be known simultaneously; and the neutral (fuzzy) "indeterminacy principle", namely that whether or not a particle's position and momentum can be known simultaneously is undetermined. The special cases of the "certainty-uncertainty principles" include the original uncertainty principle and the Ozawa inequality. In addition, according to the original uncertainty principle, discussing a high-speed particle's speed and track with Newtonian mechanics is unreasonable; but according to the "certainty-uncertainty principles", Newtonian mechanics can be used to discuss the problem of gravitational deflection of a photon orbit around the Sun (it gives the same deflection angle as general relativity). Finally, because in physics the principles, laws and the like that disregard the principle (law) of conservation of energy may be invalid, the "certainty-uncertainty principles" should be restricted (or constrained) by the principle (law) of conservation of energy, so that they satisfy the principle (law) of conservation of energy.

  5. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad-hoc manner, however if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality then knowledge of this uncertainty enables optimal exploitation of the observations in further processes, or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user contributed weather data where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards including SWE common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation and usage within existing standards and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.

  6. Diagnostic uncertainty and recall bias in chronic low back pain.

    Science.gov (United States)

    Serbic, Danijela; Pincus, Tamar

    2014-08-01

    Patients' beliefs about the origin of their pain and their cognitive processing of pain-related information have both been shown to be associated with poorer prognosis in low back pain (LBP), but the relationship between specific beliefs and specific cognitive processes is not known. The aim of this study was to examine the relationship between diagnostic uncertainty and recall bias in 2 groups of chronic LBP patients, those who were certain about their diagnosis and those who believed that their pain was due to an undiagnosed problem. Patients (N=68) endorsed and subsequently recalled pain, illness, depression, and neutral stimuli. They also provided measures of pain, diagnostic status, mood, and disability. Both groups exhibited a recall bias for pain stimuli, but only the group with diagnostic uncertainty also displayed a recall bias for illness-related stimuli. This bias remained after controlling for depression and disability. Sensitivity analyses using grouping by diagnosis/explanation received supported these findings. Higher levels of depression and disability were found in the group with diagnostic uncertainty, but levels of pain intensity did not differ between the groups. Although the methodology does not provide information on causality, the results provide evidence for a relationship between diagnostic uncertainty and recall bias for negative health-related stimuli in chronic LBP patients. Copyright © 2014 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  7. Effect of the sample matrix on measurement uncertainty in X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Morgenstern, P.; Brueggemann, L.; Wennrich, R.

    2005-01-01

    The estimation of measurement uncertainty, with reference to univariate calibration functions, is discussed in detail in the Eurachem Guide 'Quantifying Uncertainty in Analytical Measurement'. The adoption of these recommendations in quantitative X-ray fluorescence analysis (XRF) involves basic problems which are above all due to the strong influence of the sample matrix on the analytical response. In XRF analysis, the proposed recommendations are consequently applicable only to the matrix-corrected response. The application is also restricted with regard to both the matrices and the analyte concentrations. In this context, the present studies address the problem of predicting measurement uncertainty for more variable sample compositions. The corresponding investigations are focused on the use of the intensity of the Compton-scattered tube line as an internal standard to assess the effect of the individual sample matrix on the analytical response relative to a reference matrix. Based on this concept, the measurement uncertainty of an analyte present in an unknown specimen can be predicted in consideration of the data obtained under defined matrix conditions
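
    A minimal sketch of the Compton-normalization idea described above: the analyte line intensity is divided by the Compton-scattered tube line to compensate for matrix effects relative to a reference matrix, and the relative uncertainties of the ratio combine in quadrature. Intensities, calibration factor, and uncertainty values are invented for illustration.

      import math

      # Invented measured intensities (counts/s) and their standard uncertainties.
      I_analyte, u_I_analyte = 1520.0, 18.0     # analyte fluorescence line
      I_compton, u_I_compton = 48200.0, 160.0   # Compton-scattered tube line

      # Matrix-corrected (normalized) response relative to a reference matrix.
      R = I_analyte / I_compton

      # Relative uncertainties combine in quadrature for a ratio.
      u_R_rel = math.sqrt((u_I_analyte / I_analyte) ** 2 +
                          (u_I_compton / I_compton) ** 2)

      # Hypothetical calibration: concentration = slope * normalized response.
      slope, u_slope_rel = 3.1e3, 0.01          # mg/kg per unit ratio, 1 % rel. u
      c = slope * R
      u_c_rel = math.sqrt(u_R_rel ** 2 + u_slope_rel ** 2)

      print(f"normalized response R = {R:.5f} ± {R * u_R_rel:.5f}")
      print(f"concentration         = {c:.1f} ± {c * u_c_rel:.1f} mg/kg")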

  8. Quantification and Minimization of Uncertainties of Internal Target Volume for Stereotactic Body Radiation Therapy of Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Ge Hong [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Department of Radiation Oncology, Henan Cancer Hospital, the Affiliated Cancer Hospital of Zhengzhou University, Henan (China); Cai Jing; Kelsey, Chris R. [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Yin Fangfang, E-mail: fangfang.yin@duke.edu [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States)

    2013-02-01

    Purpose: To quantify uncertainties in delineating an internal target volume (ITV) and to understand how these uncertainties may be individually minimized for stereotactic body radiation therapy (SBRT) of early stage non-small cell lung cancer (NSCLC). Methods and Materials: Twenty patients with NSCLC who were undergoing SBRT were imaged with free-breathing 3-dimensional computed tomography (3DCT) and 10-phase 4-dimensional CT (4DCT) for delineating gross tumor volume (GTV_3D) and ITV_10Phase (ITV3). The maximum intensity projection (MIP) CT was also calculated from 10-phase 4DCT for contouring ITV_MIP (ITV1). Then, ITV_COMB (ITV2), ITV_10Phase+GTV3D (ITV4), and ITV_10Phase+ITVCOMB (ITV5) were generated by combining ITV_MIP and GTV_3D, ITV_10Phase and GTV_3D, and ITV_10Phase and ITV_COMB, respectively. All 6 volumes (GTV_3D and ITV1 to ITV5) were delineated in the same lung window by the same radiation oncologist. The percentage of volume difference (PVD) between any 2 different volumes was determined and was correlated to effective tumor diameter (ETD), tumor motion ranges, R_3D, and the amplitude variability of the recorded breathing signal (v) to assess their volume variations. Results: The mean (range) tumor motion (R_SI, R_AP, R_ML, and R_3D) and breathing variability (v) were 7.6 mm (2-18 mm), 4.0 mm (2-8 mm), 3.3 mm (0-7.5 mm), 9.9 mm (4.1-18.7 mm), and 0.17 (0.07-0.37), respectively. The trend of volume variation was GTV_3D

  9. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    International Nuclear Information System (INIS)

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study a process of model audit is addressed on a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision making purposes
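
    The split between scenario and parametric contributions can be illustrated with the law of total variance: nested Monte Carlo over discrete scenarios and continuous parameters separates the output variance into a between-scenario and a within-scenario part. The toy dose model, scenario probabilities, and numbers below are assumptions for illustration, not the waste-disposal code from the study.

      import numpy as np

      rng = np.random.default_rng(4)

      # Toy dose model: dose = release * transport_factor, where the transport
      # factor depends on which (discrete) scenario applies.
      scenarios = {"normal evolution": 1e-6, "early failure": 5e-6, "intrusion": 2e-5}
      weights = np.array([0.8, 0.15, 0.05])          # assumed scenario probabilities
      n = 50_000

      means, variances = [], []
      for factor in scenarios.values():
          release = rng.lognormal(mean=np.log(1e3), sigma=0.5, size=n)  # parametric
          dose = release * factor
          means.append(dose.mean())
          variances.append(dose.var())

      means, variances = np.array(means), np.array(variances)
      within = np.sum(weights * variances)                              # E[Var(dose | S)]
      between = np.sum(weights * (means - np.sum(weights * means)) ** 2)  # Var(E[dose | S])
      total = within + between

      print(f"scenario (between) share of variance : {between / total:.2f}")
      print(f"parametric (within) share of variance: {within / total:.2f}")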

  10. Specifications of the International Atomic Energy Agency's international project on safety assessment driven radioactive waste management solutions

    International Nuclear Information System (INIS)

    Ghannadi, M.; Asgharizadeh, F.; Assadi, M. R.

    2008-01-01

    Radioactive waste is produced in the generation of nuclear power and the production and use of radioactive materials in industry, research, and medicine. Nuclear waste management facilities need to perform a safety assessment in order to ensure the safety of a facility. Nuclear safety assessment is a structured and systematic way of examining a proposed facility, process, operation or activity. From a nuclear waste management point of view, safety assessment is a process used to evaluate the safety of radioactive waste management and disposal facilities. In this regard, the International Atomic Energy Agency plans to implement an international project in cooperation with some member states. The Safety Assessment Driving Radioactive Waste Management Solutions project is an international programme of work to examine international approaches to safety assessment in aspects of predisposal radioactive waste management, including waste conditioning and storage. This study describes the rationale, common aspects, scope, objectives, work plan and anticipated outcomes of the project with reference to International Atomic Energy Agency documents, such as the International Atomic Energy Agency's Safety Standards, as well as the Safety Assessment Driving Radioactive Waste Management Solutions project reports

  11. Adult head CT scans: the uncertainties of effective dose estimates

    International Nuclear Information System (INIS)

    Gregory, Kent J.; Bibbo, Giovanni; Pattison, John E.

    2008-01-01

    CT scanning is a high dose imaging modality. Effective dose estimates from CT scans can provide important information to patients and medical professionals. For example, medical practitioners can use the dose to estimate the risk to the patient and judge whether this risk is outweighed by the benefits of the CT examination, while radiographers can gauge the effect of different scanning protocols on the patient effective dose and take this into consideration when establishing routine scan settings. Dose estimates also form an important part of epidemiological studies examining the health effects of medical radiation exposures on the wider population. Medical physicists have been devoting significant effort towards estimating patient radiation doses from diagnostic CT scans for some years. The question arises: how accurate are these effective dose estimates? The need for a greater understanding and improvement of the uncertainties in CT dose estimates is now gaining recognition as an important issue (BEIR VII 2006). This study is an attempt to analyse and quantify the uncertainty components relating to effective dose estimates from adult head CT examinations that are calculated with four commonly used methods. The dose estimation methods analysed are the Nagel method, the ImPACT method, the Wellhoefer method and the dose-length product (DLP) method. The analysis of the uncertainties was performed in accordance with the International Organization for Standardization's Guide to the Expression of Uncertainty in Measurement, as discussed in Gregory et al (Australas. Phys. Eng. Sci. Med., 28: 131-139, 2005). The uncertainty components vary, depending on the method used to derive the effective dose estimate. Uncertainty components in this study include the statistical and other errors from Monte Carlo simulations, uncertainties in the CT settings and positions of patients in the CT gantry, calibration errors from pencil ionization chambers, the variations in the organ

  12. Uncertainty Communication. Issues and good practice

    International Nuclear Information System (INIS)

    Kloprogge, P.; Van der Sluijs, J.; Wardekker, A.

    2007-12-01

    In 2003 the Netherlands Environmental Assessment Agency (MNP) published the RIVM/MNP Guidance for Uncertainty Assessment and Communication. The Guidance assists in dealing with uncertainty in environmental assessments. Dealing with uncertainty is essential because assessment results regarding complex environmental issues are of limited value if the uncertainties have not been taken into account adequately. A careful analysis of uncertainties in an environmental assessment is required, but even more important is the effective communication of these uncertainties in the presentation of assessment results. The Guidance yields rich and differentiated insights in uncertainty, but the relevance of this uncertainty information may vary across audiences and uses of assessment results. Therefore, the reporting of uncertainties is one of the six key issues that is addressed in the Guidance. In practice, users of the Guidance felt a need for more practical assistance in the reporting of uncertainty information. This report explores the issue of uncertainty communication in more detail, and contains more detailed guidance on the communication of uncertainty. In order to make this a 'stand alone' document several questions that are mentioned in the detailed Guidance have been repeated here. This document thus has some overlap with the detailed Guidance. Part 1 gives a general introduction to the issue of communicating uncertainty information. It offers guidelines for (fine)tuning the communication to the intended audiences and context of a report, discusses how readers of a report tend to handle uncertainty information, and ends with a list of criteria that uncertainty communication needs to meet to increase its effectiveness. Part 2 helps writers to analyze the context in which communication takes place, and helps to map the audiences, and their information needs. It further helps to reflect upon anticipated uses and possible impacts of the uncertainty information on the

  13. Polynomial Chaos Characterization of Uncertainty in Multiscale Models and Behavior of Carbon Reinforced Composites

    Energy Technology Data Exchange (ETDEWEB)

    Mehrez, Loujaine [University of Southern California; Ghanem, Roger [University of Southern California; Aitharaju, Venkat [General Motors; Rodgers, William [General Motors

    2017-10-23

    Design of non-crimp fabric (NCF) composites entails major challenges pertaining to (1) the complex fine-scale morphology of the constituents, (2) the manufacturing-produced inconsistency of this morphology spatially, and thus (3) the ability to build reliable, robust, and efficient computational surrogate models to account for this complex nature. Traditional approaches to construct computational surrogate models have been to average over the fluctuations of the material properties at different scale lengths. This fails to account for the fine-scale features and fluctuations in morphology, material properties of the constituents, as well as fine-scale phenomena such as damage and cracks. In addition, it fails to accurately predict the scatter in macroscopic properties, which is vital to the design process and behavior prediction. In this work, funded in part by the Department of Energy, we present an approach for addressing these challenges by relying on polynomial chaos representations of both input parameters and material properties at different scales. Moreover, we emphasize the efficiency and robustness of integrating the polynomial chaos expansion with multiscale tools to perform multiscale assimilation, characterization, propagation, and prediction, all of which are necessary to construct the data-driven surrogate models required to design under the uncertainty of composites. These data-driven constructions provide an accurate map from parameters (and their uncertainties) at all scales and the system-level behavior relevant for design. While this perspective is quite general and applicable to all multiscale systems, NCF composites present a particular hierarchy of scales that permits the efficient implementation of these concepts.
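
    As a hedged, minimal illustration of the polynomial chaos representation mentioned above (a generic one-dimensional sketch, not the authors' multiscale NCF implementation), the following Python snippet fits a Hermite chaos expansion to an assumed scalar response of a standard normal input and reads the mean and variance directly off the coefficients; the response function and expansion order are hypothetical.

      import numpy as np
      from math import factorial
      from numpy.polynomial import hermite_e as He   # probabilists' Hermite basis

      rng = np.random.default_rng(0)

      def response(xi):
          # hypothetical fine-scale response of one uncertain material parameter
          return np.exp(0.3 * xi) + 0.1 * xi ** 2

      order = 4
      xi = rng.standard_normal(2000)                 # samples of the Gaussian germ
      y = response(xi)

      Psi = He.hermevander(xi, order)                # design matrix, He_0 ... He_4
      coef, *_ = np.linalg.lstsq(Psi, y, rcond=None) # least-squares chaos coefficients

      norms = np.array([factorial(k) for k in range(order + 1)])
      mean = coef[0]                                 # E[He_j He_k] = k! * delta_jk
      var = np.sum(coef[1:] ** 2 * norms[1:])
      print(f"PCE mean ~ {mean:.3f}, PCE variance ~ {var:.3f}")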

  14. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
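
    A minimal sketch of the auxiliary-variable idea described above (a generic illustration under assumed distributions and a hypothetical limit state, not the paper's FORM/SORM implementation): epistemic uncertainty about a distribution parameter and an auxiliary uniform variable are sampled together in a single Monte Carlo loop, and the aleatory sample is recovered through the inverse CDF (the probability integral transform).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n = 100_000

      # Epistemic uncertainty: the mean of the load is only known to lie in [9, 11]
      mu_load = rng.uniform(9.0, 11.0, size=n)

      # Aleatory uncertainty via the auxiliary uniform variable u and the inverse CDF
      u = rng.uniform(size=n)
      load = stats.norm.ppf(u, loc=mu_load, scale=1.5)

      capacity = 15.0                       # hypothetical deterministic capacity
      g = capacity - load                   # limit state: failure when g <= 0

      pf = np.mean(g <= 0.0)
      print(f"estimated failure probability ~ {pf:.4f}")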

  15. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
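
    A hedged sketch of the variance-ratio idea behind such importance indicators (a simple, non-replicated Latin hypercube sample and a binned estimate of Var[E(Y|X_i)]/Var(Y); the test model and sample sizes are hypothetical):

      import numpy as np

      rng = np.random.default_rng(2)

      def latin_hypercube(n, d):
          # one stratified uniform value per cell, independently permuted per column
          cells = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
          return (cells + rng.uniform(size=(n, d))) / n

      def model(x):
          # hypothetical model: x1 dominates, x3 is nearly irrelevant
          return 5.0 * x[:, 0] + x[:, 1] ** 2 + 0.1 * x[:, 2]

      n, d, bins = 5000, 3, 25
      x = latin_hypercube(n, d)
      y = model(x)

      for i in range(d):
          # correlation ratio: variance of binned conditional means over total variance
          idx = np.digitize(x[:, i], np.linspace(0.0, 1.0, bins + 1)[1:-1])
          cond_means = np.array([y[idx == b].mean() for b in range(bins)])
          print(f"x{i + 1}: Var[E(Y|X)]/Var(Y) ~ {cond_means.var() / y.var():.2f}")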

  16. Oil price uncertainty in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States); Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  17. Uncertainty evaluation in correlated quantities: application to elemental analysis of atmospheric aerosols

    International Nuclear Information System (INIS)

    Espinosa, A.; Miranda, J.; Pineda, J. C.

    2010-01-01

    One of the aspects frequently overlooked in the evaluation of uncertainty in experimental data is the possibility that the involved quantities are correlated with each other, due to different causes. An example is the elemental analysis of atmospheric aerosols using techniques like X-ray Fluorescence (XRF) or Particle Induced X-ray Emission (PIXE). In these cases, the measured elemental concentrations are highly correlated, and are then used to obtain information about other variables, such as the contribution from emitting sources related to soil, sulfate, non-soil potassium or organic matter. This work describes, as an example, the method required to evaluate the uncertainty in variables determined from correlated quantities from a set of atmospheric aerosol samples collected in the Metropolitan Area of the Mexico Valley and analyzed with PIXE. The work is based on the recommendations of the Guide for the Evaluation of Uncertainty published by the International Organization for Standardization. (Author)
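
    For reference, the general GUM relationship underlying such an evaluation, for an output quantity y = f(x_1, ..., x_N) with correlated inputs, is (written here in LaTeX notation):

      u_c^2(y) = \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^2 u^2(x_i)
               + 2 \sum_{i=1}^{N-1} \sum_{j=i+1}^{N} \frac{\partial f}{\partial x_i} \frac{\partial f}{\partial x_j} \, u(x_i, x_j)

    where u(x_i, x_j) = r(x_i, x_j) u(x_i) u(x_j) is the covariance of the correlated inputs and r their correlation coefficient; neglecting these covariance terms is precisely the oversight the record describes.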

  18. Additivity of entropic uncertainty relations

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2018-03-01

    Full Text Available We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
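
    For context, the Maassen-Uffink bound that this record generalizes states that, for two observables A and B with eigenbases {|a_j>} and {|b_k>}, the Shannon entropies of their outcome distributions satisfy (LaTeX notation)

      H(A) + H(B) \geq -2 \log_2 c, \qquad c = \max_{j,k} \left| \langle a_j | b_k \rangle \right| ,

    so the bound depends only on the largest overlap between the two measurement bases.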

  19. International benchmark for coupled codes and uncertainty analysis in modelling: switching-Off of one of the four operating main circulation pumps at nominal reactor power at NPP Kalinin unit 3

    International Nuclear Information System (INIS)

    Tereshonok, V. A.; Nikonov, S. P.; Lizorkin, M. P.; Velkov, K; Pautz, A.; Ivanov, V.

    2008-01-01

    The paper briefly describes the specification of an international NEA/OECD benchmark based on measured plant data. During the commissioning tests for nominal power at NPP Kalinin Unit 3, extensive measurements of neutron and thermal-hydraulic parameters were carried out in the reactor pressure vessel and in the primary and secondary circuits. One of the measured data sets, for the transient 'Switching-off of one Main Circulation Pump (MCP) at nominal power', has been chosen for the validation of coupled thermal-hydraulic and neutron-kinetic system codes and additionally for performing uncertainty analyses as part of the NEA/OECD Uncertainty Analysis in Modeling Benchmark. The benchmark is open to all countries and institutions. The experimental data and the final specification with the cross section libraries will be provided to the participants by NEA/OECD only after official declaration of actual participation in the benchmark and delivery of the simulated results of the transient for comparison. (Author)

  20. Communicating and dealing with uncertainty in general practice: the association with neuroticism.

    Directory of Open Access Journals (Sweden)

    Antonius Schneider

    Full Text Available Diagnostic reasoning in the primary care setting, where presented problems and patients are mostly unselected, is a complex process. The aim was to develop a questionnaire describing how general practitioners (GPs) deal with uncertainty, to gain more insight into the decisional process. The association of personality traits with medical decision making was investigated additionally. Raw items were identified by literature research and a focus group. Items were improved by interviewing ten GPs with the thinking-aloud method. A personal case vignette related to a complex and uncertain situation was introduced. The final questionnaire was administered to 228 GPs in Germany. Factorial validity was calculated with explorative and confirmatory factor analysis. The results of the Communicating and Dealing with Uncertainty (CoDU) questionnaire were compared with the scales of the 'Physician Reaction to Uncertainty' (PRU) questionnaire and with the personality traits determined with the Big Five Inventory (BFI-K). The items could be assigned to four scales with varying internal consistency, namely 'communicating uncertainty' (Cronbach alpha 0.79), 'diagnostic action' (0.60), 'intuition' (0.39) and 'extended social anamnesis' (0.69). Neuroticism was positively associated with all PRU scales: 'anxiety due to uncertainty' (Pearson correlation 0.487), 'concerns about bad outcomes' (0.488), 'reluctance to disclose uncertainty to patients' (0.287) and 'reluctance to disclose mistakes to physicians' (0.212), and negatively associated with the CoDU scale 'communicating uncertainty' (-0.242) (p<0.01 for all). 'Extraversion' (0.146; p<0.05), 'agreeableness' (0.145, p<0.05), 'conscientiousness' (0.168, p<0.05) and 'openness to experience' (0.186, p<0.01) were significantly positively associated with 'communicating uncertainty'. 'Extraversion' (0.162), 'consciousness' (0.158) and 'openness to experience' (0.155) were associated with 'extended social anamnesis' (p<0.05). The

  1. Internal design of technical systems under conditions of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Krasnoshchekov, P S; Morozov, V V; Fedorov, V V

    1982-03-01

    An investigation is made of a model of internal design of a complex technical system in the presence of uncertain factors. The influence of an opponent on the design is examined. The concepts of hierarchical and balanced compatibility between the criteria of the designer, the opponent and the segregations functions are introduced and studied. The connection between the approach proposed and the methods of artificial intelligence is discussed. 5 references.

  2. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  3. Uncertainty and sensitivity analysis of biokinetic models for radiopharmaceuticals used in nuclear medicine

    International Nuclear Information System (INIS)

    Li, W. B.; Hoeschen, C.

    2010-01-01

    Mathematical models for the kinetics of radiopharmaceuticals in humans were developed and are used to estimate the radiation absorbed dose for patients in nuclear medicine by the International Commission on Radiological Protection and the Medical Internal Radiation Dose (MIRD) Committee. However, because the residence times used were derived from different subjects, some even with different ethnic backgrounds, a large variation in the model parameters propagates to a high uncertainty in the dose estimates. In this work, a method was developed for analysing the uncertainty and sensitivity of the biokinetic models that are used to calculate the residence times. The biokinetic model of 18F-FDG (FDG) developed by the MIRD Committee was analysed with this method. The sources of uncertainty of all model parameters were evaluated based on the experiments. The Latin hypercube sampling technique was used to sample the parameters for model input. Kinetic modelling of FDG in humans was performed. The sensitivity of the model parameters was assessed by relating model inputs and outputs using regression and partial correlation analysis. The transfer rate parameter from plasma to the fast 'other tissue' compartment is the parameter with the greatest influence on the residence time of plasma. Optimisation of biokinetic data acquisition in clinical practice by exploiting the parameter sensitivities obtained in this study is discussed. (authors)

  4. Monte Carlo Uncertainty Quantification Using Quasi-1D SRM Ballistic Model

    Directory of Open Access Journals (Sweden)

    Davide Viganò

    2016-01-01

    Full Text Available Compactness, reliability, readiness, and construction simplicity of solid rocket motors make them very appealing for commercial launcher missions and embarked systems. Solid propulsion grants high thrust-to-weight ratio, high volumetric specific impulse, and a Technology Readiness Level of 9. However, solid rocket systems lack any throttling capability at run time, since the pressure-time evolution is defined at the design phase. This lack of mission flexibility makes their missions sensitive to deviations of performance from nominal behavior. For this reason, the reliability of predictions and the reproducibility of performance are primary goals in this field. This paper presents an analysis of SRM performance uncertainties through the implementation of a quasi-1D numerical model of motor internal ballistics based on Shapiro’s equations. The code is coupled with a Monte Carlo algorithm to evaluate statistics and the propagation of some peculiar uncertainties from design data to rocket performance parameters. The model has been set up to reproduce a small-scale rocket motor, and a set of parametric investigations on uncertainty propagation across the ballistic model is discussed.

  5. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  6. Elevated Body Mass Index is Associated with Increased Integration and Reduced Cohesion of Sensory-Driven and Internally Guided Resting-State Functional Brain Networks.

    Science.gov (United States)

    Doucet, Gaelle E; Rasgon, Natalie; McEwen, Bruce S; Micali, Nadia; Frangou, Sophia

    2018-03-01

    Elevated body mass index (BMI) is associated with increased multi-morbidity and mortality. The investigation of the relationship between BMI and brain organization has the potential to provide new insights relevant to clinical and policy strategies for weight control. Here, we quantified the association between increasing BMI and the functional organization of resting-state brain networks in a sample of 496 healthy individuals that were studied as part of the Human Connectome Project. We demonstrated that higher BMI was associated with changes in the functional connectivity of the default-mode network (DMN), central executive network (CEN), sensorimotor network (SMN), visual network (VN), and their constituent modules. In siblings discordant for obesity, we showed that person-specific factors contributing to obesity are linked to reduced cohesiveness of the sensory networks (SMN and VN). We conclude that higher BMI is associated with widespread alterations in brain networks that balance sensory-driven (SMN, VN) and internally guided (DMN, CEN) states which may augment sensory-driven behavior leading to overeating and subsequent weight gain. Our results provide a neurobiological context for understanding the association between BMI and brain functional organization while accounting for familial and person-specific influences.

  7. The impact of incomplete information on the use of marketing research intelligence in international service settings: An experimental study

    NARCIS (Netherlands)

    Birgelen, van M.; Ruyter, de J.C.; Wetzels, M.G.M.

    2000-01-01

    Unfamiliarity with foreign business environments and cultures will result in higher levels of uncertainty, especially for international service organizations. To effectively deal with international uncertainty, it seems crucial to have access to information that is as complete as possible. In

  8. Scaling of the local quantum uncertainty at quantum phase transitions

    International Nuclear Information System (INIS)

    Coulamy, I.B.; Warnes, J.H.; Sarandy, M.S.; Saguia, A.

    2016-01-01

    We investigate the local quantum uncertainty (LQU) between a block of L qubits and one single qubit in a composite system of n qubits driven through a quantum phase transition (QPT). A first-order QPT is analytically considered through a Hamiltonian implementation of the quantum search. In the case of second-order QPTs, we consider the transverse-field Ising chain via a numerical analysis through density matrix renormalization group. For both cases, we compute the LQU for finite-sizes as a function of L and of the coupling parameter, analyzing its pronounced behavior at the QPT. - Highlights: • LQU is suitable for the analysis of block correlations. • LQU exhibits pronounced behavior at quantum phase transitions. • LQU exponentially saturates in the quantum search. • Concavity of LQU indicates criticality in the Ising chain.

  9. A new uncertainty importance measure

    International Nuclear Information System (INIS)

    Borgonovo, E.

    2007-01-01

    Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can be defined also in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures
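
    A hedged numerical sketch of the moment-independent idea (the shift between the unconditional output density and the density conditional on one fixed input, averaged over that input), using crude histogram estimates and a hypothetical two-input model, not the author's own formulation or implementation:

      import numpy as np

      rng = np.random.default_rng(3)

      def model(x1, x2):
          # hypothetical model in which x1 is the more influential input
          return x1 + 0.2 * x2 ** 3

      n, x_bins, y_bins = 200_000, 20, 60
      x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
      y = model(x1, x2)
      edges = np.linspace(y.min(), y.max(), y_bins + 1)
      f_y, _ = np.histogram(y, bins=edges, density=True)
      width = np.diff(edges)[0]

      def delta(xi):
          # delta_i ~ 0.5 * E_Xi[ integral |f_Y(y) - f_{Y|Xi}(y)| dy ], bin estimate
          q = np.quantile(xi, np.linspace(0.0, 1.0, x_bins + 1))
          shifts = []
          for lo, hi in zip(q[:-1], q[1:]):
              mask = (xi >= lo) & (xi <= hi)
              f_cond, _ = np.histogram(y[mask], bins=edges, density=True)
              shifts.append(0.5 * np.sum(np.abs(f_y - f_cond)) * width)
          return float(np.mean(shifts))

      print(f"delta(x1) ~ {delta(x1):.2f}, delta(x2) ~ {delta(x2):.2f}")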

  10. Proceedings of 12th international workshop on Asian network for accelerator-driven system and nuclear transmutation technology (ADS+NTT 2014)

    International Nuclear Information System (INIS)

    Pyeon, Cheol Ho

    2015-01-01

    The proceedings describe the current status on research and development (R and D) of accelerator-driven system (ADS) and nuclear transmutation techniques (NTT), including nuclear data, accelerator techniques, Pb-Bi target, fuel technologies and reactor physics, in East Asian countries: China, Japan and Korea. The proceedings also include all presentation materials presented in 'the 12th International Workshop on Asian Network for ADS and NTT (ADS+NTT 2014)' held at the Institute of Nuclear Energy and Safety Technology, Chinese Academy of Sciences, Hefei, China on 15th and 16th December, 2014. The objective of this workshop is to make actual progress of ADS R and D especially in East Asian countries, as well as in European countries, through sharing mutual interests and conducting the information exchange each other. The report is composed of these following items: Presentation materials: ADS+NTT 2014. (author)

  11. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    International Nuclear Information System (INIS)

    Dupleac, D.; Perez, M.; Reventos, F.; Allison, C.

    2011-01-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach using probability distribution functions to define the uncertainty of the input parameters. The main steps for this type of methodology, often referred to as statistical approaches or Wilks’ methods, are as follows: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the
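
    As a hedged illustration of the Wilks-type run-count calculation mentioned above (a generic sketch, not the RELAP/SCDAPSIM/MOD4.0(IUA) implementation itself), the following Python snippet finds the smallest number of code runs whose order statistics bound a given percentile with a given one-sided confidence:

      from scipy.stats import binom

      def wilks_runs(percentile=0.95, confidence=0.95, order=1):
          # Smallest N such that the `order`-th largest of N runs bounds the
          # requested percentile with at least the requested one-sided confidence.
          n = order
          while 1.0 - binom.cdf(order - 1, n, 1.0 - percentile) < confidence:
              n += 1
          return n

      print(wilks_runs())         # 59 runs for a first-order 95%/95% statement
      print(wilks_runs(order=2))  # 93 runs for a second-order 95%/95% statement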

  12. Uncertainty analysis of the 35% reactor inlet header break in a CANDU 6 reactor using RELAP/SCDAPSIM/MOD4.0 with integrated uncertainty analysis option

    Energy Technology Data Exchange (ETDEWEB)

    Dupleac, D., E-mail: danieldu@cne.pub.ro [Politehnica Univ. of Bucharest (Romania); Perez, M.; Reventos, F., E-mail: marina.perez@upc.edu, E-mail: francesc.reventos@upc.edu [Technical Univ. of Catalonia (Spain); Allison, C., E-mail: iss@cableone.net [Innovative Systems Software (United States)

    2011-07-01

    The RELAP/SCDAPSIM/MOD4.0 code, designed to predict the behavior of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology Software Development and Training Program (SDTP). RELAP/SCDAPSIM/MOD4.0, which is the first version of RELAP5 completely rewritten to FORTRAN 90/95/2000 standards, uses the publicly available RELAP5 and SCDAP models in combination with (a) advanced programming and numerical techniques, (b) advanced SDTP-member-developed models for LWR, HWR, and research reactor analysis, and (c) a variety of other member-developed computational packages. One such computational package is an integrated uncertainty analysis (IUA) package being developed jointly by the Technical University of Catalonia (UPC) and Innovative Systems Software (ISS). RELAP/SCDAPSIM/MOD4.0(IUA) follows the input-propagation approach using probability distribution functions to define the uncertainty of the input parameters. The main steps for this type of methodology, often referred to as statistical approaches or Wilks’ methods, are as follows: 1. Selection of the plant; 2. Selection of the scenario; 3. Selection of the safety criteria; 4. Identification and ranking of the relevant phenomena based on the safety criteria; 5. Selection of the appropriate code parameters to represent those phenomena; 6. Association of uncertainty by means of Probability Distribution Functions (PDFs) for each selected parameter; 7. Random sampling of the selected parameters according to their PDFs and performing multiple computer runs to obtain uncertainty bands with a certain percentile and confidence level; 8. Processing the results of the multiple computer runs to estimate the uncertainty bands for the computed quantities associated with the selected safety criteria. RELAP/SCDAPSIM/MOD4.0(IUA) calculates the number of required code runs given the desired percentile and confidence level, performs the sampling process for the

  13. Urban water supply infrastructure planning under predictive groundwater uncertainty: Bayesian updating and flexible design

    Science.gov (United States)

    Fletcher, S.; Strzepek, K.

    2017-12-01

    Many urban water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply, driven by short-term climate variability and long-term climate change. These uncertainties are often exacerbated in groundwater-dependent water systems due to the extra difficulty in measuring groundwater storage, recharge, and sustainable yield. Groundwater models are typically under-parameterized due to the high data requirements for calibration and limited data availability, leading to uncertainty in the models' predictions. We develop an integrated approach to urban water supply planning that combines predictive groundwater uncertainty analysis with adaptive water supply planning using multi-stage decision analysis. This allows us to compare the value of collecting additional groundwater data and reducing predictive uncertainty with the value of using water infrastructure planning that is flexible, modular, and can react quickly in response to unexpected changes in groundwater availability. We apply this approach to a case from Riyadh, Saudi Arabia. Riyadh relies on fossil groundwater aquifers and desalination for urban use. The main fossil aquifers incur minimal recharge and face depletion as a result of intense withdrawals for urban and agricultural use. As the water table declines and pumping becomes uneconomical, Riyadh will have to build new supply infrastructure, decrease demand, or increase the efficiency of its distribution system. However, poor groundwater characterization has led to severe uncertainty in aquifer parameters such as hydraulic conductivity, and therefore severe uncertainty in how the water table will respond to pumping over time and when these transitions will be necessary: the potential depletion time varies from approximately five years to 100 years. This case is an excellent candidate for flexible planning both because of its severity and the potential for
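
    A toy sketch of the Bayesian-updating ingredient of such an approach (the planning model itself is far richer): a normal prior on log hydraulic conductivity is updated with a few noisy observations, and the narrowed posterior is what later planning stages would act on; all numbers are hypothetical.

      import numpy as np

      # Prior on log10 hydraulic conductivity (m/day): mean and standard deviation
      mu0, sigma0 = 0.0, 1.0

      # Hypothetical noisy field observations with known measurement std
      obs = np.array([0.6, 0.3, 0.8])
      sigma_obs = 0.5

      # Conjugate normal-normal update (known observation variance)
      n = obs.size
      post_var = 1.0 / (1.0 / sigma0**2 + n / sigma_obs**2)
      post_mean = post_var * (mu0 / sigma0**2 + obs.sum() / sigma_obs**2)

      print(f"prior:     {mu0:.2f} +/- {sigma0:.2f}")
      print(f"posterior: {post_mean:.2f} +/- {np.sqrt(post_var):.2f}")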

  14. TREATING UNCERTAINTIES IN A NUCLEAR SEISMIC PROBABILISTIC RISK ASSESSMENT BY MEANS OF THE DEMPSTER-SHAFER THEORY OF EVIDENCE

    OpenAIRE

    Lo , Chung-Kung; Pedroni , N.; Zio , Enrico

    2014-01-01

    The analyses carried out within the Seismic Probabilistic Risk Assessments (SPRAs) of Nuclear Power Plants (NPPs) are affected by significant aleatory and epistemic uncertainties. These uncertainties have to be represented and quantified coherently with the data, information and knowledge available, to provide reasonable assurance that related decisions can be taken robustly and with confidence. The amount of data, information and knowledge available for seismic risk a...

  15. Review of uncertainty estimates associated with models for assessing the impact of breeder reactor radioactivity releases

    International Nuclear Information System (INIS)

    Miller, C.; Little, C.A.

    1982-08-01

    The purpose is to summarize estimates, based on currently available data, of the uncertainty associated with radiological assessment models. The models being examined herein are those recommended previously for use in breeder reactor assessments. Uncertainty estimates are presented for models of atmospheric and hydrologic transport, terrestrial and aquatic food-chain bioaccumulation, and internal and external dosimetry. Both long-term and short-term release conditions are discussed. The uncertainty estimates presented in this report indicate that, for many sites, generic models and representative parameter values may be used to calculate doses from annual average radionuclide releases when these calculated doses are on the order of one-tenth or less of a relevant dose limit. For short-term, accidental releases, especially those from breeder reactors located at sites dominated by complex terrain and/or coastal meteorology, the uncertainty in the dose calculations may be much larger than an order of magnitude. As a result, it may be necessary to incorporate site-specific information into the dose calculation under these circumstances to reduce this uncertainty. However, even using site-specific information, natural variability and the uncertainties in the dose conversion factor will likely result in an overall uncertainty of greater than an order of magnitude for predictions of dose or concentration in environmental media following short-term releases.

  16. Third International Workshop on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Kim, Y. S. (Editor); Rubin, Morton H. (Editor); Shih, Yan-Hua (Editor); Zachary, Woodford W. (Editor)

    1994-01-01

    The purpose of these workshops is to bring together an international selection of scientists to discuss the latest developments in Squeezed States in various branches of physics, and in the understanding of the foundations of quantum mechanics. At the third workshop, special attention was given to the influence that quantum optics is having on our understanding of quantum measurement theory. The fourth meeting in this series will be held in the People's Republic of China.

  17. SU-E-J-159: Analysis of Total Imaging Uncertainty in Respiratory-Gated Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, J; Okuda, T [Toyota memorial hospital, Toyota, Aichi (Japan); Sakaino, S; Yokota, N [Suzukake central hospital, Hamamatsu, Shizuoka (Japan)

    2015-06-15

    , an internal margin should be added to account for the total imaging uncertainty.

  18. SU-E-J-159: Analysis of Total Imaging Uncertainty in Respiratory-Gated Radiotherapy

    International Nuclear Information System (INIS)

    Suzuki, J; Okuda, T; Sakaino, S; Yokota, N

    2015-01-01

    , an internal margin should be added to account for the total imaging uncertainty

  19. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)
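
    For context, cross-section sensitivity/uncertainty estimates of this kind typically rest on the 'sandwich rule', which combines a sensitivity profile with a covariance matrix; a schematic Python sketch with made-up three-group numbers (not the ECN Petten code system) is:

      import numpy as np

      # Hypothetical 3-group relative sensitivity profile of a response R,
      # S_g = (dR/R) / (dsigma_g/sigma_g), and relative covariance matrix of sigma
      S = np.array([0.10, 0.40, 0.25])
      C = np.array([[0.04, 0.01, 0.00],
                    [0.01, 0.09, 0.02],
                    [0.00, 0.02, 0.16]])

      rel_var = S @ C @ S          # sandwich rule: var(R)/R^2 = S^T C S
      print(f"relative uncertainty of the response ~ {np.sqrt(rel_var):.1%}")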

  20. Prevalence of Internalized Homophobia and HIV Associated Risks ...

    African Journals Online (AJOL)

    This study assessed the level of internalized homophobia and associated factors among men who have sex with men (MSM) in Nigeria. Using respondent driven sampling, MSM were recruited in Lagos and Ibadan between July and September, 2006. Internalized homophobia was assessed as a negative composite score ...

  1. Unexpected uncertainty, volatility and decision-making

    Directory of Open Access Journals (Sweden)

    Amy Rachel Bland

    2012-06-01

    Full Text Available The study of uncertainty in decision making is receiving greater attention in the fields of cognitive and computational neuroscience. Several lines of evidence are beginning to elucidate different variants of uncertainty. Particularly, risk, ambiguity and expected and unexpected forms of uncertainty are well articulated in the literature. In this article we review both empirical and theoretical evidence arguing for the potential distinction between three forms of uncertainty; expected uncertainty, unexpected uncertainty and volatility. Particular attention will be devoted to exploring the distinction between unexpected uncertainty and volatility which has been less appreciated in the literature. This includes evidence from computational modelling, neuromodulation, neuroimaging and electrophysiological studies. We further address the possible differentiation of cognitive control mechanisms used to deal with these forms of uncertainty. Particularly we explore a role for conflict monitoring and the temporal integration of information into working memory. Finally, we explore whether the Dual Modes of Control theory provides a theoretical framework for understanding the distinction between unexpected uncertainty and volatility.

  2. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, also qualitative analysis is discussed shortly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  3. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, also qualitative analysis is discussed shortly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  4. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes...... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main...... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  5. INTERNAL AUDIT AND RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Elena RUSE

    2014-04-01

    Full Text Available The existence of risk in economic activity cannot be denied. In fact, risk is a concept present in every activity, the term risk being identified with uncertainty, that is, the (un)chance of an undesirable event occurring. Internal audit and risk management aim at the same goal, namely the control of risks. Internal audit performs several roles in the risk management plan. The objectives of the internal audit function vary from company to company, but in all economic entities the internal audit department aims to improve performance management and enterprise performance, and thus improve the internal control system. This paper aims to demonstrate, among other things, that any event that may result in failure is unquestionably classified as risk.

  6. Adaptively Addressing Uncertainty in Estuarine and Near Coastal Restoration Projects

    Energy Technology Data Exchange (ETDEWEB)

    Thom, Ronald M.; Williams, Greg D.; Borde, Amy B.; Southard, John A.; Sargeant, Susan L.; Woodruff, Dana L.; Laufle, Jeffrey C.; Glasoe, Stuart

    2005-03-01

    Restoration projects have an uncertain outcome because of a lack of information about current site conditions, historical disturbance levels, effects of landscape alterations on site development, unpredictable trajectories or patterns of ecosystem structural development, and many other factors. A poor understanding of the factors that control the development and dynamics of a system, such as hydrology, salinity, and wave energies, can also lead to an unintended outcome. Finally, lack of experience in restoring certain types of systems (e.g., rare or very fragile habitats) or systems in highly modified situations (e.g., highly urbanized estuaries) makes project outcomes uncertain. Because of these uncertainties, project costs can rise dramatically in an attempt to come closer to project goals. All of the potential sources of error can be addressed to a certain degree through adaptive management. The first step is admitting that these uncertainties can exist, and addressing as many of them as possible with planning and directed research prior to implementing the project. The second step is to evaluate uncertainties through hypothesis-driven experiments during project implementation. The third step is to use the monitoring program to evaluate and adjust the project as needed to improve the probability that the project will reach its goal. The fourth and final step is to use the information gained in the project to improve future projects. A framework that includes a clear goal statement, a conceptual model, and an evaluation framework can help in this adaptive restoration process. Projects and programs vary in their application of adaptive management in restoration, and it is very difficult to be highly prescriptive in applying adaptive management to projects that necessarily vary widely in scope, goal, ecosystem characteristics, and uncertainties. Very large ecosystem restoration programs in the Mississippi River delta (Coastal Wetlands Planning, Protection, and Restoration

  7. Propagation of experimental uncertainties using the Lipari-Szabo model-free analysis of protein dynamics

    International Nuclear Information System (INIS)

    Jin Danqing; Andrec, Michael; Montelione, Gaetano T.; Levy, Ronald M.

    1998-01-01

    In this paper we make use of the graphical procedure previously described [Jin, D. et al. (1997) J. Am. Chem. Soc., 119, 6923-6924] to analyze NMR relaxation data using the Lipari-Szabo model-free formalism. The graphical approach is advantageous in that it allows the direct visualization of the experimental uncertainties in the motional parameter space. Some general 'rules' describing the relationship between the precision of the relaxation measurements and the precision of the model-free parameters and how this relationship changes with the overall tumbling time (τm) are summarized. The effect of the precision in the relaxation measurements on the detection of internal motions not close to the extreme narrowing limit is analyzed. We also show that multiple timescale internal motions may be obscured by experimental uncertainty, and that the collection of relaxation data at very high field strength can improve the ability to detect such deviations from the simple Lipari-Szabo model

  8. Sensitivity, uncertainty, and importance analysis of a risk assessment

    International Nuclear Information System (INIS)

    Andsten, R.S.; Vaurio, J.K.

    1992-01-01

    In this paper a number of supplementary studies and applications associated with probabilistic safety assessment (PSA) are described, including sensitivity and importance evaluations of failures, errors, systems, and groups of components. The main purpose is to illustrate the usefulness of a PSA for making decisions about safety improvements, training, allowed outage times, and test intervals. A useful measure of uncertainty importance is presented, and it points out areas needing development, such as reactor vessel aging phenomena, for reducing overall uncertainty. A time-dependent core damage frequency is also presented, illustrating the impact of testing scenarios and intervals. The methods and applications presented are based on the Level 1 PSA carried out for the internal initiating events of the Loviisa 1 nuclear power station. Steam generator leakages and associated operator actions are major contributors to the current core-damage frequency estimate of 2 x 10^-4/yr. The results are used to improve the plant and procedures and to guide future improvements.

  9. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    Science.gov (United States)

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  10. Assessing the reliability of dose coefficients for exposure to radioiodine by members of the public, accounting for dosimetric and risk model uncertainties.

    Science.gov (United States)

    Puncher, M; Zhang, W; Harrison, J D; Wakeford, R

    2017-06-26

    Assessments of risk to a specific population group resulting from internal exposure to a particular radionuclide can be used to assess the reliability of the appropriate International Commission on Radiological Protection (ICRP) dose coefficients used as a radiation protection device for the specified exposure pathway. An estimate of the uncertainty on the associated risk is important for informing judgments on reliability; a derived uncertainty factor, UF, is an estimate of the 95% probable geometric difference between the best risk estimate and the nominal risk and is a useful tool for making this assessment. This paper describes the application of parameter uncertainty analysis to quantify uncertainties resulting from internal exposures to radioiodine by members of the public, specifically 1, 10 and 20-year old females from the population of England and Wales. Best estimates of thyroid cancer incidence risk (lifetime attributable risk) are calculated for ingestion or inhalation of 129 I and 131 I, accounting for uncertainties in biokinetic model and cancer risk model parameter values. These estimates are compared with the equivalent ICRP derived nominal age-, sex- and population-averaged estimates of excess thyroid cancer incidence to obtain UFs. Derived UF values for ingestion or inhalation of 131 I for 1 year, 10-year and 20-year olds are around 28, 12 and 6, respectively, when compared with ICRP Publication 103 nominal values, and 9, 7 and 14, respectively, when compared with ICRP Publication 60 values. Broadly similar results were obtained for 129 I. The uncertainties on risk estimates are largely determined by uncertainties on risk model parameters rather than uncertainties on biokinetic model parameters. An examination of the sensitivity of the results to the risk models and populations used in the calculations show variations in the central estimates of risk of a factor of around 2-3. It is assumed that the direct proportionality of excess thyroid cancer

  11. Travel itinerary uncertainty and the pre-travel consultation--a pilot study.

    Science.gov (United States)

    Flaherty, Gerard; Md Nor, Muhammad Najmi

    2016-01-01

    Risk assessment relies on the accuracy of the information provided by the traveller. A questionnaire was administered to 83 consecutive travellers attending a travel medicine clinic. The majority of travellers was uncertain about destinations within countries, transportation or type of accommodation. Most travellers were uncertain if they would be visiting malaria regions. The degree of uncertainty about itinerary potentially impacts on the ability of the travel medicine specialist to perform an adequate risk assessment, select appropriate vaccinations and prescribe malaria prophylaxis. This study reveals high levels of traveller uncertainty about their itinerary which may potentially reduce the effectiveness of their pre-travel consultation.

  12. Uncertainty in Measurement: Procedures for Determining Uncertainty With Application to Clinical Laboratory Calculations.

    Science.gov (United States)

    Frenkel, Robert B; Farrance, Ian

    2018-01-01

    The "Guide to the Expression of Uncertainty in Measurement" (GUM) is the foundational document of metrology. Its recommendations apply to all areas of metrology including metrology associated with the biomedical sciences. When the output of a measurement process depends on the measurement of several inputs through a measurement equation or functional relationship, the propagation of uncertainties in the inputs to the uncertainty in the output demands a level of understanding of the differential calculus. This review is intended as an elementary guide to the differential calculus and its application to uncertainty in measurement. The review is in two parts. In Part I, Section 3, we consider the case of a single input and introduce the concepts of error and uncertainty. Next we discuss, in the following sections in Part I, such notions as derivatives and differentials, and the sensitivity of an output to errors in the input. The derivatives of functions are obtained using very elementary mathematics. The overall purpose of this review, here in Part I and subsequently in Part II, is to present the differential calculus for those in the medical sciences who wish to gain a quick but accurate understanding of the propagation of uncertainties.
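
    As a hedged numerical companion to the single-input case treated in Part I (this worked example is not taken from the review itself; the Cockcroft-Gault clearance formula is used only as a familiar clinical calculation, and all input values and uncertainties are assumed):

      def crcl(scr_mg_dl, age_yr=60.0, weight_kg=70.0):
          # Cockcroft-Gault creatinine clearance (mL/min), male form
          return (140.0 - age_yr) * weight_kg / (72.0 * scr_mg_dl)

      scr, u_scr = 1.00, 0.05      # assumed measurement and its standard uncertainty
      h = 1.0e-6

      # Sensitivity coefficient dCrCl/dSCr by central finite difference
      c = (crcl(scr + h) - crcl(scr - h)) / (2.0 * h)
      u_crcl = abs(c) * u_scr      # single-input propagation: u(y) = |dy/dx| u(x)

      print(f"CrCl = {crcl(scr):.1f} mL/min, u(CrCl) = {u_crcl:.1f} mL/min")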

  13. Homogeneous internal wave turbulence driven by tidal flows

    Science.gov (United States)

    Le Reun, Thomas; Favier, Benjamin; Le Bars, Michael; Erc Fludyco Team

    2017-11-01

    We propose a novel investigation of the stability of strongly stratified planetary fluid layers undergoing periodic tidal distortion in the limit where rotational effects are negligible compared to buoyancy. With the help of a local model focusing on a small fluid area compared to the global layer, we find that periodic tidal distortion drives a parametric subharmonic resonance of internal waves. This instability saturates into a homogeneous internal wave turbulence pervading the whole fluid interior: the energy is injected into the unstable waves, which then feed a succession of triadic resonances also generating small spatial scales. As the timescale separation between the forcing frequency and the Brunt-Väisälä frequency is increased, the temporal spectrum of this turbulence displays a -2 power law reminiscent of the Garrett and Munk spectrum measured in the oceans (Garrett & Munk 1979). Moreover, in this state consisting of a superposition of waves in weak non-linear interaction, the mixing efficiency is increased compared to classical, Kolmogorov-like stratified turbulence. This study is of wide interest in geophysical fluid dynamics ranging from oceanic turbulence and tidal heating in icy satellites to dynamo action in partially stratified planetary cores, as could be the case in the Earth. We acknowledge support from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation program (Grant Agreement No. 681835-FLUDYCO-ERC-2015-CoG).

  14. On Robust Stability of Differential-Algebraic Equations with Structured Uncertainty

    Directory of Open Access Journals (Sweden)

    A. Kononov

    2018-03-01

    Full Text Available We consider a linear time-invariant system of differential-algebraic equations (DAE), which can be written as a system of ordinary differential equations with non-invertible coefficient matrices. An important characteristic of a DAE is the unsolvability index, which reflects the complexity of the internal structure of the system. The question of the asymptotic stability of a DAE containing uncertainty specified by a matrix norm is investigated. We consider a perturbation in the structured uncertainty case. It is assumed that the initial nominal system is asymptotically stable. For the analysis, the original equation is reduced to a structural form in which the differential and algebraic subsystems are separated. This structural form is equivalent to the input system in the sense of coincidence of the sets of solutions, and the operator transforming the DAE into the structural form possesses an inverse operator. The conversion to structural form does not use a change of variables. Regularity of the matrix pencil of the source equation is the necessary and sufficient condition for the existence of the structural form. Sufficient conditions are obtained under which perturbations do not break the internal structure of the nominal system. Under these conditions robust stability of the DAE with structured uncertainty is investigated. Estimates for the stability radius of the perturbed DAE system are obtained. The article proceeds from the simpler case, in which the perturbation affects only the unknown function, to a more complex one, in which the perturbation is also present in the derivative of the unknown function. Values of the real and complex stability radii of explicit ordinary differential equations are used to obtain the results. An example illustrating the obtained results is considered.
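
    For orientation, the class of systems in question can be written (LaTeX notation) as

      E \, \dot{x}(t) = A \, x(t) + f(t), \qquad \det E = 0,

    where regularity of the matrix pencil, \det(\lambda E - A) \not\equiv 0, is the condition cited above for the structural form to exist; for such regular systems, asymptotic stability of the nominal equation amounts to all finite eigenvalues of the pencil \lambda E - A having negative real parts (a standard characterization, stated here for context).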

  15. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    Science.gov (United States)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with another boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, shaped mainly by the growing public demand for predicting how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and to GCM model uncertainty, which becomes apparent at resolutions finer than the continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling increases uncertainty through the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally, the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Only a few studies have found that the predictive uncertainty of hydrological models can be in the same range as, or even larger than, the climatic uncertainty. We carried out a

  16. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  17. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss the new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances, and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.

  18. Accounting for Epistemic Uncertainty in Mission Supportability Assessment: A Necessary Step in Understanding Risk and Logistics Requirements

    Science.gov (United States)

    Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William

    2017-01-01

    Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
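
    The record does not give the details of the ISS Bayesian update; the sketch below illustrates the kind of conjugate update commonly used for failure rates, with a Gamma prior on the rate of a Poisson failure process. The model choice and all numbers are assumptions for illustration only.

      # Hypothetical Gamma-Poisson update of a component failure rate (per hour).
      # The prior encodes epistemic uncertainty from similarity analysis;
      # observed operating experience narrows it.
      from scipy import stats

      alpha0, beta0 = 2.0, 40_000.0     # assumed prior: mean 5e-5 /h, broad
      failures, hours = 3, 60_000.0     # assumed observed failures and exposure

      alpha1, beta1 = alpha0 + failures, beta0 + hours   # conjugate posterior

      prior = stats.gamma(alpha0, scale=1.0 / beta0)
      posterior = stats.gamma(alpha1, scale=1.0 / beta1)

      for name, d in [("prior", prior), ("posterior", posterior)]:
          print(f"{name:9s} mean {d.mean():.2e} /h, "
                f"95% interval ({d.ppf(0.025):.2e}, {d.ppf(0.975):.2e})")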

  19. The uncertainty budget in pharmaceutical industry

    DEFF Research Database (Denmark)

    Heydorn, Kaj

    of their uncertainty, exactly as described in GUM [2]. Pharmaceutical industry has therefore over the last 5 years shown increasing interest in accreditation according to ISO 17025 [3], and today uncertainty budgets are being developed for all so-called critical measurements. The uncertainty of results obtained...... that the uncertainty of a particular result is independent of the method used for its estimation. Several examples of uncertainty budgets for critical parameters based on the bottom-up procedure will be discussed, and it will be shown how the top-down method is used as a means of verifying uncertainty budgets, based...

  20. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  1. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  2. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    NARCIS (Netherlands)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if

  3. Governance Frameworks for International Public Goods: The Case of Concerted Entrepreneurship

    Science.gov (United States)

    Andersson, Thomas; Formica, Piero

    2007-01-01

    In the "participation age", emerging cross-border, transnational communities driven by innovation and entrepreneurship initiatives--in short, international entrepreneurial communities--give impetus to the rise of international public goods. With varying intensity, a non-voting international mobile public--still a small but an increasing fraction…

  4. Uncertainty, Social Location and Influence in Decision Making: A Sociometric Analysis

    OpenAIRE

    Michael L. Tushman; Elaine Romanelli

    1983-01-01

    This research investigates the relative impacts of formal status and informal communication roles on influence in administrative and technical decision making. While external information enters the organization via boundary spanning individuals, the exercise of influence at lower levels of the organization is dependent on mediating critical organizational contingencies. As the locus of task uncertainty shifts, so too does the relative influence of boundary spanning individuals and internal st...

  5. Structure of parallel-velocity-shear-driven mode in toroidal plasmas

    International Nuclear Information System (INIS)

    Dong, J.Q.; Xu, W.B.; Zhang, Y.Z.; Horton, W.

    1998-01-01

    It is shown that the Fourier-ballooning representation is appropriate for the study of short-wavelength drift-like perturbation in toroidal plasmas with a parallel velocity shear (PVS). The radial structure of the mode driven by a PVS is investigated in a torus. The Reynolds stress created by PVS turbulence, and proposed as one of the sources for a sheared poloidal plasma rotation, is analyzed. It is demonstrated that a finite ion temperature may strongly enhance the Reynolds stress creation ability from PVS-driven turbulence. The correlation of this observation with the requirement that ion heating power be higher than a threshold value for the formation of an internal transport barrier is discussed. copyright 1998 American Institute of Physics

  6. Uncertainty in geological and hydrogeological data

    Directory of Open Access Journals (Sweden)

    B. Nilsson

    2007-09-01

    Full Text Available Uncertainty in conceptual model structure and in environmental data is of essential interest when dealing with uncertainty in water resources management. To make quantification of uncertainty possible, it is necessary to identify and characterise the uncertainty in geological and hydrogeological data. This paper discusses a range of available techniques to describe the uncertainty related to geological model structure and scale of support. Literature examples on uncertainty in hydrogeological variables such as saturated hydraulic conductivity, specific yield, specific storage, effective porosity and dispersivity are given. Field data usually have a spatial and temporal scale of support that is different from the one on which numerical models for water resources management operate. Uncertainty in hydrogeological data variables is characterised and assessed within the methodological framework of the HarmoniRiB classification.

  7. Preface [HD3-2015: International meeting on high-dimensional data-driven science

    International Nuclear Information System (INIS)

    2016-01-01

    A never-ending series of innovations in measurement technology and evolutions in information and communication technologies have led to the ongoing generation and accumulation of large quantities of high-dimensional data every day. While detailed data-centric approaches have been pursued in respective research fields, situations have been encountered where the same mathematical framework of high-dimensional data analysis can be found in a wide variety of seemingly unrelated research fields, such as estimation on the basis of undersampled Fourier transform in nuclear magnetic resonance spectroscopy in chemistry, in magnetic resonance imaging in medicine, and in astronomical interferometry in astronomy. In such situations, bringing diverse viewpoints together therefore becomes a driving force for the creation of innovative developments in various different research fields. This meeting focuses on “Sparse Modeling” (SpM) as a methodology for creation of innovative developments through the incorporation of a wide variety of viewpoints in various research fields. The objective of this meeting is to offer a forum where researchers with interest in SpM can assemble and exchange information on the latest results and newly established methodologies, and discuss future directions of the interdisciplinary studies for High-Dimensional Data-Driven science (HD³). The meeting was held in Kyoto from 14-17 December 2015. We are pleased to publish 22 papers contributed by invited speakers in this volume of Journal of Physics: Conference Series. We hope that this volume will promote further development of High-Dimensional Data-Driven science. (paper)

  8. Justification for recommended uncertainties

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

    The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors that were assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: Underestimation of the correlations existing between the results of different measurements; The presence of unrecognized systematic uncertainties in the experimental data can lead to biases in the evaluated data as well as to underestimations of the resulting uncertainties; Uncertainties for correlated data cannot be characterized by percentage uncertainties or variances alone. Covariances between the evaluated value at 0.2 MeV and other points obtained in model (RAC R matrix and PADE2 analytical expansion) and non-model (GMA) fits of the ⁶Li(n,t) TEST1 data and the correlation coefficients are presented, and covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix) as obtained in EDA and RAC R matrix fits of the data available for reactions that pass through the formation of the ⁷Li system are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: Percentage uncertainties of the evaluated cross section for the ⁶Li(n,t) reaction and for the ²³⁵U(n,f) reaction; estimation given by CSEWG experts; GMA result with full GMA database, including experimental data for the ⁶Li(n,t), ⁶Li(n,n) and ⁶Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; EDA and RAC R matrix results, respectively. Uncertainties of absolute and ²⁵²Cf fission spectrum averaged cross section measurements, and deviations between measured and evaluated values for ²³⁵U(n,f) cross-sections in the neutron energy range 1

  9. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
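
    The derivation itself is not reproduced in this record; as background, sensitivity coefficients of this kind are typically combined with the nuclear-data covariance matrix through the first-order "sandwich" rule (a statement of the standard workflow, not of the specific MCNP6 implementation):

      \operatorname{Var}(R) \approx S^{\mathsf{T}} C_x\, S, \qquad S_i = \frac{\partial R}{\partial x_i},

    where R is the response of interest, x the nuclear data, and C_x their covariance matrix.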

  10. International Experiences and Frameworks to Support Country-Driven Low-Emissions Development

    Energy Technology Data Exchange (ETDEWEB)

    Benioff, R.; Cochran, J.; Cox, S.

    2012-08-01

    Countries can use low-emission development strategies (LEDS) to advance sustainable development, promote private-sector growth, and reduce greenhouse gas emissions. This paper proposes a framework -- or support infrastructure -- to enable the efficient exchange of LEDS-related knowledge and technical assistance. Under the proposed framework, countries share LEDS-related resources via coordinating forums, 'knowledge platforms,' and networks of experts and investors. The virtual 'knowledge platforms' foster learning by allowing countries to communicate with each other and share technical reports, data, and analysis tools in support of LEDS development. Investing in all elements of the framework in an integrated fashion increases the efficacy of support for country-driven LEDS.

  11. International Reaction to the Palestinian Unity Government

    National Research Council Canada - National Science Library

    Morro, Paul

    2007-01-01

    .... The international sanctions have not driven Hamas from power, and instead, some assert they may have provided an opening for Iran to increase its influence among Palestinians by filling the void...

  12. A review of the current state-of-the-art methodology for handling bias and uncertainty in performing criticality safety evaluations. Final report

    International Nuclear Information System (INIS)

    Disney, R.K.

    1994-10-01

    The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a 'site' perception to a more uniform or 'national' perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation

  13. Linear Programming Problems for Generalized Uncertainty

    Science.gov (United States)

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed for cases where the uncertainty is not a probability assigned to each realization. A well known model that can handle…

  14. One Strategy for Reducing Uncertainty in Climate Change Communications

    Science.gov (United States)

    Romm, J.

    2011-12-01

    Future impacts of climate change are invariably presented with a very wide range of impacts reflecting two different sets of uncertainties. The first concerns our uncertainty about precisely how much greenhouse gas emissions humanity will emit into the atmosphere. The second concerns our uncertainty about precisely what impact those emissions will have on the climate. By failing to distinguish between these two types of uncertainties, climate scientists have not clearly explained to the public and policymakers what the scientific literature suggests is likely to happen if we don't substantially alter our current emissions path. Indeed, much of climate communications has been built around describing the range of impacts from emissions paths that are increasingly implausible given political and technological constraints, such as a stabilization at 450 or 550 parts per million of atmospheric carbon dioxide. For the past decade, human emissions of greenhouse gases have trended near the worst-case scenarios of the Intergovernmental Panel on Climate Change, emissions paths that reach 800 ppm or even 1000 ppm. The current policies of the two biggest emitters, the United States and China, coupled with the ongoing failure of international negotiations to come to an agreement on restricting emissions, suggest that recent trends will continue for the foreseeable future. This in turn suggests that greater clarity in climate change communications could be achieved by more clearly explaining to the public what the scientific literature suggests the range of impacts is for our current high emissions path. This also suggests that more focus should be given in the scientific literature to better constraining the range of impacts from the high emissions scenarios.

  15. Introducing Blended Learning: An Experience of Uncertainty for Students in the United Arab Emirates

    Science.gov (United States)

    Kemp, Linzi J.

    2013-01-01

    The cultural dimension of Uncertainty Avoidance is analysed in this study of an introduction to blended learning for international students. Content analysis was conducted on the survey narratives collected from three cohorts of management undergraduates in the United Arab Emirates. Interpretation of certainty with blended learning was found in:…

  16. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Elert, M. [Kemakta Konsult AB, Stockholm (Sweden)] [ed.

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead, simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions in many cases are very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come into focus in this study: There are large differences in the predicted soil hydrology and as a consequence also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration. The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root

  17. Uncertainty propagation in a multiscale model of nanocrystalline plasticity

    International Nuclear Information System (INIS)

    Koslowski, M.; Strachan, Alejandro

    2011-01-01

    We characterize how uncertainties propagate across spatial and temporal scales in a physics-based model of nanocrystalline plasticity of fcc metals. Our model combines molecular dynamics (MD) simulations to characterize atomic-level processes that govern dislocation-based plastic deformation with a phase field approach to dislocation dynamics (PFDD) that describes how an ensemble of dislocations evolves and interacts to determine the mechanical response of the material. We apply this approach to a nanocrystalline Ni specimen of interest in micro-electromechanical (MEMS) switches. Our approach enables us to quantify how internal stresses that result from the fabrication process affect the properties of dislocations (using MD) and how these properties, in turn, affect the yield stress of the metallic membrane (using the PFDD model). Our predictions show that, for a nanocrystalline sample with small grain size (4 nm), a variation in residual stress of 20 MPa (typical in today's microfabrication techniques) would result in a variation in the critical resolved shear yield stress of approximately 15 MPa, a very small fraction of the nominal value of approximately 9 GPa. - Highlights: → Quantify how fabrication uncertainties affect yield stress in a microswitch component. → Propagate uncertainties in a multiscale model of single crystal plasticity. → Molecular dynamics quantifies how fabrication variations affect dislocations. → Dislocation dynamics relate variations in dislocation properties to yield stress.

  18. Measurement uncertainty: Friend or foe?

    Science.gov (United States)

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratory) may know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
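
    As a sketch of the combination implied along the traceability chain, under the usual GUM assumption of uncorrelated contributions u_i from the reference material, the calibrator value assignment and the end-user measurement (not a formula quoted from this article), the combined standard uncertainty is

      u_c = \sqrt{\textstyle\sum_i u_i^2} .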

  19. Comparison of the effect of hazard and response/fragility uncertainties on core melt probability uncertainty

    International Nuclear Information System (INIS)

    Mensing, R.W.

    1985-01-01

    This report proposes a method for comparing the effects of the uncertainty in probabilistic risk analysis (PRA) input parameters on the uncertainty in the predicted risks. The proposed method is applied to compare the effect of uncertainties in the descriptions of (1) the seismic hazard at a nuclear power plant site and (2) random variations in plant subsystem responses and component fragility on the uncertainty in the predicted probability of core melt. The PRA used is that developed by the Seismic Safety Margins Research Program

  20. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    Science.gov (United States)

    Lira, Ignacio

    2003-08-01

    Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being usable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes

  1. Uncertainty Characterization of Reactor Vessel Fracture Toughness

    International Nuclear Information System (INIS)

    Li, Fei; Modarres, Mohammad

    2002-01-01

    To perform fracture mechanics analysis of a reactor vessel, the fracture toughness (K_Ic) at various temperatures is necessary. In a best estimate approach, K_Ic uncertainties resulting from both lack of sufficient knowledge and randomness in some of the variables of K_Ic must be characterized. Although it may be argued that there is only one type of uncertainty, namely lack of perfect knowledge about the subject under study, as a matter of practice K_Ic uncertainties can be divided into two types: aleatory and epistemic. Aleatory uncertainty is related to uncertainty that is very difficult, if not impossible, to reduce; epistemic uncertainty, on the other hand, can be practically reduced. Distinction between aleatory and epistemic uncertainties facilitates decision-making under uncertainty and allows for proper propagation of uncertainties in the computation process. Typically, epistemic uncertainties representing, for example, parameters of a model are sampled (to generate a 'snapshot', a single value of the parameters), but the totality of aleatory uncertainties is carried through the calculation as available. In this paper a description of an approach to account for these two types of uncertainties associated with K_Ic is provided. (authors)
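
    A minimal sketch of the two-loop sampling that this separation enables, with epistemic parameters sampled in an outer loop and aleatory scatter carried through an inner loop; the lognormal toughness model and all numbers are hypothetical, not taken from the paper.

      # Two-loop Monte Carlo: outer loop samples epistemic parameter "snapshots",
      # inner loop carries the aleatory material scatter at fixed parameters.
      import numpy as np

      rng = np.random.default_rng(0)
      n_epistemic, n_aleatory = 200, 5000

      p05 = []
      for _ in range(n_epistemic):
          mu = rng.normal(4.0, 0.1)        # assumed log-mean of toughness
          sigma = rng.uniform(0.15, 0.30)  # assumed log-std (material scatter)
          k_ic = rng.lognormal(mu, sigma, size=n_aleatory)   # aleatory sample
          p05.append(np.percentile(k_ic, 5))  # e.g. 5th-percentile toughness

      lo, hi = np.percentile(p05, [2.5, 97.5])
      print(f"epistemic 95% band on the 5th-percentile toughness: "
            f"{lo:.1f} to {hi:.1f} MPa sqrt(m)")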

  2. Opportunity recognition and international new venture creation in University spin-offs

    DEFF Research Database (Denmark)

    Hannibal, Martin; Evers, Natasha; Servais, Per

    2016-01-01

    Extant research suggests that the founder’s activities and interactions are considered pivotal in driving the opportunity recognition process leading to international new venture emergence. This paper aims to explore the opportunity recognition process and international new venture emergence...... in the context of university high-technology spin-offs that are internationally market driven from inception. University spin-offs (USOs) are defined as ‘new firms created to exploit commercially some knowledge, technology or research results developed within a university’ (Pirnay et al., Small Bus Econ 21...... that the inventor-founders are typically engaged in opportunity recognition processes that are characterized as creative, driven by scientific innovations. It is indicated that the process of USO emergence and continuous development involves activities and interactions similar to typical international new ventures...

  3. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  4. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    Science.gov (United States)

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts, for pesticide risks to surface water organisms, validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty described will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  5. Uncertainty budget for k0-NAA

    International Nuclear Information System (INIS)

    Robouch, P.; Arana, G.; Eguskiza, M.; Etxebarria, N.

    2000-01-01

    The concepts of the Guide to the Expression of Uncertainty in Measurement (GUM) for chemical measurements and the recommendations of the Eurachem document 'Quantifying Uncertainty in Analytical Methods' are applied to set up the uncertainty budget for k0-NAA. The 'universally applicable spreadsheet technique' described by Kragten is applied to the k0-NAA basic equations for the computation of uncertainties. The variance components - individual standard uncertainties - highlight the contribution and the importance of the different parameters to be taken into account. (author)
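
    The Kragten technique itself amounts to shifting each input by its standard uncertainty and combining the resulting changes in the result in quadrature. The sketch below applies it to a hypothetical stand-in measurement function, not to the actual k0-NAA equations.

      # Kragten-style numerical propagation of uncertainty.
      import math

      def kragten(f, x, u):
          """Return (y, u_y, contributions) for y = f(**x) with uncertainties u."""
          y0 = f(**x)
          contrib = {}
          for name, u_i in u.items():
              x_pert = dict(x)
              x_pert[name] = x[name] + u_i        # shift one input by u(x_i)
              contrib[name] = f(**x_pert) - y0    # approximates (df/dx_i) * u(x_i)
          u_y = math.sqrt(sum(c * c for c in contrib.values()))
          return y0, u_y, contrib

      # Hypothetical example: result proportional to Np / (k0 * eps * m).
      f = lambda Np, k0, eps, m: Np / (k0 * eps * m)
      x = dict(Np=1.25e5, k0=0.85, eps=3.2e-3, m=0.105)
      u = dict(Np=4.0e2, k0=0.02, eps=1.0e-4, m=0.001)

      y, u_y, contrib = kragten(f, x, u)
      print(f"result {y:.4g} +/- {u_y:.2g}")
      for name, c in contrib.items():
          print(f"  {name}: {100 * (c / u_y) ** 2:.0f}% of the variance")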

  6. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

    Treatment of uncertainty, or in other words reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have looked at particularly ingenious ways of proceeding in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. Those methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information required by quantitative methods. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management, that is, recognition and evaluation of uncertainties associated with PA followed by planning and implementation of measures to reduce them, is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem

  7. Accelerator driven systems: Energy generation and transmutation of nuclear waste. Status report

    International Nuclear Information System (INIS)

    1997-11-01

    The report includes 31 individual contributions by experts from six countries and two international organizations in different areas of the accelerator driven transmutation technology intended to be applied for the treatment of highly radioactive waste and power generation. A separate abstract was prepared for each paper

  8. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    , and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing...... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  9. Entropic uncertainty relations-a survey

    International Nuclear Information System (INIS)

    Wehner, Stephanie; Winter, Andreas

    2010-01-01

    Uncertainty relations play a central role in quantum mechanics. Entropic uncertainty relations in particular have gained significant importance within quantum information, providing the foundation for the security of many quantum cryptographic protocols. Yet, little is known about entropic uncertainty relations with more than two measurement settings. In the present survey, we review known results and open questions.
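
    As background, the best-known two-measurement example of such a relation is the Maassen-Uffink bound, quoted here for orientation; the open questions highlighted by the survey concern settings beyond two measurements:

      H(X) + H(Z) \ge \log_2 \frac{1}{c}, \qquad c = \max_{x,z} \bigl| \langle x | z \rangle \bigr|^2 ,

    where H denotes the Shannon entropy of the outcome distribution and |x>, |z> are eigenstates of the two observables.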

  10. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result

  11. Accelerator-driven nuclear synergetic systems-an overview of the research activities in Sweden

    International Nuclear Information System (INIS)

    Conde, H.; Baecklin, A.; Carius, S.

    1995-01-01

    The rapid development of the accelerator technology which enables the construction of reliable and very intense neutron sources has initiated a growing interest in accelerator-driven transmutation systems in Sweden. After the Specialist Meeting on Accelerator-Driven Transmutation Technology for Radwaste and other Applications on 24-28 June 1991 at Saltsjoebaden, Sweden, the research activities oriented towards accelerator-driven systems have been started at several research centers in Sweden. Also the governmental agencies responsible for the spent fuel policy showed a positive attitude to these activities through limited financial support, particularly for studies of the safety aspects of these systems. Also the nuclear power industry and utilities show a positive interest in the research on these concepts. The present paper presents an overview of the Swedish research activities on accelerator-driven systems and the proposed future coordination, organizations and prospects for this research in the context of the national nuclear energy and spent fuel policy. The Swedish perspective for international cooperation is also described

  12. Accelerator-driven nuclear synergetic systems-an overview of the research activities in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Conde, H.; Baecklin, A.; Carius, S. [Uppsala Univ. (Sweden)] [and others

    1995-10-01

    The rapid development of the accelerator technology which enables the construction of reliable and very intense neutron sources has initiated a growing interest in accelerator-driven transmutation systems in Sweden. After the Specialist Meeting on Accelerator-Driven Transmutation Technology for Radwaste and other Applications on 24-28 June 1991 at Saltsjoebaden, Sweden, the research activities oriented towards accelerator-driven systems have been started at several research centers in Sweden. Also the governmental agencies responsible for the spent fuel policy showed a positive attitude to these activities through limited financial support, particularly for studies of the safety aspects of these systems. Also the nuclear power industry and utilities show a positive interest in the research on these concepts. The present paper presents an overview of the Swedish research activities on accelerator-driven systems and the proposed future coordination, organizations and prospects for this research in the context of the national nuclear energy and spent fuel policy. The Swedish perspective for international cooperation is also described.

  13. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    Science.gov (United States)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert

    2016-06-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if other factors are involved. With added experience, uncertainty could be expected to diminish, but perhaps more influential are things like physician confidence, belief in the veracity of what is published, and even one's religious beliefs. In addition, it is plausible that the kind of practice a physician works in can affect the experience of uncertainty. Practicing physicians may not be immediately aware of these effects on how uncertainty is experienced in their clinical decision-making. We asked: (1) Does uncertainty and overconfidence bias decrease with years of practice? (2) What sociodemographic factors are independently associated with less recognition of uncertainty, in particular belief in God or other deity or deities, and how is atheism associated with recognition of uncertainty? (3) Do confidence bias (confidence that one's skill is greater than it actually is), degree of trust in the orthopaedic evidence, and degree of statistical sophistication correlate independently with recognition of uncertainty? We created a survey to establish an overall recognition of uncertainty score (four questions), trust in the orthopaedic evidence base (four questions), confidence bias (three questions), and statistical understanding (six questions). Seven hundred six members of the Science of Variation Group, a collaboration that aims to study variation in the definition and treatment of human illness, were approached to complete our survey. This group represents mainly orthopaedic surgeons specializing in trauma or hand and wrist surgery, practicing in Europe and North America, of whom the majority is involved in teaching. Approximately half of the group has more than 10 years

  14. Uncertainty quantification in resonance absorption

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2012-01-01

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances in ²³²Th. Simulation, quadrature and polynomial chaos methods are used, and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are pdfs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.

  15. International New Venture Legitimation

    DEFF Research Database (Denmark)

    Turcan, Romeo V.

    2013-01-01

    the process of their emergence. It is a longitudinal, multiple-case study research that employs critical incident technique for data collection, analysis and interpretation. Following theory driven sampling, five international new ventures were selected that were operating in the software sector in the UK......There is limited theoretical understanding and empirical evidence for how international new ventures legitimate. Drawing from legitimation theory, this study fills in this gap by exploring how international new ventures legitimate and strive for survival in the face of critical events during......, and had internationalized and struggled for survival during the dotcom era. Grounded in data, this study corroborates a number of legitimation strategies yielded by prior research and refutes others. It further contributes to our understanding of international new venture legitimation by suggesting new...

  16. Technical Note: Atmospheric CO2 inversions on the mesoscale using data-driven prior uncertainties: methodology and system evaluation

    Science.gov (United States)

    Kountouris, Panagiotis; Gerbig, Christoph; Rödenbeck, Christian; Karstens, Ute; Koch, Thomas Frank; Heimann, Martin

    2018-03-01

    Atmospheric inversions are widely used in the optimization of surface carbon fluxes on a regional scale using information from atmospheric CO2 dry mole fractions. In many studies the prior flux uncertainty applied to the inversion schemes does not directly reflect the true flux uncertainties but is used to regularize the inverse problem. Here, we aim to implement an inversion scheme using the Jena inversion system and applying a prior flux error structure derived from a model-data residual analysis using high spatial and temporal resolution over a full year period in the European domain. We analyzed the performance of the inversion system with a synthetic experiment, in which the flux constraint is derived following the same residual analysis but applied to the model-model mismatch. The synthetic study showed quite good agreement between posterior and true fluxes on European, country, annual and monthly scales. Posterior monthly and country-aggregated fluxes improved their correlation coefficient with the known truth by 7 % relative to the prior estimates when both are compared to the reference, with a mean correlation of 0.92. The ratio of the SD between the posterior and reference and between the prior and reference was also reduced by 33 %, with a mean value of 1.15. We identified temporal and spatial scales on which the inversion system maximizes the derived information; monthly temporal scales at around 200 km spatial resolution seem to maximize the information gain.

  17. Mechanics and uncertainty

    CERN Document Server

    Lemaire, Maurice

    2014-01-01

    Science is a quest for certainty, but lack of certainty is the driving force behind all of its endeavors. This book, specifically, examines the uncertainty of technological and industrial science. Uncertainty and Mechanics studies the concepts of mechanical design in an uncertain setting and explains engineering techniques for inventing cost-effective products. Though it references practical applications, this is a book about ideas and potential advances in mechanical science.

  18. Needs of the CSAU uncertainty method

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2000-01-01

    The use of best estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods were proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to address those needs. The specific procedural steps were combined from other methods for uncertainty evaluation, and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach to the small-break loss-of-coolant accident (SB LOCA) show that the adapted CSAU can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, it was indicated that there are still some limitations in the CSAU approach that need to be resolved. (author)

  19. Uncertainty in social dilemmas

    NARCIS (Netherlands)

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size

  20. A method for calculating Bayesian uncertainties on internal doses resulting from complex occupational exposures

    International Nuclear Information System (INIS)

    Puncher, M.; Birchall, A.; Bull, R. K.

    2012-01-01

    Estimating uncertainties on doses from bioassay data is of interest in epidemiology studies that estimate cancer risk from occupational exposures to radionuclides. Bayesian methods provide a logical framework to calculate these uncertainties. However, occupational exposures often consist of many intakes, and this can make the Bayesian calculation computationally intractable. This paper describes a novel strategy for increasing the computational speed of the calculation by simplifying the intake pattern to a single composite intake, termed the complex intake regime (CIR). In order to assess whether this approximation is accurate and fast enough for practical purposes, the method is implemented by the Weighted Likelihood Monte Carlo Sampling (WeLMoS) method and evaluated by comparing its performance with a Markov Chain Monte Carlo (MCMC) method. The MCMC method gives the full solution (all intakes are independent), but is very computationally intensive to apply routinely. Posterior distributions of model parameter values, intakes and doses are calculated for a representative sample of plutonium workers from the United Kingdom Atomic Energy cohort using the WeLMoS method with the CIR and the MCMC method. The distributions are in good agreement: posterior means and Q_0.025 and Q_0.975 quantiles are typically within 20 %. Furthermore, the WeLMoS method using the CIR converges quickly: a typical case history takes around 10-20 min on a fast workstation, whereas the MCMC method takes around 12 hr. The advantages and disadvantages of the method are discussed. (authors)

  1. The same as it never was? Uncertainty and the changing contours of international law

    NARCIS (Netherlands)

    Kessler, Oliver

    2011-01-01

    International law has changed significantly since the end of the Cold War. As long as the international was thought to be populated by sovereign states predominantly, international law was conceived of as a means for peaceful dispute settlement. That is: the reference to state sovereignty not only

  2. CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-07-01

    Any measured or assessed quantity contains uncertainty. The quantitative estimation of such uncertainty is becoming increasingly important, especially in assuring that safety requirements are met in design, regulation, and operation of nuclear installations. The CEC/USDOE Workshop on Uncertainty Analysis, held in Santa Fe, New Mexico, on November 13 through 16, 1989, was organized jointly by the Commission of European Communities (CEC's) Radiation Protection Research program, dealing with uncertainties throughout the field of consequence assessment, and DOE's Atmospheric Studies in Complex Terrain (ASCOT) program, concerned with the particular uncertainties in time and space variant transport and dispersion. The workshop brought together US and European scientists who have been developing or applying uncertainty analysis methodologies, conducted in a variety of contexts, often with incomplete knowledge of the work of others in this area. Thus, it was timely to exchange views and experience, identify limitations of approaches to uncertainty and possible improvements, and enhance the interface between developers and users of uncertainty analysis methods. Furthermore, the workshop considered the extent to which consistent, rigorous methods could be used in various applications within consequence assessment. 3 refs

  3. Refinement of the concept of uncertainty.

    Science.gov (United States)

    Penrod, J

    2001-04-01

    To analyse the conceptual maturity of uncertainty; to develop an expanded theoretical definition of uncertainty; to advance the concept using methods of concept refinement; and to analyse congruency with the conceptualization of uncertainty presented in the theory of hope, enduring, and suffering. Uncertainty is of concern in nursing as people experience complex life events surrounding health. In an earlier nursing study that linked the concepts of hope, enduring, and suffering into a single theoretical scheme, a state best described as 'uncertainty' arose. This study was undertaken to explore how this conceptualization fit with the scientific literature on uncertainty and to refine the concept. Initially, a concept analysis using advanced methods described by Morse, Hupcey, Mitcham and colleagues was completed. The concept was determined to be partially mature. A theoretical definition was derived and techniques of concept refinement using the literature as data were applied. The refined concept was found to be congruent with the concept of uncertainty that had emerged in the model of hope, enduring and suffering. Further investigation is needed to explore the extent of probabilistic reasoning and the effects of confidence and control on feelings of uncertainty and certainty.

  4. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    Limperopoulos, G.J.

    1995-01-01

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk, but does not take into consideration flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two other explanatory variables, the interest rate and Gross Domestic Product, improves the results. 36 refs., 18 figs., 6 tabs
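
    For reference, a minimal sketch of the Black-Scholes call value used in the real-options view of project flexibility: the value of the developed reserves plays the role of the underlying asset and the development cost plays the role of the strike. The function name and all input numbers are illustrative assumptions, not figures from the report.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(S, K, r, sigma, T):
    """European call value: S = underlying (project) value, K = strike (development cost),
    r = risk-free rate, sigma = volatility of the underlying, T = time to expiry in years."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Illustrative numbers (assumed): value of developed reserves vs. development cost.
print(black_scholes_call(S=120.0, K=100.0, r=0.05, sigma=0.35, T=3.0))
```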

  5. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Science.gov (United States)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk, and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design using well-understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as Uncertainty-Importance analyses, used to identify components that are significant contributors to uncertainty, are rendered obsolete, since sensitivity to changes in uncertainty is not reflected in the propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.
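
    One mechanism often cited in PRA for this kind of underestimation, assumed here purely for illustration (the paper's own argument may differ), is state-of-knowledge correlation: when the same epistemic uncertainty applies to several similar components, sampling each component independently lets the errors cancel at the system level, whereas a fully correlated treatment preserves them. The toy redundancy logic and lognormal numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Epistemic uncertainty on the failure probability of one engine type, shared by
# four redundant engines: lognormal, median 1e-3, error factor ~3 (all illustrative).
median = 1e-3
sigma = np.log(3.0) / 1.645          # error factor of 3 at the 95th percentile

# Toy system logic: loss of mission only if all four redundant engines fail.

# (a) Independent sampling: each engine gets its own epistemic draw.
p_each = rng.lognormal(np.log(median), sigma, size=(n, 4))
p_sys_indep = np.prod(p_each, axis=1)

# (b) State-of-knowledge correlation: one shared draw used for all four engines.
p_shared = rng.lognormal(np.log(median), sigma, size=n)
p_sys_corr = p_shared ** 4

for name, p in [("independent draws", p_sys_indep), ("correlated draws ", p_sys_corr)]:
    print(f"{name}: mean = {p.mean():.3e}, 95th percentile = {np.percentile(p, 95):.3e}")
```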

  6. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  7. Impact of geometric uncertainties on evaluation of treatment techniques for prostate cancer

    International Nuclear Information System (INIS)

    Craig, Tim; Wong, Eugene; Bauman, Glenn; Battista, Jerry; Van Dyk, Jake

    2005-01-01

    Purpose: To assess the impact of patient repositioning and internal organ motion on prostate treatment plans using three-dimensional conformal and intensity-modulated radiotherapy. Methods and materials: Four-field, six-field, and simplified intensity-modulated arc therapy plans were generated for 5 prostate cancer patients. The planning target volume was created by adding a 1-cm margin to the clinical target volume. A convolution model was used to estimate the effect of random geometric uncertainties during treatment. Dose statistics, tumor control probabilities, and normal tissue complication probabilities were compared with and without the presence of uncertainty. The impact of systematic uncertainties was also investigated. Results: Compared with the planned treatments, the delivered dose distribution with random geometric uncertainties displayed an increase in the apparent minimal dose to the prostate and seminal vesicles and a decrease in the rectal volume receiving a high dose. This increased the tumor control probabilities and decreased the normal tissue complication probabilities. Changes were seen in the percentage of prostate volume receiving 100% and 95% of the prescribed dose, and the minimal dose and tumor control probabilities for the target volume. In addition, the volume receiving at least 65 Gy, the minimal dose, and normal tissue complication probabilities changed considerably for the rectum. The simplified intensity-modulated arc therapy technique was the most sensitive to systematic errors, especially in the anterior-posterior and superior-inferior directions. Conclusion: Geometric uncertainties should be considered when evaluating treatment plans. Contrary to the widely held belief, increased conformation of the dose distribution is not always associated with increased sensitivity to random geometric uncertainties if a sufficient planning target volume margin is used. Systematic errors may have a variable effect, depending on the treatment
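
    A minimal sketch of the convolution idea referenced above: blur a static one-dimensional dose profile with a Gaussian density of random per-fraction displacements to approximate the expected delivered dose. The profile, margin, and 3 mm standard deviation are illustrative assumptions, not the study's treatment plans.

```python
import numpy as np

# 1D static dose profile along one axis (illustrative): 100% dose across a
# 5 cm clinical target volume plus 1 cm margins, zero outside.
x = np.arange(-8.0, 8.0, 0.1)                    # position in cm
planned = np.where(np.abs(x) <= 3.5, 100.0, 0.0)

# Gaussian kernel describing random per-fraction setup displacement, sigma = 3 mm.
sigma = 0.3                                      # cm
xk = np.arange(-1.5, 1.51, 0.1)                  # symmetric support centered on zero
kernel = np.exp(-0.5 * (xk / sigma) ** 2)
kernel /= kernel.sum()

# Expected delivered dose = planned dose convolved with the displacement density.
delivered = np.convolve(planned, kernel, mode="same")

# Minimum dose over the 5 cm clinical target volume, before and after blurring.
ctv = np.abs(x) <= 2.5
print(f"planned CTV minimum dose:  {planned[ctv].min():.1f}%")
print(f"expected CTV minimum dose: {delivered[ctv].min():.1f}%")
```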

  8. Greenhouse gas scenario sensitivity and uncertainties in precipitation projections for central Belgium

    Science.gov (United States)

    Van Uytven, E.; Willems, P.

    2018-03-01

    Climate change impact assessment on meteorological variables involves large uncertainties as a result of incomplete knowledge of future greenhouse gas concentrations and climate model physics, next to the inherent internal variability of the climate system. Given that the alteration in greenhouse gas concentrations is the driver for the change, one expects the impacts to be highly dependent on the considered greenhouse gas scenario (GHS). In this study, we denote this behavior as GHS sensitivity. Due to the climate-model-related uncertainties, this sensitivity is, at the local scale, not always as strong as expected. This paper aims to study the GHS sensitivity and its contribution to climate scenarios for a case study in Belgium. An ensemble of 160 CMIP5 climate model runs is considered, and climate change signals are studied for precipitation accumulation, daily precipitation intensities and wet day frequencies. This was done for the different seasons of the year and the scenario periods 2011-2040, 2031-2060, 2051-2081 and 2071-2100. By means of variance decomposition, the total variance in the climate change signals was separated into the contribution of the differences in GHSs and the contributions of the other model-related uncertainty sources. These contributions were found to depend on the variable and season. Following the time of emergence concept, the GHS uncertainty contribution depends on the time horizon and increases over time. For the most distant time horizon (2071-2100), the climate model uncertainty accounts for the largest uncertainty contribution. The GHS differences explain up to 18% of the total variance in the climate change signals. The results further point to the importance of the climate model ensemble design, specifically the ensemble size and the combination of climate models upon which climate scenarios are based. The numerical noise introduced at scales smaller than the skillful scale, e.g. at the local scale, was not considered in this study.
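
    A minimal sketch of the variance decomposition step, assuming a simple one-way (ANOVA-style) split of ensemble variance into a between-scenario component and a residual, model-related component. The scenario labels and synthetic change signals are illustrative, not the CMIP5 ensemble used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative ensemble: climate change signals (% change in seasonal precipitation)
# for runs grouped by greenhouse gas scenario (group names and values are invented).
signals = {
    "rcp2.6": rng.normal(3.0, 6.0, 40),
    "rcp4.5": rng.normal(5.0, 6.0, 60),
    "rcp8.5": rng.normal(9.0, 6.0, 60),
}

all_runs = np.concatenate(list(signals.values()))
grand_mean = all_runs.mean()
total_var = all_runs.var()

# Between-scenario variance: spread of scenario means around the grand mean.
between = sum(len(v) * (v.mean() - grand_mean) ** 2 for v in signals.values()) / len(all_runs)

# Residual (climate-model and internal-variability) variance: spread within scenarios.
within = sum(len(v) * v.var() for v in signals.values()) / len(all_runs)

print(f"GHS contribution to total variance: {100 * between / total_var:.1f}%")
print(f"model/other contribution:           {100 * within / total_var:.1f}%")
```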

  9. Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions

    Science.gov (United States)

    Jung, J. Y.; Niemann, J. D.; Greimann, B. P.

    2016-12-01

    Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.
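
    A compact sketch of the general approach, under stated assumptions: a random-walk Metropolis sampler in which a bias on an uncertain inflow record is estimated alongside a model parameter. The toy forward model stands in for SRH-1D, and the priors, error magnitudes, and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy forward model: predicted bed elevation change driven by an uncertain transport
# coefficient k and an uncertain bias on the reported inflow record (stand-in for SRH-1D).
q_inflow = np.array([100.0, 150.0, 220.0, 180.0])   # reported flowrates (m3/s)
obs = np.array([0.11, 0.17, 0.26, 0.20])            # observed bed change (m)

def predict(k, q_bias):
    return k * (q_inflow * (1.0 + q_bias)) / 1000.0

def log_post(theta):
    k, q_bias = theta
    if k <= 0:
        return -np.inf
    # Priors: broad lognormal on k, Gaussian on the inflow bias (mean 0, sd 10%).
    lp = -0.5 * (np.log(k) / 1.0) ** 2 - 0.5 * (q_bias / 0.10) ** 2
    resid = obs - predict(k, q_bias)
    return lp - 0.5 * np.sum((resid / 0.02) ** 2)    # assumed 2 cm observation error

# Random-walk Metropolis over (k, q_bias).
theta = np.array([1.0, 0.0])
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0, [0.05, 0.01])
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples)[5000:]                   # discard burn-in
print("posterior mean (k, inflow bias):", samples.mean(axis=0))
```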

  10. Accelerator driven systems: Energy generation and transmutation of nuclear waste. Status report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-11-01

    The report includes 31 individual contributions by experts from six countries and two international organizations in different areas of the accelerator driven transmutation technology intended to be applied for the treatment of highly radioactive waste and power generation. A separate abstract was prepared for each paper. Refs, figs, tabs.

  11. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  12. An examination of the relationships among uncertainty, appraisal, and information-seeking behavior proposed in uncertainty management theory.

    Science.gov (United States)

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory (UMT; Brashers, 2001, 2007) is rooted in the assumption that, as opposed to being inherently negative, health-related uncertainty is appraised for its meaning. Appraisals influence subsequent behaviors intended to manage uncertainty, such as information seeking. This study explores the connections among uncertainty, appraisal, and information-seeking behavior proposed in UMT. A laboratory study was conducted in which participants (N = 157) were primed to feel and desire more or less uncertainty about skin cancer and were given the opportunity to search for skin cancer information using the World Wide Web. The results show that desired uncertainty level predicted appraisal intensity, and appraisal intensity predicted information-seeking depth, although the latter relationship was in the opposite direction of what was expected.

  13. Pricing of medical devices under coverage uncertainty--a modelling approach.

    Science.gov (United States)

    Girling, Alan J; Lilford, Richard J; Young, Terry P

    2012-12-01

    Product vendors and manufacturers are increasingly aware that purchasers of health care will fund new clinical treatments only if they are perceived to deliver value-for-money. This influences companies' internal commercial decisions, including the price they set for their products. Other things being equal, there is a price threshold, which is the maximum price at which the device will be funded and which, if its value were known, would play a central role in price determination. This paper examines the problem of pricing a medical device from the vendor's point of view in the presence of uncertainty about what the price threshold will be. A formal solution is obtained by maximising the expected value of the net revenue function, assuming a Bayesian prior distribution for the price threshold. A least admissible price is identified. The model can also be used as a tool for analysing proposed pricing policies when no formal prior specification of uncertainty is available. Copyright © 2011 John Wiley & Sons, Ltd.
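
    A minimal sketch of the pricing logic described above, under illustrative assumptions: expected net revenue at a candidate price is the margin, times the sales volume, times the prior probability that the price does not exceed the uncertain funding threshold, and the vendor picks the price that maximizes it. The lognormal prior, cost, and volume figures are hypothetical, not the paper's model.

```python
import numpy as np
from statistics import NormalDist

# Assumed prior for the (unknown) price threshold: lognormal with median £3,000.
prior_median, prior_gsd = 3000.0, 1.4

def p_funded(price):
    """Prior probability that the device is funded at a given price."""
    z = (np.log(price) - np.log(prior_median)) / np.log(prior_gsd)
    return 1.0 - NormalDist().cdf(float(z))

unit_cost, volume = 800.0, 10_000          # illustrative production cost and sales volume

prices = np.linspace(unit_cost, 8000.0, 2000)
expected_revenue = (prices - unit_cost) * volume * np.array([p_funded(p) for p in prices])

best = prices[np.argmax(expected_revenue)]
print(f"expected-revenue-maximising price: £{best:,.0f}")
print(f"probability of funding at that price: {p_funded(best):.2f}")
```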

  14. The uncertainty of crop yield projections is reduced by improved temperature response functions

    DEFF Research Database (Denmark)

    Wang, Enli; Martre, Pierre; Zhao, Zhigan

    2017-01-01

    , we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for >50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature......Quality) and analysing their results against the HSC data and an additional global dataset from the International Heat Stress Genotype Experiment (IHSGE) carried out by the International Maize and Wheat Improvement Center (CIMMYT). More importantly, we derive, based on newest knowledge and data, a set of new...

  15. Uncertainty estimation with a small number of measurements, part II: a redefinition of uncertainty and an estimator method

    Science.gov (United States)

    Huang, Hening

    2018-01-01

    This paper is the second (Part II) in a series of two papers (Part I and Part II). Part I has quantitatively discussed the fundamental limitations of the t-interval method for uncertainty estimation with a small number of measurements. This paper (Part II) reveals that the t-interval is an ‘exact’ answer to a wrong question; it is actually misused in uncertainty estimation. This paper proposes a redefinition of uncertainty, based on the classical theory of errors and the theory of point estimation, and a modification of the conventional approach to estimating measurement uncertainty. It also presents an asymptotic procedure for estimating the z-interval. The proposed modification is to replace the t-based uncertainty with an uncertainty estimator (mean- or median-unbiased). The uncertainty estimator method is an approximate answer to the right question to uncertainty estimation. The modified approach provides realistic estimates of uncertainty, regardless of whether the population standard deviation is known or unknown, or if the sample size is small or large. As an application example of the modified approach, this paper presents a resolution to the Du-Yang paradox (i.e. Paradox 2), one of the three paradoxes caused by the misuse of the t-interval in uncertainty estimation.
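
    A small numerical illustration of the contrast drawn in the paper, assuming one simple choice of estimator (the c4 small-sample correction that makes the sample standard deviation mean-unbiased for normal data); the paper's own uncertainty estimator and z-interval procedure may differ in detail.

```python
import math
from statistics import stdev
from scipy import stats

def c4(n):
    """Factor such that E[s] = c4 * sigma for a normal sample of size n."""
    return math.sqrt(2.0 / (n - 1)) * math.gamma(n / 2) / math.gamma((n - 1) / 2)

data = [10.2, 9.8, 10.5, 9.9]             # illustrative small sample, n = 4
n = len(data)
s = stdev(data)
u = s / math.sqrt(n)                       # conventional standard uncertainty of the mean

# Conventional t-based expanded uncertainty (95%, n-1 degrees of freedom).
t95 = stats.t.ppf(0.975, n - 1)
print(f"t-based 95% half-width:             {t95 * u:.3f}")

# Mean-unbiased estimate of the standard uncertainty, and a z-based half-width.
u_unbiased = u / c4(n)
print(f"mean-unbiased standard uncertainty: {u_unbiased:.3f}")
print(f"z-based 95% half-width:             {1.96 * u_unbiased:.3f}")
```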

  16. Field Performance of Inverter-Driven Heat Pumps in Cold Climates

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, James [Consortium of Advanced Residential Buildings, Norwalk, CT (United States); Aldrich, Robb [Consortium of Advanced Residential Buildings, Norwalk, CT (United States)

    2015-08-19

    Traditionally, air-source heat pumps (ASHPs) have been used more often in warmer climates; however, some new ASHPs are gaining ground in colder areas. These systems operate at subzero (Fahrenheit) temperatures and many do not include backup electric resistance elements. There are still uncertainties, however, about capacity and efficiency in cold weather. Also, questions such as “how cold is too cold?” do not have clear answers. These uncertainties could lead to skepticism among homeowners; poor energy savings estimates; suboptimal system selection by heating, ventilating, and air-conditioning contractors; and inconsistent energy modeling. In an effort to better understand and characterize the heating performance of these units in cold climates, the U.S. Department of Energy Building America team, Consortium for Advanced Residential Buildings (CARB), monitored seven inverter-driven, ductless ASHPs across the Northeast. Operating data were collected for three Mitsubishi FE18 units, three Mitsubishi FE12 units, and one Fujitsu 15RLS2 unit. The intent of this research was to assess heat output, electricity consumption, and coefficients of performance (COPs) at various temperatures and load conditions. This assessment was accomplished with long- and short-term tests that measured power consumption; supply, return, and outdoor air temperatures; and airflow through the indoor fan coil.
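
    A minimal sketch of the kind of calculation behind the reported COPs, under illustrative assumptions: heat output is estimated from indoor airflow and the supply/return temperature difference, then divided by electrical input. The 1.08 sensible-heat factor is a standard dry-air rule of thumb; the readings are invented, not CARB's monitored data.

```python
# Approximate heat delivered by a ductless indoor unit from one set of readings,
# then the coefficient of performance (illustrative values, not measured data).
airflow_cfm = 400.0            # indoor fan airflow
t_supply_f, t_return_f = 105.0, 68.0
power_kw = 1.2                 # electrical input to outdoor and indoor units

# Sensible heat: Q (Btu/h) ~= 1.08 * CFM * dT(F), a common rule of thumb for dry air.
q_btu_h = 1.08 * airflow_cfm * (t_supply_f - t_return_f)
q_kw = q_btu_h / 3412.14       # convert Btu/h to kW

cop = q_kw / power_kw
print(f"heat output: {q_kw:.2f} kW, COP: {cop:.2f}")
```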

  17. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  18. How much is new information worth? Evaluating the financial benefit of resolving management uncertainty

    Science.gov (United States)

    Maxwell, Sean L.; Rhodes, Jonathan R.; Runge, Michael C.; Possingham, Hugh P.; Ng, Chooi Fei; McDonald Madden, Eve

    2015-01-01

    Conservation decision-makers face a trade-off between spending limited funds on direct management action, or gaining new information in an attempt to improve management performance in the future. Value-of-information analysis can help to resolve this trade-off by evaluating how much management performance could improve if new information was gained. Value-of-information analysis has been used extensively in other disciplines, but there are only a few examples where it has informed conservation planning, none of which have used it to evaluate the financial value of gaining new information. We address this gap by applying value-of-information analysis to the management of a declining koala Phascolarctos cinereus population. Decision-makers responsible for managing this population face uncertainty about survival and fecundity rates, and how habitat cover affects mortality threats. The value of gaining new information about these uncertainties was calculated using a deterministic matrix model of the koala population to find the expected population growth rate if koala mortality threats were optimally managed under alternative model hypotheses, which represented the uncertainties faced by koala managers. Gaining new information about survival and fecundity rates and the effect of habitat cover on mortality threats will do little to improve koala management. Across a range of management budgets, no more than 1.7% of the budget should be spent on resolving these uncertainties. The value of information was low because optimal management decisions were not sensitive to the uncertainties we considered. Decisions were instead driven by a substantial difference in the cost efficiency of management actions. The value of information was up to forty times higher when the cost efficiencies of different koala management actions were similar. Synthesis and applications. This study evaluates the ecological and financial benefits of gaining new information to inform a conservation
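
    A minimal sketch of the value-of-information calculation described above: compare the expected outcome when a single action must be chosen under uncertainty with the expected outcome when the action can be matched to whichever hypothesis turns out to be true. The hypothesis weights and growth-rate payoffs are invented; the study itself expresses the value in financial terms against management budgets.

```python
import numpy as np

# Rows: alternative hypotheses about the system (with prior weights).
# Columns: management actions. Entries: expected population growth rate for each pair.
priors = np.array([0.4, 0.35, 0.25])
payoff = np.array([
    [0.98, 1.02, 1.01],   # hypothesis 1
    [0.97, 1.00, 1.03],   # hypothesis 2
    [0.99, 1.01, 1.00],   # hypothesis 3
])

# Value under current uncertainty: pick the single action with the best expected payoff.
value_no_info = (priors @ payoff).max()

# Value with perfect information: pick the best action separately under each hypothesis.
value_perfect_info = (priors * payoff.max(axis=1)).sum()

evpi = value_perfect_info - value_no_info
print(f"expected value without new information:       {value_no_info:.4f}")
print(f"expected value with perfect information:      {value_perfect_info:.4f}")
print(f"expected value of perfect information (EVPI): {evpi:.4f}")
```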

  19. Uncertainty and Sensitivity Analyses of a Pebble Bed HTGR Loss of Cooling Event

    Directory of Open Access Journals (Sweden)

    Gerhard Strydom

    2013-01-01

    The Very High Temperature Reactor Methods Development group at the Idaho National Laboratory identified the need for a defensible and systematic uncertainty and sensitivity approach in 2009. This paper summarizes the results of an uncertainty and sensitivity quantification investigation performed with the SUSA code, utilizing the International Atomic Energy Agency CRP 5 Pebble Bed Modular Reactor benchmark and the INL code suite PEBBED-THERMIX. Eight model input parameters were selected for inclusion in this study, and after the input parameter variations and probability density functions were specified, a total of 800 steady state and depressurized loss of forced cooling (DLOFC) transient PEBBED-THERMIX calculations were performed. The six data sets were statistically analyzed to determine the 5% and 95% DLOFC peak fuel temperature tolerance intervals with 95% confidence levels. It was found that the uncertainties in the decay heat and graphite thermal conductivities were the most significant contributors to the propagated DLOFC peak fuel temperature uncertainty. No significant differences were observed between the results of Simple Random Sampling (SRS) or Latin Hypercube Sampling (LHS) data sets, and use of uniform or normal input parameter distributions also did not lead to any significant differences between these data sets.
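
    A minimal sketch of deriving two-sided tolerance limits from a set of sampled transient results using distribution-free order statistics (a Wilks-type argument), interpreted here as an interval covering the central 90% of the population with 95% confidence; the study's exact convention may differ, and the synthetic peak temperatures below are not SUSA or PEBBED-THERMIX output.

```python
import numpy as np
from math import comb

def two_sided_coverage(n, k_low, k_high, coverage=0.90):
    """Confidence that the interval [x_(k_low), x_(n + 1 - k_high)] covers at least
    `coverage` of the population (distribution-free, via the binomial CDF)."""
    r = k_low + k_high                     # total order statistics trimmed from the ends
    return sum(comb(n, j) * coverage**j * (1 - coverage)**(n - j) for j in range(0, n - r + 1))

rng = np.random.default_rng(4)
n = 200
peak_temps = rng.normal(1550.0, 35.0, n)   # synthetic DLOFC peak fuel temperatures (degC)

x = np.sort(peak_temps)
k = 1
while two_sided_coverage(n, k + 1, k + 1) >= 0.95:
    k += 1                                  # trim as many extremes as the confidence allows
lower, upper = x[k - 1], x[n - k]
print(f"confidence for k={k}: {two_sided_coverage(n, k, k):.3f}")
print(f"90% coverage tolerance interval (95% confidence): [{lower:.1f}, {upper:.1f}] degC")
```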

  20. Uncertainties in coupled thermal-hydrological processes associated with the drift scale test at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Mukhopadhyay, Sumitra; Tsang, Y.W.

    2002-01-01

    Understanding thermally driven coupled hydrological, mechanical, and chemical processes in unsaturated fractured tuff is essential for evaluating the performance of the potential radioactive waste repository at Yucca Mountain, Nevada. The Drift Scale Test (DST), intended for acquiring such an understanding of these processes, has generated a huge volume of temperature and moisture redistribution data. Sophisticated thermal hydrological (TH) conceptual models have yielded a good fit between simulation results and the measured data. However, some uncertainties in understanding the TH processes associated with the DST still exist. This paper evaluates these uncertainties and provides quantitative estimates of their range. Of particular interest for the DST are the uncertainties resulting from the unmonitored loss of vapor through an open bulkhead of the test. There was concern that the outcome from the test might have been significantly altered by these losses. Using alternative conceptual models, we illustrate that predicted mean temperatures from the DST are within 1 degree C of the measured mean temperatures through the first two years of heating. The simulated spatial and temporal evolution of drying and condensation fronts is found to be qualitatively consistent with measured saturation data. Energy and mass balance computation shows that no more than 13 percent of the input energy is lost because of vapor leaving the test domain through the bulkhead. The change in average saturation in fractures is also relatively small. For a hypothetical situation in which no vapor is allowed to exit through the bulkhead, the simulated average fracture saturation is not sufficiently different to be discerned by the measured moisture redistribution data. This leads us to conclude that the DST, despite the uncertainties associated with open field testing, has provided an excellent understanding of the TH processes