WorldWideScience

Sample records for internally driven uncertainty

  1. Uncertainty assessment for accelerator-driven systems

    International Nuclear Information System (INIS)

    Finck, P. J.; Gomes, I.; Micklich, B.; Palmiotti, G.

    1999-01-01

    The concept of a subcritical system driven by an external neutron source provided by an accelerator, the accelerator-driven system (ADS), has recently been revived and is becoming more popular in the world technical community, with active programs in Europe, Russia, Japan, and the U.S. A general consensus has been reached in adopting for the subcritical component a fast-spectrum, liquid-metal-cooled configuration. Lead-bismuth eutectic, sodium, and gas are all being considered as coolants; each has advantages and disadvantages. The major expected advantage is that subcriticality avoids reactivity-induced transients. The potentially large subcriticality margin should also allow the introduction of very significant quantities of waste products (minor actinides and fission products) which negatively impact the safety characteristics of standard cores. In the U.S., these arguments are the basis for the development of Accelerator Transmutation of Waste (ATW), which has significant potential for reducing nuclear waste levels. Up to now, neutronic calculations have not attached uncertainties to the values of the main nuclear integral parameters that characterize the system. Many of these parameters (e.g., the degree of subcriticality) are crucial to demonstrating the validity and feasibility of this concept. In this paper we consider uncertainties related to nuclear data only. The present knowledge of the cross sections of many isotopes that are not usually utilized in existing reactors (such as Bi, Pb-207, Pb-208, and also minor actinides and fission products) suggests that uncertainties in the integral parameters will be significantly larger than for conventional reactor systems, which raises concerns about the neutronic performance of those systems.
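    The kind of nuclear-data uncertainty propagation discussed above is conventionally done with the first-order "sandwich rule", which combines sensitivity coefficients of an integral parameter with a covariance matrix for the underlying data. A minimal sketch with purely illustrative numbers (the sensitivities, standard deviations, and correlations below are hypothetical, not values from the paper):

```python
import numpy as np

# Hypothetical sensitivity coefficients of an integral parameter (e.g. k_eff)
# to three cross-section parameters: relative change in the parameter per
# relative change in each datum.
S = np.array([0.30, -0.12, 0.05])

# Hypothetical relative covariance matrix of the nuclear data: 5%, 8%, 15%
# standard deviations with mild off-diagonal correlations.
sd = np.array([0.05, 0.08, 0.15])
corr = np.array([[1.0, 0.2, 0.0],
                 [0.2, 1.0, 0.1],
                 [0.0, 0.1, 1.0]])
C = np.outer(sd, sd) * corr

# First-order ("sandwich rule") propagation: relative variance = S C S^T.
rel_var = S @ C @ S
rel_sd = np.sqrt(rel_var)
print(f"relative uncertainty on the integral parameter: {rel_sd:.2%}")
```

Larger data uncertainties on poorly known isotopes enter through the diagonal of C, which is why the abstract expects the integral-parameter uncertainties to exceed those of conventional systems.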

  2. Coping With Uncertainty in International Business

    OpenAIRE

    Briance Mascarenhas

    1982-01-01

    International business, as compared with domestic business, is usually characterized by increased uncertainty. A study of 10 multinational companies uncovered several methods of coping with uncertainty. This paper focuses on two methods which may not be apparent: control and flexibility. A framework of analysis suggesting appropriate methods for coping with uncertainty is also developed. © 1982 JIBS. Journal of International Business Studies (1982) 13, 87–98

  3. Data-Driven Model Uncertainty Estimation in Hydrologic Data Assimilation

    Science.gov (United States)

    Pathiraja, S.; Moradkhani, H.; Marshall, L.; Sharma, A.; Geenens, G.

    2018-02-01

    The increasing availability of earth observations necessitates mathematical methods to optimally combine such data with hydrologic models. Several algorithms exist for such purposes, under the umbrella of data assimilation (DA). However, DA methods are often applied in a suboptimal fashion for complex real-world problems, due largely to several practical implementation issues. One such issue is error characterization, which is known to be critical for a successful assimilation. Mischaracterized errors lead to suboptimal forecasts, and in the worst case, to degraded estimates even compared to the no assimilation case. Model uncertainty characterization has received little attention relative to other aspects of DA science. Traditional methods rely on subjective, ad hoc tuning factors or parametric distribution assumptions that may not always be applicable. We propose a novel data-driven approach (named SDMU) to model uncertainty characterization for DA studies where (1) the system states are partially observed and (2) minimal prior knowledge of the model error processes is available, except that the errors display state dependence. It includes an approach for estimating the uncertainty in hidden model states, with the end goal of improving predictions of observed variables. The SDMU is therefore suited to DA studies where the observed variables are of primary interest. Its efficacy is demonstrated through a synthetic case study with low-dimensional chaotic dynamics and a real hydrologic experiment for one-day-ahead streamflow forecasting. In both experiments, the proposed method leads to substantial improvements in the hidden states and observed system outputs over a standard method involving perturbation with Gaussian noise.
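    The abstract contrasts state-dependent model errors with the standard practice of perturbing states with fixed Gaussian noise. A minimal sketch of that contrast (the scaling rule and parameters below are hypothetical illustrations; the SDMU method itself is nonparametric and is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_fixed(x, sigma=0.1):
    """Standard approach: additive Gaussian noise with a fixed variance."""
    return x + rng.normal(0.0, sigma, size=x.shape)

def perturb_state_dependent(x, rel=0.1, floor=0.01):
    """State-dependent error: the noise standard deviation scales with the
    state magnitude (hypothetical scaling rule, for illustration only)."""
    sigma = np.maximum(rel * np.abs(x), floor)
    return x + rng.normal(0.0, 1.0, size=x.shape) * sigma

# 100 ensemble members, 3 hidden model states each.
ensemble = np.abs(rng.normal(5.0, 2.0, size=(100, 3)))
perturbed_fixed = perturb_fixed(ensemble)
perturbed_state = perturb_state_dependent(ensemble)
```

In an ensemble data-assimilation setting, a mischaracterized (fixed) noise level can inflate or underweight hidden states uniformly, whereas a state-dependent characterization lets the error grow where the model state does.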

  4. Uncertainty, learning and international environmental policy coordination

    International Nuclear Information System (INIS)

    Ulph, A.; Maddison, D.

    1997-01-01

    In this paper we construct a simple model of global warming which captures a number of key features of the global warming problem: (1) environmental damages are related to the stock of greenhouse gases in the atmosphere; (2) the global commons nature of the problem means that there are strategic interactions between the emissions policies of the governments of individual nation states; (3) there is uncertainty about the extent of the future damages that will be incurred by each country from any given level of concentration of greenhouse gases, but there is the possibility that at a future date better information about the true extent of environmental damages may become available; an important aspect of the problem is the extent to which damages in different countries may be correlated. In the first part of the paper we consider a simple model with two symmetric countries and show that the value of perfect information is an increasing function of the correlation between damages in the two countries in both the cooperative and non-cooperative equilibria. However, while the value of perfect information is always non-negative in the cooperative equilibrium, in the non-cooperative equilibrium there is a critical value of the correlation coefficient below which the value of perfect information will be negative. In the second part of the paper we construct an empirical model of global warming distinguishing between OECD and non-OECD countries and show that in the non-cooperative equilibrium the value of perfect information for OECD countries is negative when the correlation coefficient between environmental damages for OECD and non-OECD countries is negative. The implications of these results for international agreements are discussed. 3 tabs., 26 refs
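    The non-negativity result for the cooperative case reflects the standard single-decision-maker definition of the expected value of perfect information (a sketch in generic notation, not taken from the paper, with utility u over actions a and uncertain damages θ):

```latex
V_{\mathrm{PI}} \;=\; \mathbb{E}_{\theta}\!\left[\max_{a}\, u(a,\theta)\right] \;-\; \max_{a}\, \mathbb{E}_{\theta}\!\left[u(a,\theta)\right] \;\ge\; 0
```

In the non-cooperative equilibrium this guarantee fails because learning also shifts the other country's emissions strategy, which is why a sufficiently low correlation between damages can make the value of information negative.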

  5. Investment and uncertainty in the international oil and gas industry

    International Nuclear Information System (INIS)

    Mohn, Klaus; Misund, Baard

    2009-01-01

    The standard theory of irreversible investments and real options suggests a negative relation between investment and uncertainty. Richer models with compound option structures allow for a positive relationship. This paper presents a micro-econometric study of corporate investment and uncertainty in a period of market turbulence and restructuring in the international oil and gas industry. Based on data for 115 companies over the period 1992-2005, we estimate four different specifications of the q model of investment, with robust results for the uncertainty variables. The estimated models suggest that macroeconomic uncertainty creates a bottleneck for oil and gas investment and production, whereas industry-specific uncertainty has a stimulating effect. (author)

  6. Managing uncertainty in adaptation | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-03-01

    Along with methodological issues that are common elsewhere, researchers in Africa also face a lack of solid basic information, such as historical climate data or reliable census data. But experience from participatory action research in Africa suggests that scientific uncertainty is not the main obstacle to ...

  7. Internally driven inertial waves in geodynamo simulations

    Science.gov (United States)

    Ranjan, A.; Davidson, P. A.; Christensen, U. R.; Wicht, J.

    2018-05-01

    Inertial waves are oscillations in a rotating fluid, such as the Earth's outer core, which result from the restoring action of the Coriolis force. In an earlier work, it was argued by Davidson that inertial waves launched near the equatorial regions could be important for the α2 dynamo mechanism, as they can maintain a helicity distribution which is negative (positive) in the north (south). Here, we identify such internally driven inertial waves, triggered by buoyant anomalies in the equatorial regions in a strongly forced geodynamo simulation. Using the time derivative of vertical velocity, ∂uz/∂t, as a diagnostic for traveling wave fronts, we find that the horizontal movement in the buoyancy field near the equator is well correlated with a corresponding movement of the fluid far from the equator. Moreover, the azimuthally averaged spectrum of ∂uz/∂t lies in the inertial wave frequency range. We also test the dispersion properties of the waves by computing the spectral energy as a function of frequency, ϖ, and the dispersion angle, θ. Our results suggest that the columnar flow in the rotation-dominated core, which is an important ingredient for the maintenance of a dipolar magnetic field, is maintained despite the chaotic evolution of the buoyancy field on a fast timescale by internally driven inertial waves.

  8. Agriculture-driven deforestation in the tropics from 1990-2015: emissions, trends and uncertainties

    Science.gov (United States)

    Carter, Sarah; Herold, Martin; Avitabile, Valerio; de Bruin, Sytze; De Sy, Veronique; Kooistra, Lammert; Rufino, Mariana C.

    2018-01-01

    Limited data exist on emissions from agriculture-driven deforestation, and the available data are typically uncertain. In this paper, we provide comparable estimates of emissions from both all deforestation and agriculture-driven deforestation, with uncertainties, for 91 countries across the tropics between 1990 and 2015. Uncertainties associated with the input datasets (activity data and emission factors) were used to combine the datasets, so that the most certain datasets contribute the most. This method utilizes all the input data while minimizing the uncertainty of the emissions estimate. The uncertainty of the input datasets was influenced by the quality of the data, the sample size (for sample-based datasets), and the extent to which the timeframe of the data matches the period of interest. The area of deforestation and the agriculture-driver factor (the extent to which agriculture drives deforestation) were the most uncertain components of the emissions estimates; improvement in the uncertainties related to these estimates will therefore provide the greatest reductions in the uncertainties of the emissions estimates. Over the period of the study, Latin America had the highest proportion of deforestation driven by agriculture (78%), and Africa had the lowest (62%). Latin America had the highest emissions from agriculture-driven deforestation, and these peaked at 974 ± 148 Mt CO2 yr-1 in 2000-2005. Africa saw a continuous increase in emissions between 1990 and 2015 (from 154 ± 21 to 412 ± 75 Mt CO2 yr-1), so mitigation initiatives could be prioritized there. Uncertainties for emissions from agriculture-driven deforestation are ± 62.4% (average over 1990-2015); uncertainties were highest in Asia and lowest in Latin America. Uncertainty information is crucial for transparency when reporting, and gives credibility to related mitigation initiatives. We demonstrate that uncertainty data can also be useful when combining multiple open datasets, so we recommend new data
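    The rule that the most certain datasets contribute the most is consistent with standard inverse-variance weighting, which also yields a combined uncertainty smaller than any individual one. A minimal sketch with hypothetical numbers (the paper's exact combination scheme is not given in the abstract):

```python
import numpy as np

# Hypothetical deforestation-area estimates for one country from three
# independent datasets, each with a one-sigma uncertainty (units arbitrary).
estimates = np.array([1.20, 1.45, 0.95])
sigmas    = np.array([0.30, 0.10, 0.40])

# Inverse-variance weighting: weight each dataset by 1/sigma^2, so the most
# certain dataset dominates the combined estimate.
w = 1.0 / sigmas**2
combined = np.sum(w * estimates) / np.sum(w)
combined_sigma = np.sqrt(1.0 / np.sum(w))
```

Here the second (most certain) dataset pulls the combined estimate toward 1.45, and the combined sigma falls below the best individual sigma of 0.10.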

  9. Uncertainty

    International Nuclear Information System (INIS)

    Silva, T.A. da

    1988-01-01

    A comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and by the International Committee for Weights and Measures (CIPM) is presented, for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt]

  10. Fifth International Conference on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Janszky, J. (Editor); Kim, Y. S. (Editor); Man'ko, V. I. (Editor)

    1998-01-01

    The Fifth International Conference on Squeezed States and Uncertainty Relations was held at Balatonfured, Hungary, on 27-31 May 1997. This series was initiated in 1991 at the College Park Campus of the University of Maryland as the Workshop on Squeezed States and Uncertainty Relations. The scientific purpose of this series was to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including quantum optics and foundations of quantum mechanics. Quantum optics will continue playing the pivotal role in the future, but future meetings will include all branches of physics where squeeze transformations are basic. As the meeting attracted more participants and started covering more diversified subjects, the fourth meeting was called an international conference. The Fourth International Conference on Squeezed States and Uncertainty Relations, held in 1995, was hosted by Shanxi University in Taiyuan, China. The fifth meeting of this series, held at Balatonfured, Hungary, was also supported by the IUPAP. The Sixth International Conference will be hosted by the University of Naples in 1999; the meeting will take place in Ravello, near Naples.

  11. The Second International Workshop on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Kim, Y. S.; Manko, V. I.

    1993-01-01

    This conference publication contains the proceedings of the Second International Workshop on Squeezed States and Uncertainty Relations held in Moscow, Russia, on 25-29 May 1992. The purpose of this workshop was to study possible applications of squeezed states of light. The Workshop brought together many active researchers in squeezed states of light and those who may find the concept of squeezed states useful in their research, particularly in understanding the uncertainty relations. It was found at this workshop that the squeezed state has a much broader implication than the two-photon coherent states in quantum optics, since the squeeze transformation is one of the most fundamental transformations in physics.

  12. Management of internal communication in times of uncertainty

    International Nuclear Information System (INIS)

    Fernandez de la Gala, F.

    2014-01-01

    Garona has had strong media coverage since 2009. The plant's continuity process has been the subject of great controversy, which has generated increased uncertainty for workers and their families and affected motivation. Although internal communication has sought to manage its effects on the structure of the company, the speed at which outside information spreads has made this a complex mission. The regulatory body has taken an interest in its potential impact on safety culture, a significant difference compared with other industrial sectors. (Author)

  13. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  14. Scenario and modelling uncertainty in global mean temperature change derived from emission driven Global Climate Models

    Science.gov (United States)

    Booth, B. B. B.; Bernie, D.; McNeall, D.; Hawkins, E.; Caesar, J.; Boulton, C.; Friedlingstein, P.; Sexton, D.

    2012-09-01

    We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed parameter ensemble of a Global Climate Model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission- rather than concentration-driven simulations (with 10-90 percentile ranges of 1.7 K for the aggressive mitigation scenario up to 3.9 K for the high-end business-as-usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 degrees (RCP8.5) and, even under aggressive mitigation (RCP2.6), temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) on the timescale over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. In the case of both SRES A1B and the Representative Concentration Pathways (RCPs), the concentration pathways used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses. Our ensemble of emission-driven simulations spans the global temperature response of other multi-model frameworks except at the low end, where combinations of low climate sensitivity and low carbon cycle feedbacks lead to responses outside our ensemble range.
The ensemble simulates a number of high end responses which lie above the CMIP5 carbon

  15. Fourth International Conference on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Peng, Kunchi (Editor); Kim, Y. S. (Editor); Manko, V. I. (Editor)

    1996-01-01

    The Fourth International Conference on Squeezed States and Uncertainty Relations was held at Shanxi University, Taiyuan, Shanxi, China, on June 5-9, 1995. This conference was jointly organized by Shanxi University, the University of Maryland (U.S.A.), and the Lebedev Physical Institute (Russia). The first meeting of this series was called the Workshop on Squeezed States and Uncertainty Relations, and was held in 1991 at College Park, Maryland. The second and third meetings in this series were hosted in 1992 by the Lebedev Institute in Moscow, and in 1993 by the University of Maryland Baltimore County, respectively. The scientific purpose of this series was initially to discuss squeezed states of light, but in recent years the scope has become broad enough to include studies of uncertainty relations and squeeze transformations in all branches of physics, including, of course, quantum optics and foundations of quantum mechanics. Quantum optics will continue playing the pivotal role in the future, but future meetings will include all branches of physics where squeeze transformations are a basic transformation. This transition took place at the fourth meeting of this series held at Shanxi University in 1995. The fifth meeting in this series will be held in Budapest (Hungary) in 1997, and the principal organizer will be Jozsef Janszky of the Laboratory of Crystal Physics, P.O. Box 132, H-1052 Budapest, Hungary.

  16. Uncertainty Driven Action (UDA) model: A foundation for unifying perspectives on design activity

    DEFF Research Database (Denmark)

    Cash, Philip; Kreye, Melanie

    2017-01-01

    This paper proposes the Uncertainty Driven Action (UDA) model, which unifies the fragmented literature on design activity. The UDA model conceptualises design activity as a process consisting of three core actions: information action, knowledge-sharing action, and representation action, which are linked via uncertainty perception. The foundations of the UDA model in the design literature are elaborated in terms of the three core actions and their links to designer cognition and behaviour, utilising definitions and concepts from Activity Theory. The practical relevance and theoretical contributions of the UDA model are discussed. This paper contributes to the design literature by offering a comprehensive formalisation of the design activity of individual designers, which connects cognition and action, to provide a foundation for understanding previously disparate descriptions of design activity.

  17. USACM Thematic Workshop On Uncertainty Quantification And Data-Driven Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, James R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    The USACM Thematic Workshop on Uncertainty Quantification and Data-Driven Modeling was held on March 23-24, 2017, in Austin, TX. The organizers of the technical program were James R. Stewart of Sandia National Laboratories and Krishna Garikipati of University of Michigan. The administrative organizer was Ruth Hengst, who serves as Program Coordinator for the USACM. The organization of this workshop was coordinated through the USACM Technical Thrust Area on Uncertainty Quantification and Probabilistic Analysis. The workshop website (http://uqpm2017.usacm.org) includes the presentation agenda as well as links to several of the presentation slides (permission to access the presentations was granted by each of those speakers, respectively). Herein, this final report contains the complete workshop program that includes the presentation agenda, the presentation abstracts, and the list of posters.

  18. Stakeholder-driven multi-attribute analysis for energy project selection under uncertainty

    International Nuclear Information System (INIS)

    Read, Laura; Madani, Kaveh; Mokhtari, Soroush; Hanks, Catherine

    2017-01-01

    In practice, selecting an energy project for development requires balancing criteria and competing stakeholder priorities to identify the best alternative. Energy source selection can be modeled as a multi-criteria decision-making problem to provide quantitative support for reconciling technical, economic, environmental, social, and political factors with respect to the stakeholders' interests. Decision making among these complex interactions should also account for the uncertainty present in the input data. In response, this work develops a stochastic decision analysis framework to evaluate alternatives by involving stakeholders to identify both quantitative and qualitative selection criteria and performance metrics which carry uncertainties. The developed framework is illustrated using a case study from Fairbanks, Alaska, where decision makers and residents must decide on a new source of energy for heating and electricity. We approach this problem in a five-step methodology: (1) engaging experts (role players) to develop criteria of project performance; (2) collecting a range of quantitative and qualitative input information to determine the performance of each proposed solution according to the selected criteria; (3) performing a Monte Carlo analysis to capture uncertainties given in the inputs; (4) applying multi-criteria decision-making, social choice (voting), and fallback bargaining methods to account for three different levels of cooperation among the stakeholders; and (5) computing an aggregate performance index (API) score for each alternative based on its performance across criteria and cooperation levels. API scores communicate relative performance between alternatives. In this way, our methodology maps uncertainty from the input data to reflect risk in the decision and incorporates varying degrees of cooperation into the analysis to identify an optimal and practical alternative. - Highlights: • We develop an applicable stakeholder-driven framework for
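    Steps (3) and (5) of the methodology can be sketched as Monte Carlo sampling of uncertain criterion scores followed by an aggregate performance index per alternative. All numbers and weights below are hypothetical stand-ins, and a simple weighted sum is used for aggregation; the study's voting and fallback bargaining methods are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical: 3 energy alternatives scored on 4 criteria, with the score
# uncertainty carried as a standard deviation on each mean.
means = np.array([[0.7, 0.5, 0.6, 0.4],
                  [0.6, 0.8, 0.5, 0.7],
                  [0.5, 0.6, 0.9, 0.6]])
sds = 0.1 * np.ones_like(means)
weights = np.array([0.4, 0.3, 0.2, 0.1])  # stakeholder-elicited criterion weights

# Monte Carlo sampling of the uncertain scores (step 3), then a weighted
# aggregate performance index per alternative (step 5).
n = 10_000
samples = rng.normal(means, sds, size=(n, *means.shape))
api = (samples * weights).mean(axis=0).sum(axis=1)

best = int(np.argmax(api))  # index of the top-ranked alternative
```

Because the sampled scores carry the input uncertainty, the spread of the per-draw indices (rather than only their mean) can be inspected to judge how robust the ranking is.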

  19. Optimisation of internal contamination monitoring programme by integration of uncertainties

    International Nuclear Information System (INIS)

    Davesne, E.; Casanova, P.; Chojnacki, E.; Paquet, F.; Blanchardon, E.

    2011-01-01

    Potential internal contamination of workers is monitored by periodic bioassay measurements interpreted in terms of intake and committed effective dose by the use of biokinetic and dosimetric models. After a prospective evaluation of exposure at a workplace, a suitable monitoring programme can be defined by choosing adequate measurement techniques and frequency. In this study, the sensitivity of a programme is evaluated by the minimum intake and dose, which may be detected with a given level of confidence by taking into account uncertainties on exposure conditions and measurements. This is made for programme optimisation, which is performed by comparing the sensitivities of different alternative programmes. These methods were applied at the AREVA NC reprocessing plant and support the current monitoring programme as the best compromise between the cost of the measurements and the sensitivity of the programme. (authors)
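    The sensitivity of a monitoring programme can be sketched as the smallest intake still detectable one full monitoring interval after it occurs, where the bioassay quantity per unit intake is lowest. The retention function, detection limit, and numbers below are hypothetical illustrations, not values from the study:

```python
import math

def min_detectable_intake(detection_limit, m, interval_days):
    """Smallest intake (Bq) reliably detectable: an intake occurring just after
    a measurement is next observed one full interval later, when the bioassay
    quantity per unit intake, m(t), has decayed the most."""
    return detection_limit / m(interval_days)

def m(t, half_life=50.0):
    """Hypothetical single-exponential retention function: fraction of the
    intake remaining in the measured compartment after t days."""
    return math.exp(-math.log(2) * t / half_life)

# Comparing a monthly versus a quarterly measurement frequency for a bioassay
# technique with a detection limit of 0.5 Bq.
I_min_30 = min_detectable_intake(0.5, m, 30)
I_min_90 = min_detectable_intake(0.5, m, 90)
```

Shorter intervals lower the minimum detectable intake but raise measurement costs, which is the cost-sensitivity compromise the programme optimisation weighs.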

  20. International conference on sub-critical accelerator driven systems. Proceedings

    International Nuclear Information System (INIS)

    Litovkina, L.P.; Titarenko, Yu.E.

    1999-01-01

    The International Meeting on Sub-Critical Accelerator Driven Systems was organized by the State Scientific Center - Institute for Theoretical and Experimental Physics with participation of the Atomic Ministry of RF. The Meeting objective was to analyze the recent achievements and tendencies of accelerator-driven systems development. The Meeting program covered a broad range of problems, including accelerator-driven system (ADS) conceptual design; analysis of the ADS role in the nuclear fuel cycle; accuracy of modeling the main parameters of ADS; and conceptual design of high-current accelerators. Moreover, the results of recent experimental and theoretical studies on nuclear data accumulation to support ADS technologies are presented. About 70 scientists from the main scientific centers of Russia, as well as scientists from the USA, France, Belgium, India, and Yugoslavia, attended the meeting and presented 44 works. [ru]

  1. Sixth International Conference on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Kim, Y. S. (Editor); Solimento, S. (Editor)

    2000-01-01

    These proceedings contain contributions from about 200 participants to the 6th International Conference on Squeezed States and Uncertainty Relations (ICSSUR'99) held in Naples May 24-29, 1999, and organized jointly by the University of Naples "Federico II," the University of Maryland at College Park, and the Lebedev Institute, Moscow. This was the sixth of a series of very successful meetings started in 1990 at the College Park Campus of the University of Maryland. The other meetings in the series were held in Moscow (1992), Baltimore (1993), Taiyuan P.R.C. (1995) and Balatonfuered, Hungary (1997). The present one was held at the campus Monte Sant'Angelo of the University "Federico II" of Naples. The meeting sought to provide a forum for updating and reviewing a wide range of quantum optics disciplines, including device developments and applications, and related areas of quantum measurements and quantum noise. Over the years, the ICSSUR Conference evolved from a meeting on quantum measurement sector of quantum optics, to a wide range of quantum optics themes, including multifacet aspects of generation, measurement, and applications of nonclassical light (squeezed and Schrodinger cat radiation fields, etc.), and encompassing several related areas, ranging from quantum measurement to quantum noise. ICSSUR'99 brought together about 250 people active in the field of quantum optics, with special emphasis on nonclassical light sources and related areas. The Conference was organized in 8 Sections: Squeezed states and uncertainty relations; Harmonic oscillators and squeeze transformations; Methods of quantum interference and correlations; Quantum measurements; Generation and characterisation of non-classical light; Quantum noise; Quantum communication and information; and Quantum-like systems.

  2. International conference on Facets of Uncertainties and Applications

    CERN Document Server

    Skowron, Andrzej; Maiti, Manoranjan; Kar, Samarjit

    2015-01-01

    Since the emergence of the formal concept of probability theory in the seventeenth century, uncertainty had been perceived solely in terms of probability theory. However, this apparently unique link between uncertainty and probability theory came under investigation a few decades ago. Uncertainties are nowadays accepted to be of various kinds. Uncertainty in general can carry different senses: not certainly known, questionable, problematic, vague, not definite or determined, ambiguous, liable to change, not reliable. In Indian languages, particularly in Sanskrit-based languages, there are other, higher levels of uncertainty. It has been shown that several mathematical concepts, such as the theory of fuzzy sets, theory of rough sets, evidence theory, possibility theory, theory of complex systems and complex networks, theory of fuzzy measures, and uncertainty theory, can also successfully model uncertainty.

  3. Essentialist beliefs, sexual identity uncertainty, internalized homonegativity and psychological wellbeing in gay men.

    Science.gov (United States)

    Morandini, James S; Blaszczynski, Alexander; Ross, Michael W; Costa, Daniel S J; Dar-Nimrod, Ilan

    2015-07-01

    The present study examined essentialist beliefs about sexual orientation and their implications for sexual identity uncertainty, internalized homonegativity and psychological wellbeing in a sample of gay men. A combination of targeted sampling and snowball strategies was used to recruit 639 gay-identifying men for a cross-sectional online survey. Participants completed a questionnaire assessing sexual orientation beliefs, sexual identity uncertainty, internalized homonegativity, and psychological wellbeing outcomes. Structural equation modeling was used to test whether essentialist beliefs were associated with psychological wellbeing indirectly via their effect on sexual identity uncertainty and internalized homonegativity. A unique pattern of direct and indirect effects was observed in which facets of essentialism predicted sexual identity uncertainty, internalized homonegativity and psychological wellbeing. Of note, viewing sexual orientation as immutable/biologically based and as existing in discrete categories was associated with less sexual identity uncertainty. On the other hand, these beliefs had divergent relationships with internalized homonegativity, with immutability/biological beliefs associated with lower, and discreteness beliefs associated with greater, internalized homonegativity. Of interest, although sexual identity uncertainty was associated with poorer psychological wellbeing via its contribution to internalized homophobia, there was no direct relationship between identity uncertainty and psychological wellbeing. Findings indicate that essentializing sexual orientation has mixed implications for sexual identity uncertainty, internalized homonegativity and wellbeing in gay men. Those undertaking educational and clinical interventions with gay men should be aware of the benefits and caveats of essentialist theories of homosexuality for this population. (c) 2015 APA, all rights reserved.

  4. A data-driven approach for modeling post-fire debris-flow volumes and their uncertainty

    Science.gov (United States)

    Friedel, Michael J.

    2011-01-01

    This study demonstrates the novel application of genetic programming to evolve nonlinear post-fire debris-flow volume equations from variables associated with a data-driven conceptual model of the western United States. The search space is constrained using a multi-component objective function that simultaneously minimizes root-mean-squared and unit errors in the evolution of the fittest equations. An optimization technique is then used to estimate the limits of nonlinear prediction uncertainty associated with the debris-flow equations. In contrast to a published multiple linear regression three-variable equation, linking basin area with slopes greater than or equal to 30 percent, burn severity characterized as area burned at moderate plus high severity, and total storm rainfall, the data-driven approach discovers many nonlinear and several dimensionally consistent equations that are unbiased and have less prediction uncertainty. Of the nonlinear equations, the best performance (lowest prediction uncertainty) is achieved when using three variables: average basin slope, total burned area, and total storm rainfall. Further reduction in uncertainty is possible for the nonlinear equations when dimensional consistency is not a priority and by subsequently applying a gradient solver to the fittest solutions. The data-driven modeling approach can be applied to nonlinear multivariate problems in all fields of study.
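The multi-component objective described above (prediction error plus a dimensional-consistency penalty) can be sketched as follows; the candidate equations, variable ranges, and penalty weight are hypothetical illustrations, not the study's actual data or fitted models:

```python
import numpy as np

# Toy observations: basin slope (deg), burned area (km^2), storm rainfall (mm),
# and log debris-flow volume (synthetic values for illustration only).
rng = np.random.default_rng(0)
slope, area, rain = rng.uniform(10, 40, 50), rng.uniform(1, 20, 50), rng.uniform(5, 60, 50)
log_vol = 0.3 * slope + 0.8 * np.log(area) + 0.02 * rain + rng.normal(0, 0.2, 50)

def fitness(predict, unit_error, x, y, w_units=1.0):
    """Multi-component objective: RMSE plus a penalty for dimensional
    inconsistency, mimicking the abstract's combined fitness."""
    rmse = np.sqrt(np.mean((predict(*x) - y) ** 2))
    return rmse + w_units * unit_error

# Two hypothetical candidate equations a GP run might propose.
cand_a = lambda s, a, r: 0.3 * s + 0.8 * np.log(a) + 0.02 * r   # dimensionally tidy
cand_b = lambda s, a, r: 0.01 * s * a + 0.05 * r                # mixes units

scores = [fitness(cand_a, 0.0, (slope, area, rain), log_vol),
          fitness(cand_b, 1.0, (slope, area, rain), log_vol)]
best = min(range(2), key=scores.__getitem__)  # lowest combined objective wins
```

A real GP run would mutate and recombine whole expression trees; this sketch only shows how a single candidate is scored.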

  5. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle, and its feedback into climate, to be ...

  6. Uncertainty budget in internal monostandard NAA for small and large size samples analysis

    International Nuclear Information System (INIS)

    Dasari, K.B.; Acharya, R.

    2014-01-01

    Evaluation of the total uncertainty budget on a determined concentration value is important under a quality assurance programme. Concentration calculations in NAA are carried out by the relative method or by k0-based internal monostandard NAA (IM-NAA). The IM-NAA method has been used for small and large sample analysis of clay potteries. An attempt was made to identify the uncertainty components in IM-NAA, and the uncertainty budget for La in both small and large size samples has been evaluated and compared. (author)

  7. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    Science.gov (United States)

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming standard practice, existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models that explicitly account for the measurement uncertainty of the international standards along with the uncertainty attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements, and discusses multi-laboratory data reduction while accounting for the inevitable correlations between laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
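A minimal sketch of two-point normalization with Monte Carlo propagation of uncertainty on both axes (a crude stand-in for the paper's errors-in-variables regression); all numerical values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical two-point normalization: measured and assigned delta values
# (per mil) for two reference standards, each with standard uncertainties.
meas = np.array([-10.2, 25.1]); u_meas = np.array([0.10, 0.12])
assigned = np.array([-10.0, 25.0]); u_assigned = np.array([0.05, 0.05])
sample_meas, u_sample = 7.4, 0.10

# Monte Carlo propagation: perturb both axes (errors in variables) and refit
# the normalization line each time.
n = 20000
norm = np.empty(n)
for i in range(n):
    x = meas + rng.normal(0, u_meas)
    y = assigned + rng.normal(0, u_assigned)
    s = (y[1] - y[0]) / (x[1] - x[0])      # slope of normalization line
    b = y[0] - s * x[0]                    # intercept
    norm[i] = s * (sample_meas + rng.normal(0, u_sample)) + b

print(f"normalized delta = {norm.mean():.2f} +/- {norm.std(ddof=1):.2f} per mil")
```

With more than two standards the line would be fitted by regression rather than solved exactly, which is where the errors-in-variables machinery matters.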

  8. Pupil-linked arousal is driven by decision uncertainty and alters serial choice bias

    NARCIS (Netherlands)

    Urai, A.E.; Braun, A.; Donner, T.H.

    2017-01-01

    While judging their sensory environments, decision-makers seem to use the uncertainty about their choices to guide adjustments of their subsequent behaviour. One possible source of these behavioural adjustments is arousal: decision uncertainty might drive the brain's arousal systems, which control

  9. Scenario and modelling uncertainty in global mean temperature change derived from emission-driven global climate models

    Directory of Open Access Journals (Sweden)

    B. B. B. Booth

    2013-04-01

    We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed parameter ensemble of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concentration-driven simulations (with 10–90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business as usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5) and, even under aggressive mitigation (RCP2.6), temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) on the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. In the cases of both SRES A1B and the Representative Concentration Pathways (RCPs), the concentration scenarios used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end. Combinations of low climate sensitivity and low carbon cycle feedbacks lead a number of CMIP5 responses to lie below our ensemble range. The ensemble simulates a number of high

  10. Scenario and modelling uncertainty in global mean temperature change derived from emission-driven global climate models

    Science.gov (United States)

    Booth, B. B. B.; Bernie, D.; McNeall, D.; Hawkins, E.; Caesar, J.; Boulton, C.; Friedlingstein, P.; Sexton, D. M. H.

    2013-04-01

    We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed parameter ensemble of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concentration-driven simulations (with 10-90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business as usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5) and, even under aggressive mitigation (RCP2.6), temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) on the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. In the cases of both SRES A1B and the Representative Concentration Pathways (RCPs), the concentration scenarios used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end. Combinations of low climate sensitivity and low carbon cycle feedbacks lead a number of CMIP5 responses to lie below our ensemble range. The ensemble simulates a number of high-end responses which lie
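The 10-90th percentile spread quoted in the abstract can be computed from an ensemble as below; the ensemble values here are synthetic illustrations, not the model output:

```python
import numpy as np

# Synthetic ensembles of end-of-century warming (K): emission-driven runs
# spread more widely than concentration-driven ones (illustrative numbers only).
rng = np.random.default_rng(1)
conc_driven = rng.normal(3.0, 0.5, 57)   # concentration-driven ensemble
emis_driven = rng.normal(3.3, 1.2, 57)   # emission-driven: extra carbon-cycle spread

def pct_range(ens):
    """10th-90th percentile range, the spread measure quoted in the abstract."""
    lo, hi = np.percentile(ens, [10, 90])
    return hi - lo

print(f"concentration-driven range: {pct_range(conc_driven):.1f} K")
print(f"emission-driven range:      {pct_range(emis_driven):.1f} K")
```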

  11. Laser-driven nuclear-polarized hydrogen internal gas target

    International Nuclear Information System (INIS)

    Seely, J.; Crawford, C.; Clasie, B.; Xu, W.; Dutta, D.; Gao, H.

    2006-01-01

    We report the performance of a laser-driven polarized internal hydrogen gas target (LDT) in a configuration similar to that used in scattering experiments. This target used the technique of spin-exchange optical pumping to produce nuclear spin polarized hydrogen gas that was fed into a cylindrical storage (target) cell. We present in this paper the performance of the target, methods that were tried to improve the figure-of-merit (FOM) of the target, and a Monte Carlo simulation of spin-exchange optical pumping. The dimensions of the apparatus were optimized using the simulation, and the experimental results were in good agreement with the results from the simulation. The best experimental result achieved was at a hydrogen flow rate of 1.1×10^18 atoms/s, where the sample beam exiting the storage cell had a 58.2% degree of dissociation and 50.5% polarization. Based on this measurement, the atomic fraction in the storage cell was 49.6% and the density-averaged nuclear polarization was 25.0%. This represents the highest FOM for hydrogen from an LDT and is higher than the best FOM reported by atomic beam sources that used storage cells.

  12. Pupil-linked arousal is driven by decision uncertainty and alters serial choice bias

    Science.gov (United States)

    Urai, Anne E.; Braun, Anke; Donner, Tobias H.

    2017-03-01

    While judging their sensory environments, decision-makers seem to use the uncertainty about their choices to guide adjustments of their subsequent behaviour. One possible source of these behavioural adjustments is arousal: decision uncertainty might drive the brain's arousal systems, which control global brain state and might thereby shape subsequent decision-making. Here, we measure pupil diameter, a proxy for central arousal state, in human observers performing a perceptual choice task of varying difficulty. Pupil dilation, after choice but before external feedback, reflects three hallmark signatures of decision uncertainty derived from a computational model. This increase in pupil-linked arousal boosts observers' tendency to alternate their choice on the subsequent trial. We conclude that decision uncertainty drives rapid changes in pupil-linked arousal state, which shape the serial correlation structure of ongoing choice behaviour.

  13. Internal design of technical systems under conditions of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Krasnoshchekov, P S; Morozov, V V; Fedorov, V V

    1982-03-01

    An investigation is made of a model of the internal design of a complex technical system in the presence of uncertain factors. The influence of an opponent on the design is examined. The concepts of hierarchical and balanced compatibility between the criteria of the designer, the opponent and the segregation functions are introduced and studied. The connection between the proposed approach and the methods of artificial intelligence is discussed. 5 references.

  14. Third International Workshop on Squeezed States and Uncertainty Relations

    Science.gov (United States)

    Han, D. (Editor); Kim, Y. S. (Editor); Rubin, Morton H. (Editor); Shih, Yan-Hua (Editor); Zachary, Woodford W. (Editor)

    1994-01-01

    The purpose of these workshops is to bring together an international selection of scientists to discuss the latest developments in Squeezed States in various branches of physics, and in the understanding of the foundations of quantum mechanics. At the third workshop, special attention was given to the influence that quantum optics is having on our understanding of quantum measurement theory. The fourth meeting in this series will be held in the People's Republic of China.

  15. 2nd International Conference on Cable-Driven Parallel Robots

    CERN Document Server

    Bruckmann, Tobias

    2015-01-01

    This volume presents the outcome of the second forum on cable-driven parallel robots, bringing the cable robot community together. It shows the new ideas of the active researchers developing cable-driven robots. The book presents the state of the art, including both summarizing contributions and the latest research and future options. The book covers all topics which are essential for cable-driven robots: classification; kinematics, workspace and singularity analysis; statics and dynamics; cable modeling; control and calibration; design methodology; hardware development; experimental evaluation; prototypes, application reports and new application concepts.

  16. On international fisheries agreements, entry deterrence, and ecological uncertainty.

    Science.gov (United States)

    Ellefsen, Hans; Grønbæk, Lone; Ravn-Jonsen, Lars

    2017-05-15

    A prerequisite for an international fisheries agreement (IFA) to be stable is that parties expect the benefits from joining the agreement to exceed the benefits from free riding on it, and parties only comply with the agreement as long as this is true. The agreement, therefore, implicitly builds on an expectation of the ecological condition of the natural resource. Game-theoretical models often assume that all parties have the same (often perfect) information about the resource and that the exploitation is an equilibrium use of the stock. As stated by experts in natural science, fish ecology still has many open questions, for example how to predict population dynamics, migration patterns, food availability, etc. In some cases, parties disagree about the state, abundance, and migration of a stock, which can reduce the possibilities of reaching an agreement for exploitation of the stock. This paper develops a model and applies it to the North-East Atlantic mackerel fishery, in order to analyze an IFA under different ecological scenarios, and also combines the model with the economic theory of entry deterrence. The model is used empirically to determine whether the parties with original access to the resource have an advantage when forming an agreement with a new party, in having the ability to fish the stock down to a smaller size and thereby prevent another party from entering the fishery. With a basis in entry deterrence, combined with lack of information, the paper illustrates the obstacles that have made an agreement for the North-East Atlantic mackerel so difficult to achieve. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Narrative of certitude for uncertainty normalisation regarding biotechnology in international organisations

    OpenAIRE

    Heath , Robert; Proutheau , Stéphanie

    2012-01-01

    International audience; Narrative theory has gained prominence, especially as a companion to the social construction of reality. In matters of regulation and normalization, narratives socially and culturally construct relevant contingencies, uncertainties, values, and decisions. Here, decision dynamics pit risk generators, bearers, bearers' advocates, arbiters, researchers and informers as advocates and counter-advocates (Palmlund, 2009). The decision-relevant narrative components (actors, themes, sc...

  18. Reliability of a new biokinetic model of zirconium in internal dosimetry: part I, parameter uncertainty analysis.

    Science.gov (United States)

    Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph

    2011-12-01

    The reliability of biokinetic models is essential in internal dose assessments and radiation risk analysis for the public, occupational workers, and patients exposed to radionuclides. In this paper, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. The paper is divided into two parts. In the first part of the study, published here, the uncertainty sources of the model parameters for zirconium (Zr), developed by the International Commission on Radiological Protection (ICRP), were identified and analyzed. Furthermore, the uncertainty of the biokinetic experimental measurements performed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU) for developing a new biokinetic model of Zr was analyzed according to the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. The confidence intervals and distributions of the model parameters of the ICRP and HMGU Zr biokinetic models were evaluated. As a result of computer biokinetic modeling, the mean, standard uncertainty, and confidence interval of the model prediction, calculated based on the model parameter uncertainty, were presented and compared to the plasma clearance and urinary excretion measured after intravenous administration. It was shown that for the most important compartment, the plasma, the uncertainty evaluated for the HMGU model was much smaller than that for the ICRP model; that phenomenon was observed for other organs and tissues as well. The uncertainty of the integral of the radioactivity of Zr up to 50 y calculated by the HMGU model after ingestion by adult members of the public was shown to be smaller by a factor of two than that of the ICRP model. It was also shown that the distribution type of the model parameter strongly influences the model prediction, and the correlation of the model input parameters affects the model prediction to a

  19. The treatment of climate-driven environmental change and associated uncertainty in post-closure assessments

    International Nuclear Information System (INIS)

    Wilmot, R.D.

    1993-01-01

    The post-closure performance of radioactive waste repositories is influenced by a range of processes, such as groundwater flow and fracture movement, which are in turn affected by conditions in the surface environment. For deep repositories the period for which an assessment must be performed is of the order of 10^6 years. The geological record of the last 10^6 years shows that surface environmental conditions have varied considerably over such time-scales. A model of surface environmental change, known as TIME4, has been developed on behalf of the UK Department of the Environment for use with the probabilistic risk assessment code VANDAL. This paper describes the extent of surface environmental change, discusses possible driving mechanisms for such changes and summarises the processes which have been incorporated within the TIME4 model. The underlying cause of change in surface environment sub-systems is inferred to be climate change, but considerable uncertainty remains over the mechanisms of such change. Methods for treating these uncertainties are described. (author)

  20. Communication of uncertainty in hydrological predictions: a user-driven example web service for Europe

    Science.gov (United States)

    Fry, Matt; Smith, Katie; Sheffield, Justin; Watts, Glenn; Wood, Eric; Cooper, Jon; Prudhomme, Christel; Rees, Gwyn

    2017-04-01

    Water is fundamental to society as it impacts on all facets of life, the economy and the environment. But whilst it creates opportunities for growth and life, it can also cause serious damage to society and infrastructure through extreme hydro-meteorological events such as floods or droughts. Anticipation of future water availability and extreme event risks would help both optimise growth and limit damage through better preparedness and planning, hence providing huge societal benefits. Recent scientific research advances now make it possible to provide hydrological outlooks at monthly to seasonal lead times, and future projections up to the end of the century accounting for climatic changes. However, high uncertainty remains in the predictions, which varies depending on location, time of year, prediction range and hydrological variable. It is essential that this uncertainty is fully understood by decision makers so they can account for it in their planning. Hence, the challenge is to find ways to communicate such uncertainty to a range of stakeholders with different technical backgrounds and environmental science knowledge. The project EDgE (End-to-end Demonstrator for improved decision making in the water sector for Europe), funded by the Copernicus programme (C3S), is a proof-of-concept project that develops a unique service to support decision making for the water sector at monthly to seasonal and multi-decadal lead times. It is a mutual effort of co-production between hydrologists and environmental modellers, computer scientists and stakeholders representative of key decision makers in Europe for the water sector. This talk will present the iterative co-production process of a web service that serves the needs of the user community. Through a series of Focus Group meetings in Spain, Norway and the UK, options for visualising the hydrological predictions and associated uncertainties are presented and discussed first as mock-up dash boards, off-line tools

  1. International survey for good practices in forecasting uncertainty assessment and communication

    Science.gov (United States)

    Berthet, Lionel; Piotte, Olivier

    2014-05-01

    Achieving technically sound flood forecasts is a crucial objective for forecasters, but forecasts remain of little use if users do not properly understand their significance and use them in decision making. One usual way to clarify the forecasts' limitations is to communicate some information about their uncertainty. Uncertainty assessment and communication to stakeholders are thus important issues for operational flood forecasting services (FFS) but remain open fields for research. The French FFS plans to publish graphical streamflow and level forecasts along with uncertainty assessments in the near future on its website (available to the general public). In order to choose the technical options best adapted to its operational context, it carried out a survey among more than 15 fellow institutions. Most of these provide forecasts and warnings to civil protection officers, while some work mostly for hydroelectricity suppliers. A questionnaire was prepared in order to standardize the analysis of the practices of the surveyed institutions. The survey was conducted by gathering information from technical reports or from the scientific literature, as well as through interviews conducted by phone, email discussions or meetings. The questionnaire helped in the exploration of practices in uncertainty assessment, evaluation and communication. In the analysis drawn from the raw results, attention was paid to the particular context within which every institution works. Results show that most services interviewed assess their forecast uncertainty. However, practices can differ significantly from one country to another. Popular techniques are ensemble approaches, which allow several uncertainty sources to be taken into account. Statistical analyses of past forecasts (such as quantile regressions) are also commonly used. Contrary to what was expected, only few services emphasize the role of the forecaster (subjective assessment). Similar contrasts can be observed in uncertainty

  2. Uncertainty-driven nuclear data evaluation including thermal (n,α) applied to 59Ni

    Science.gov (United States)

    Helgesson, P.; Sjöstrand, H.; Rochman, D.

    2017-11-01

    This paper presents a novel approach to the evaluation of nuclear data (ND), combining experimental data for thermal cross sections with resonance parameters and nuclear reaction modeling. The method involves sampling of various uncertain parameters, in particular uncertain components in experimental setups, and provides extensive covariance information, including consistent cross-channel correlations over the whole energy spectrum. The method is developed for, and applied to, 59Ni, but may be used as a whole, or in part, for other nuclides. 59Ni is particularly interesting since a substantial amount of 59Ni is produced in thermal nuclear reactors by neutron capture in 58Ni and since it has a non-threshold (n,α) cross section. Therefore, 59Ni gives a very important contribution to the helium production in stainless steel in a thermal reactor. However, current evaluated ND libraries contain old information for 59Ni, without any uncertainty information. The work includes a study of thermal cross section experiments and a novel combination of this experimental information, giving the full multivariate distribution of the thermal cross sections. In particular, the thermal (n,α) cross section is found to be 12.7 ± 0.7 b. This is consistent with, but yet different from, current established values. Further, the distribution of thermal cross sections is combined with reported resonance parameters, and with TENDL-2015 data, to provide full random ENDF files; all of this is done in a novel way, keeping uncertainties and correlations in mind. The random files are also condensed into one single ENDF file with covariance information, which is now part of a beta version of JEFF 3.3. Finally, the random ENDF files have been processed and used in an MCNP model to study the helium production in stainless steel. The increase in the (n,α) rate due to 59Ni compared to fresh stainless steel is found to be a factor of 5.2 at a certain time in the reactor vessel, with a relative
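A simple inverse-variance weighted combination illustrates how independent thermal cross-section measurements can be merged into a single value with an uncertainty; the paper's actual evaluation uses a fuller multivariate treatment, and the measurement values below are invented for illustration:

```python
import numpy as np

# Hypothetical thermal (n,alpha) cross-section measurements (barns) with
# standard uncertainties; illustrative numbers, not the evaluated data.
sigma = np.array([12.3, 13.1, 12.6])
u = np.array([1.0, 0.9, 1.2])

# Inverse-variance weighted mean: each measurement contributes in
# proportion to 1/u^2, and the combined uncertainty shrinks accordingly.
w = 1.0 / u**2
mean = np.sum(w * sigma) / np.sum(w)
u_mean = 1.0 / np.sqrt(np.sum(w))
print(f"combined: {mean:.1f} +/- {u_mean:.1f} b")
```

This weighting assumes the measurements are independent; correlated experimental components, which the paper samples explicitly, would require a covariance matrix instead of scalar weights.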

  3. Uncertainties in Future Regional Sea Level Trends: How to Deal with the Internal Climate Variability?

    Science.gov (United States)

    Becker, M.; Karpytchev, M.; Hu, A.; Deser, C.; Lennartz-Sassinek, S.

    2017-12-01

    Today, climate models (CMs) are the main tools for forecasting sea level rise (SLR) at global and regional scales. CM forecasts are accompanied by inherent uncertainties, and understanding and reducing these uncertainties is becoming a matter of increasing urgency in order to provide robust estimates of SLR impact on coastal societies, which need sustainable choices of climate adaptation strategy. These CM uncertainties are linked to structural model formulation, initial conditions, emission scenario and internal variability. The internal variability is due to complex non-linear interactions within the Earth climate system and can induce diverse quasi-periodic oscillatory modes and long-term persistence. To quantify the effects of internal variability, most studies use multi-model ensembles or sea level projections from a single model run with perturbed initial conditions. However, large ensembles are generally not available, or too small, and are computationally expensive. In this study, we use a power-law scaling of sea level fluctuations, as observed in many other geophysical signals and natural systems, to characterize the internal climate variability. Within this statistical framework, we (1) use the pre-industrial control run of the National Center for Atmospheric Research Community Climate System Model (NCAR-CCSM) to test the robustness of the power-law scaling hypothesis; (2) employ power-law statistics as a tool for assessing the spread of regional sea level projections due to internal climate variability in the 21st-century NCAR-CCSM projections; (3) compare the uncertainties in predicted sea level changes obtained from NCAR-CCSM multi-member ensemble simulations with estimates derived for power-law processes; and (4) explore the sensitivity of spatial patterns of the internal variability and its effects on regional sea level projections.
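The power-law scaling hypothesis can be probed by checking how fluctuations grow with aggregation scale; the sketch below uses the simple aggregated-variance method on synthetic white noise, which is an assumption for illustration, not the study's method or data:

```python
import numpy as np

rng = np.random.default_rng(7)

def scaling_exponent(x, windows=(4, 8, 16, 32, 64)):
    """Estimate a power-law scaling exponent from the growth of the standard
    deviation of window means with window size (aggregated-variance method,
    a simple stand-in for detrended fluctuation analysis)."""
    sds = []
    for m in windows:
        n = len(x) // m
        means = x[: n * m].reshape(n, m).mean(axis=1)
        sds.append(means.std(ddof=1))
    slope, _ = np.polyfit(np.log(windows), np.log(sds), 1)
    return slope  # ~ -0.5 for white noise; shallower for persistent series

white = rng.normal(size=4096)
print(f"white-noise exponent: {scaling_exponent(white):.2f}")
```

A long-term persistent series (as hypothesized for internal sea level variability) would show a slope shallower than -0.5, i.e. slower decay of fluctuations under averaging.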

  4. Quantification of uncertainty in photon source spot size inference during laser-driven radiography experiments at TRIDENT

    Energy Technology Data Exchange (ETDEWEB)

    Tobias, Benjamin John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Palaniyappan, Sasikumar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gautier, Donald Cort [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mendez, Jacob [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Burris-Mog, Trevor John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Huang, Chengkun K. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Favalli, Andrea [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hunter, James F. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Espy, Michelle E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Schmidt, Derek William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Nelson, Ronald Owen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sefkow, Adam [Univ. of Rochester, NY (United States); Shimada, Tsutomu [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Johnson, Randall Philip [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fernandez, Juan Carlos [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-24

    Images of the R2DTO resolution target were obtained during laser-driven-radiography experiments performed at the TRIDENT laser facility, and analysis of these images using the Bayesian Inference Engine (BIE) determines a most probable full-width half maximum (FWHM) spot size of 78 μm. However, significant uncertainty prevails due to variation in the measured detector blur. Propagating this uncertainty in detector blur through the forward model results in an interval of probabilistic ambiguity spanning approximately 35-195 μm when the laser energy impinges on a thick (1 mm) tantalum target. In other phases of the experiment, laser energy is deposited on a thin (~100 nm) aluminum target placed 250 μm ahead of the tantalum converter. When the energetic electron beam is generated in this manner, upstream from the bremsstrahlung converter, the inferred spot size shifts to a range of much larger values, approximately 270-600 μm FWHM. This report discusses methods applied to obtain these intervals as well as concepts necessary for interpreting the result within a context of probabilistic quantitative inference.

  5. Internal dose assessments: Uncertainty studies and update of ideas guidelines and databases within CONRAD project

    International Nuclear Information System (INIS)

    Marsh, J. W.; Castellani, C. M.; Hurtgen, C.; Lopez, M. A.; Andrasi, A.; Bailey, M. R.; Birchall, A.; Blanchardon, E.; Desai, A. D.; Dorrian, M. D.; Doerfel, H.; Koukouliou, V.; Luciani, A.; Malatova, I.; Molokanov, A.; Puncher, M.; Vrba, T.

    2008-01-01

    The work of Task Group 5.1 (uncertainty studies and revision of IDEAS guidelines) and Task Group 5.5 (update of IDEAS databases) of the CONRAD project is described. Scattering factor (SF) values (i.e. measurement uncertainties) have been calculated for different radionuclides and types of monitoring data using real data contained in the IDEAS Internal Contamination Database. Based upon this work and other published values, default SF values are suggested. Uncertainty studies have been carried out using both a Bayesian approach and a frequentist (classical) approach. The IDEAS guidelines have been revised in areas relating to the evaluation of an effective AMAD; guidance is given on evaluating wound cases with the NCRP wound model, and suggestions are made on the number and type of measurements required for dose assessment. (authors)

  6. Response of ENSO amplitude to global warming in CESM large ensemble: uncertainty due to internal variability

    Science.gov (United States)

    Zheng, Xiao-Tong; Hui, Chang; Yeh, Sang-Wook

    2018-06-01

    El Niño-Southern Oscillation (ENSO) is the dominant mode of variability in the coupled ocean-atmosphere system. Future projections of ENSO change under global warming are highly uncertain among models. In this study, the effect of internal variability on ENSO amplitude change in future climate projections is investigated based on a 40-member ensemble from the Community Earth System Model Large Ensemble (CESM-LE) project. A large uncertainty is identified among ensemble members due to internal variability. The inter-member diversity is associated with a zonal dipole pattern of sea surface temperature (SST) change in the mean state along the equator, which is similar to the second empirical orthogonal function (EOF) mode of tropical Pacific decadal variability (TPDV) in the unforced control simulation. The uncertainty in CESM-LE is comparable in magnitude to that among the models of the Coupled Model Intercomparison Project phase 5 (CMIP5), suggesting a substantial contribution of internal variability to the intermodel uncertainty in ENSO amplitude change. However, the causal links between changes in ENSO amplitude and the mean state are distinct between the CESM-LE and CMIP5 ensembles. The CESM-LE results indicate that an ensemble of at least 15 members is needed to separate the relative contributions of the forced response and internal variability to ENSO amplitude change over the twenty-first century.
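    The large-ensemble logic can be sketched in a few lines: the ensemble mean isolates the forced response, the inter-member spread estimates internal variability, and the required ensemble size follows from a detectability criterion. The 0.10 K signal, 0.15 K spread and the criterion below are invented for illustration, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "large ensemble": each member's ENSO-amplitude change is a
# common forced signal plus internally generated noise (all values invented).
forced_signal = 0.10   # K, hypothetical forced change in ENSO amplitude
internal_sd = 0.15     # K, spread due to internal variability
members = forced_signal + rng.normal(0.0, internal_sd, size=40)

forced_estimate = members.mean()         # ensemble mean isolates the forced part
internal_estimate = members.std(ddof=1)  # spread estimates internal variability

# Ensemble size needed so the standard error of the mean drops below half
# the forced signal (a simple detectability criterion, not the paper's).
n_needed = int(np.ceil((2 * internal_sd / forced_signal) ** 2))
print(forced_estimate, internal_estimate, n_needed)
```

With these invented numbers nine members would suffice; a weaker signal or larger internal spread pushes the required ensemble size up quadratically.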

  7. Inherently safe nuclear-driven internal combustion engines

    International Nuclear Information System (INIS)

    Alesso, P.; Chow, Tze-Show; Condit, R.; Heidrich, J.; Pettibone, J.; Streit, R.

    1991-01-01

    A family of nuclear-driven engines is described in which nuclear energy released by the fissioning of uranium or plutonium in a prompt-critical assembly is used to heat a working gas. Engine performance is modeled using a code that calculates hydrodynamics, fission energy production, and neutron transport self-consistently. Results are given demonstrating a large negative temperature coefficient that produces self-shutoff of energy production. Reduced fission product inventory and the self-shutoff provide inherent nuclear safety. It is expected that nuclear engine reactor units could be scaled upward from 100 MW. 7 refs., 3 figs

  8. SPRT Calibration Uncertainties and Internal Quality Control at a Commercial SPRT Calibration Facility

    Science.gov (United States)

    Wiandt, T. J.

    2008-06-01

    The Hart Scientific Division of the Fluke Corporation operates two accredited standard platinum resistance thermometer (SPRT) calibration facilities, one at the Hart Scientific factory in Utah, USA, and the other at a service facility in Norwich, UK. The US facility is accredited through National Voluntary Laboratory Accreditation Program (NVLAP), and the UK facility is accredited through UKAS. Both provide SPRT calibrations using similar equipment and procedures, and at similar levels of uncertainty. These uncertainties are among the lowest available commercially. To achieve and maintain low uncertainties, it is required that the calibration procedures be thorough and optimized. However, to minimize customer downtime, it is also important that the instruments be calibrated in a timely manner and returned to the customer. Consequently, subjecting the instrument to repeated calibrations or extensive repeated measurements is not a viable approach. Additionally, these laboratories provide SPRT calibration services involving a wide variety of SPRT designs. These designs behave differently, yet predictably, when subjected to calibration measurements. To this end, an evaluation strategy involving both statistical process control and internal consistency measures is utilized to provide confidence in both the instrument calibration and the calibration process. This article describes the calibration facilities, procedure, uncertainty analysis, and internal quality assurance measures employed in the calibration of SPRTs. Data will be reviewed and generalities will be presented. Finally, challenges and considerations for future improvements will be discussed.

  9. An approach to routine individual internal dose monitoring of the Object 'Shelter' personnel considering uncertainties

    International Nuclear Information System (INIS)

    Mel'nichuk, D.V.; Bondarenko, O.O.; Medvedjev, S.Yu.

    2002-01-01

    An approach to the organisation of routine individual internal dose monitoring of the personnel of the Object 'Shelter' is presented, which takes individualised uncertainties into account. In this respect, two methods of effective dose assessment based on bioassay are considered: (1) the traditional indirect method, in which the results of workplace monitoring are not taken into account, and (2) a combined method, in which both the results of bioassay measurements and workplace monitoring are considered.

  10. Uncertainties in carbon residence time and NPP-driven carbon uptake in terrestrial ecosystems of the conterminous USA: a Bayesian approach

    Directory of Open Access Journals (Sweden)

    Xuhui Zhou

    2012-10-01

    Carbon (C) residence time is one of the key factors that determine the capacity of ecosystem C storage. However, its uncertainties have not been well quantified, especially at regional scales. Assessing the uncertainties of C residence time is thus crucial for an improved understanding of terrestrial C sequestration. In this study, Bayesian inversion and the Markov chain Monte Carlo (MCMC) technique were applied to a regional terrestrial ecosystem (TECO-R) model to quantify C residence times and net primary productivity (NPP)-driven ecosystem C uptake, and to assess their uncertainties in the conterminous USA. Uncertainty was represented by the coefficient of variation (CV). Thirteen spatially distributed data sets of C pools and fluxes were used to constrain the TECO-R model for each of eight biomes. Our results showed that estimated ecosystem C residence times ranged from 16.6±1.8 yr (cropland) to 85.9±15.3 yr (evergreen needleleaf forest), with an average of 56.8±8.8 yr in the conterminous USA. The ecosystem C residence times and their CVs were spatially heterogeneous and varied with vegetation type and climate conditions. Large uncertainties appeared in the southern and eastern USA. Driven by NPP changes from 1982 to 1998, terrestrial ecosystems in the conterminous USA would absorb 0.20±0.06 Pg C yr−1. The spatial pattern of this uptake was closely related to the summer greenness map, with larger uptake in the central and southeast regions. The lack of data, or timescale mismatches between the available data and the estimated parameters, leads to uncertainties in the estimated C residence times, which together with initial NPP resulted in uncertainties in the estimated NPP-driven C uptake. The Bayesian approach with MCMC inversion provides an effective tool to estimate spatially distributed C residence times and assess their uncertainties in the conterminous USA.
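    The Bayesian inversion step can be sketched with a one-pool toy model in which the steady-state carbon pool equals NPP times the residence time; a plain Metropolis sampler then yields a posterior mean and CV for tau. All numbers are invented, and the actual TECO-R model is far richer than this:

```python
import numpy as np

rng = np.random.default_rng(2)

# One-pool toy model at steady state: C_pool = NPP * tau, so noisy pool
# observations constrain the residence time tau (all numbers invented).
npp = 0.6                                   # kg C m-2 yr-1
obs = rng.normal(npp * 50.0, 3.0, size=13)  # 13 noisy pool observations, kg C m-2
sigma = 3.0                                 # observation error, kg C m-2

def log_post(tau):
    if not 1.0 < tau < 200.0:               # flat prior on tau (yr)
        return -np.inf
    return -0.5 * np.sum((obs - npp * tau) ** 2) / sigma**2

# Plain Metropolis sampler
tau, lp, chain = 30.0, log_post(30.0), []
for _ in range(20_000):
    prop = tau + rng.normal(0.0, 2.0)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:  # accept/reject step
        tau, lp = prop, lp_prop
    chain.append(tau)
chain = np.array(chain[5_000:])              # discard burn-in
print(f"tau = {chain.mean():.1f} +/- {chain.std():.1f} yr "
      f"(CV = {chain.std() / chain.mean():.2f})")
```

The posterior spread of the chain is exactly the kind of CV the study maps across biomes, here recovered for a single synthetic "biome".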

  11. Determination of internal series resistance of PV devices: repeatability and uncertainty

    International Nuclear Information System (INIS)

    Trentadue, Germana; Pavanello, Diego; Salis, Elena; Field, Mike; Müllejans, Harald

    2016-01-01

    The calibration of photovoltaic devices requires the measurement of their current–voltage characteristics at standard test conditions (STC). As the latter can only be reached approximately, a curve translation is necessary, requiring, among other inputs, the internal series resistance of the photovoltaic device. Accurate and reliable determination of the series resistance is therefore important in measurement and test laboratories. This work follows standard IEC 60891 ed. 2 (2009) for the determination of the internal series resistance and investigates the repeatability and uncertainty of the result in three respects for a number of typical photovoltaic technologies. Firstly, the effect of varying device temperature on the determined series resistance is determined experimentally and compared to a theoretical derivation, showing agreement. It is found that the series resistance can be determined with an uncertainty of better than 5% if the device temperature is stable within ±0.1 °C, whereas the temperature range of ±2 °C allowed by the standard leads to much larger variations. Secondly, the repeatability of the series resistance determination with respect to noise in the current–voltage measurement is examined, yielding typical values of ±5%. Thirdly, the determination of the series resistance using three different experimental set-ups (solar simulators) shows agreement at the level of ±5% for crystalline silicon photovoltaic devices and deviations of up to 15% for thin-film devices. It is concluded that the internal series resistance of photovoltaic devices can be determined with an uncertainty of better than 10%. The influence of this uncertainty in series resistance on the electrical performance parameters of photovoltaic devices was estimated and showed a contribution of 0.05% for open-circuit voltage and 0.1% for maximum power. Furthermore, it is concluded that the range of device temperatures allowed during determination of the series resistance …
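    The Rs determination in IEC 60891 rests on translating an I-V curve measured at reduced irradiance to STC and choosing the Rs that makes it coincide with the curve measured at STC. A self-contained sketch of that idea, with a one-diode cell model standing in for measured curves; every parameter value here is invented:

```python
import numpy as np

# One-diode cell stands in for measured data (all values invented).
IPH, I0, NVT = 3.0, 1e-9, 0.0257  # photocurrent (A), saturation current (A), nkT/q (V)
RS_TRUE = 0.01                    # ohm, the value the procedure should recover

def iv_curve(g, rs=RS_TRUE):
    vd = np.linspace(0.0, 0.56, 800)             # diode junction voltage sweep
    i = g * IPH - I0 * (np.exp(vd / NVT) - 1.0)  # diode equation
    v = vd - i * rs                              # terminal voltage
    keep = i >= 0.0
    return v[keep], i[keep]

v_stc, i_stc = iv_curve(1.0)   # "measured" at STC irradiance
v_low, i_low = iv_curve(0.5)   # "measured" at half irradiance

def mismatch(rs):
    # IEC-style translation of the low-irradiance curve to STC:
    i_tr = i_low + IPH * 0.5             # current correction: Isc * (G2/G1 - 1)
    v_tr = v_low - rs * (i_tr - i_low)   # voltage correction with candidate Rs
    return np.sqrt(np.mean((np.interp(v_tr, v_stc, i_stc) - i_tr) ** 2))

grid = np.linspace(0.0, 0.03, 61)
rs_est = grid[np.argmin([mismatch(r) for r in grid])]
print(f"recovered Rs = {rs_est * 1000:.1f} mohm")
```

Scanning candidate Rs values and minimizing the mismatch between translated and measured curves recovers the true series resistance; in a real laboratory, noise in the I-V measurement limits the repeatability of this minimum, as the abstract quantifies.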

  12. A review of the uncertainties in internal radiation dose assessment for inhaled thorium

    International Nuclear Information System (INIS)

    Hewson, G.S.

    1989-01-01

    Present assessments of internal radiation dose to designated radiation workers in the mineral sands industry, calculated using ICRP 26/30 methodology and data, indicate that some workers approach and exceed statutory radiation dose limits. Such exposures are indicative of the need for a critical assessment of work and operational procedures and also of metabolic and dosimetric models used to estimate internal dose. This paper reviews past occupational exposure experience with inhaled thorium compounds, examines uncertainties in the underlying radiation protection models, and indicates the effect of alternative assumptions on the calculation of committed effective dose equivalent. The extremely low recommended inhalation limits for thorium in air do not appear to be well supported by studies on the health status of former thorium refinery workers who were exposed to thorium well in excess of presently accepted limits. The effect of cautious model assumptions is shown to result in internal dose assessments that could be up to an order of magnitude too high. It is concluded that the effect of such uncertainty constrains the usefulness of internal dose estimates as a reliable indicator of actual health risk. 26 refs., 5 figs., 3 tabs

  13. Uncertainty in Indian Ocean Dipole response to global warming: the role of internal variability

    Science.gov (United States)

    Hui, Chang; Zheng, Xiao-Tong

    2018-01-01

    The Indian Ocean Dipole (IOD) is one of the leading modes of interannual sea surface temperature (SST) variability in the tropical Indian Ocean (TIO). The response of the IOD to global warming is quite uncertain in climate model projections. In this study, the uncertainty in IOD change under global warming, especially that resulting from internal variability, is investigated based on the Community Earth System Model Large Ensemble (CESM-LE). For the IOD amplitude change, the inter-member uncertainty in CESM-LE is about 50% of the intermodel uncertainty in the phase 5 Coupled Model Intercomparison Project (CMIP5) multimodel ensemble, indicating the important role of internal variability in future IOD projections. In CESM-LE, both the ensemble mean and the spread in mean SST warming show a zonal positive IOD-like (pIOD-like) pattern in the TIO. This pIOD-like mean warming regulates the ocean-atmospheric feedbacks of the interannual IOD mode and weakens the skewness of the interannual variability. However, as the changes in oceanic and atmospheric feedbacks counteract each other, the inter-member variability in IOD amplitude change is not correlated with that of the mean state change. Instead, the ensemble spread in IOD amplitude change is correlated with that in ENSO amplitude change in CESM-LE, reflecting the close inter-basin relationship between the tropical Pacific and Indian Ocean in this model.

  14. Decision making under internal uncertainty: the case of multiple-choice tests with different scoring rules.

    Science.gov (United States)

    Bereby-Meyer, Yoella; Meyer, Joachim; Budescu, David V

    2003-02-01

    This paper assesses framing effects on decision making with internal uncertainty, i.e., partial knowledge, by focusing on examinees' behavior in multiple-choice (MC) tests with different scoring rules. In two experiments participants answered a general-knowledge MC test that consisted of 34 solvable and 6 unsolvable items. Experiment 1 studied two scoring rules involving Positive (only gains) and Negative (only losses) scores. Although answering all items was the dominating strategy for both rules, the results revealed a greater tendency to answer under the Negative scoring rule. These results are in line with the predictions derived from Prospect Theory (PT) [Econometrica 47 (1979) 263]. The second experiment studied two scoring rules, which allowed respondents to exhibit partial knowledge. Under the Inclusion-scoring rule the respondents mark all answers that could be correct, and under the Exclusion-scoring rule they exclude all answers that might be incorrect. As predicted by PT, respondents took more risks under the Inclusion rule than under the Exclusion rule. The results illustrate that the basic process that underlies choice behavior under internal uncertainty and especially the effect of framing is similar to the process of choice under external uncertainty and can be described quite accurately by PT.

  15. International cooperation behind the veil of uncertainty. The case of transboundary pollution

    International Nuclear Information System (INIS)

    Helm, C.

    1998-01-01

    The complexities of international environmental problems are only poorly understood. Hence, decision makers have to negotiate about abatement measures even though they do not know the 'true' model of the ecological system and have only a rough idea of the costs and benefits of their actions. The paper analyses to what extent this kind of 'model uncertainty' - where players have incomplete information not only about the payoff functions of the other players, but also about their own payoff function - affects the prospects of international cooperation. Using a simple game-theoretic model, it is shown how countries can use the veil of uncertainty to hide their distributional interests. The arguments are based on a deviation from the common prior assumption, which seems particularly questionable in a setting comprising various countries with different cultural and scientific backgrounds. Finally, the model proves useful for illustrating, quantitatively and qualitatively, the role of model uncertainty in the negotiations of the first Sulphur Protocol, signed to combat transboundary acidification. 26 refs

  16. Internal Transport Barrier Driven by Redistribution of Energetic Ions

    International Nuclear Information System (INIS)

    Wong, K.L.; Heidbrink, W.W.; Ruskov, E.; Petty, C.C.; Greenfield, C.M.; Nazikian, R.; Budny, R.

    2004-01-01

    Alfvén instabilities excited by energetic ions are used as a means to reduce the central magnetic shear in a tokamak via redistribution of energetic ions. When the central magnetic shear is low enough, ballooning modes become stable for any plasma pressure gradient and an internal transport barrier (ITB) with a steep pressure gradient can exist. This mechanism can sustain a steady-state ITB as demonstrated by experimental data from the DIII-D tokamak. It can also produce a shear in toroidal and poloidal plasma rotation. Possible application of this technique to use the energetic alpha particles for improvement of burning plasma performance is discussed

  17. Homogeneous internal wave turbulence driven by tidal flows

    Science.gov (United States)

    Le Reun, Thomas; Favier, Benjamin; Le Bars, Michael; Erc Fludyco Team

    2017-11-01

    We propose a novel investigation of the stability of strongly stratified planetary fluid layers undergoing periodic tidal distortion in the limit where rotational effects are negligible compared to buoyancy. With the help of a local model focusing on a small fluid area compared to the global layer, we find that periodic tidal distortion drives a parametric subharmonic resonance of internal waves. This instability saturates into a homogeneous internal wave turbulence pervading the whole fluid interior: energy is injected into the unstable waves, which then feed a succession of triadic resonances, also generating small spatial scales. As the timescale separation between the forcing and the Brunt-Väisälä frequency is increased, the temporal spectrum of this turbulence displays a -2 power law reminiscent of the Garrett and Munk spectrum measured in the oceans (Garrett & Munk 1979). Moreover, in this state, consisting of a superposition of waves in weak non-linear interaction, the mixing efficiency is increased compared to classical, Kolmogorov-like stratified turbulence. This study is of wide interest in geophysical fluid dynamics, ranging from oceanic turbulence and tidal heating in icy satellites to dynamo action in partially stratified planetary cores, as may be the case in the Earth. We acknowledge support from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation program (Grant Agreement No. 681835-FLUDYCO-ERC-2015-CoG).

  18. Qualification and application of nuclear reactor accident analysis code with the capability of internal assessment of uncertainty

    International Nuclear Information System (INIS)

    Borges, Ronaldo Celem

    2001-10-01

    This thesis presents an independent qualification of the CIAU code ('Code with the capability of Internal Assessment of Uncertainty'), which performs internal uncertainty evaluation with a thermal-hydraulic system code on a realistic basis. This is done by combining the uncertainty methodology UMAE ('Uncertainty Methodology based on Accuracy Extrapolation') with the RELAP5/Mod3.2 code, which allows uncertainty band estimates to be associated with the results of realistic code calculations, meeting the licensing requirements of safety analysis. The independent qualification is supported by RELAP5/Mod3.2 simulations of accident-condition tests of the LOBI experimental facility and of an event which occurred in the Angra 1 nuclear power plant, by comparison with measured results and by establishing uncertainty bands on calculated time trends of safety parameters. These bands have indeed enveloped the measured trends. The results of this independent qualification of CIAU make it possible to ascertain the adequate application of a systematic realistic code procedure to analyse accidents with uncertainties incorporated in the results, although there is an evident need to extend the uncertainty database. It has been verified that use of the code with this internal assessment of uncertainty is feasible in the design and licensing stages of an NPP. (author)

  19. Mammalian cycles: internally defined periods and interaction-driven amplitudes

    Directory of Open Access Journals (Sweden)

    LR Ginzburg

    2015-08-01

    The cause of mammalian cycles—the rise and fall of populations over a predictable period of time—has remained controversial since these patterns were first observed over a century ago. In spite of extensive work on observable mammalian cycles, the field has remained divided on the true cause, with most opinions attributing it either to predation or to intra-species mechanisms. Here we unite the eigenperiod hypothesis, which describes an internal, maternal-effect-based mechanism explaining the cycles’ periods, with a recent generalization explaining the amplitude of snowshoe hare cycles in northwestern North America based on initial predator abundance. By explaining the period and the amplitude of the cycle with separate mechanisms, a unified and consistent view of the causation of cycles is reached. Based on our suggested theory, we forecast the next snowshoe hare cycle (predicted peak in 2016) to be of extraordinarily low amplitude.

  20. Laser-driven polarized hydrogen and deuterium internal targets

    International Nuclear Information System (INIS)

    Jones, C.E.; Fedchak, J.A.; Kowalczyk, R.S.

    1995-01-01

    After completing comprehensive tests of the performance of the source with both hydrogen and deuterium gas, we began tests of a realistic polarized deuterium internal target. These tests involve characterizing the atomic polarization and dissociation fraction of atoms in a storage cell as a function of flow and magnetic field, and making direct measurements of the average nuclear tensor polarization of deuterium atoms in the storage cell. Transfer of polarization from the atomic electron to the nucleus as a result of D-D spin-exchange collisions was observed in deuterium, verifying calculations suggesting that high vector polarization in both hydrogen and deuterium can be obtained in a gas in spin temperature equilibrium without inducing RF transitions between the magnetic substates. In order to improve the durability of the system, the source glassware was redesigned to simplify construction and installation and eliminate stress points that led to frequent breakage. Improvements made to the nuclear polarimeter, which used the low-energy ³H(d,n)⁴He reaction to analyze the tensor polarization of the deuterium, included installing acceleration lenses constructed of wire mesh to improve pumping conductance, construction of a new holding field coil, and elimination of the Wien filter from the setup. These changes substantially simplified operation of the polarimeter and should have reduced depolarization in collisions with the wall. However, when a number of tests failed to show an improvement of the nuclear polarization, it was discovered that extended operation of the system with a section of teflon as a getter for potassium caused the dissociation fraction to decline with time under realistic operating conditions, suggesting that teflon may not be a suitable material to eliminate potassium from the target. We are replacing the teflon surfaces with drifilm-coated ones and plan to continue tests of the polarized internal target in this configuration

  1. Quantifying uncertainty due to internal variability using high-resolution regional climate model simulations

    Science.gov (United States)

    Gutmann, E. D.; Ikeda, K.; Deser, C.; Rasmussen, R.; Clark, M. P.; Arnold, J. R.

    2015-12-01

    The uncertainty in future climate predictions is as large as or larger than the mean climate change signal. As such, any prediction of future climate needs to incorporate and quantify the sources of this uncertainty. One of the largest sources is the internal, chaotic variability within the climate system itself. This variability has been approximated using the 30 ensemble members of the Community Earth System Model (CESM) large ensemble. Here we examine the wet and dry end members of this ensemble for cool-season precipitation in the Colorado Rocky Mountains with a set of high-resolution regional climate model simulations. We have used the Weather Research and Forecasting model (WRF) to simulate the periods 1990-2000, 2025-2035, and 2070-2080 on a 4-km grid. These simulations show that the broad patterns of change depicted in CESM are inherited by the high-resolution simulations; however, the differences in the height and location of the mountains in the WRF simulation, relative to the CESM simulation, mean that the location and magnitude of the precipitation changes are very different. We further show that high-resolution simulations with the Intermediate Complexity Atmospheric Research model (ICAR) predict a spatial pattern in the change signal similar to WRF's for these ensemble members. We then use ICAR to examine the rest of the CESM Large Ensemble, as well as the uncertainty in the regional climate model due to the choice of physics parameterizations.

  2. International target values 2010 for achievable measurement uncertainties in nuclear material accountancy

    International Nuclear Information System (INIS)

    Dias, Fabio C.; Almeida, Silvio G. de; Renha Junior, Geraldo

    2011-01-01

    The International Target Values (ITVs) are reasonable uncertainty estimates that can be used in judging the reliability of measurement techniques applied to industrial nuclear and fissile materials subject to accountancy and/or safeguards verification. In the absence of relevant experimental estimates, ITVs can also be used to select measurement techniques and calculate sample populations during the planning phase of verification activities. It is important to note that ITVs represent estimates of the 'state of the practice', which should be achievable under routine measurement conditions by both facility operators and safeguards inspectors, not only in the field but also in the laboratory. Tabulated values cover measurement methods used for the determination of the volume or mass of the nuclear material, for its elemental and isotopic assays, and for its sampling. The 2010 edition represents the sixth revision of the ITVs, issued by the International Atomic Energy Agency (IAEA) as a Safeguards Technical Report (STR-368). The first version was released as 'Target Values' in 1979 by the Working Group on Techniques and Standards for Destructive Analysis (WGDA) of the European Safeguards Research and Development Association (ESARDA) and focused on destructive analytical methods. In the latest 2010 revision, international standards for estimating and expressing uncertainties have been taken into account while maintaining a format that allows comparison with previous editions of the ITVs. Those standards have usually been applied in QC/QA programmes, as well as in the qualification of methods, techniques and instruments. Representatives of the Brazilian Nuclear Energy Commission (CNEN) and the Brazilian-Argentine Agency for Accounting and Control of Nuclear Materials (ABACC) have participated in previous Consultants Group Meetings, from the one convened to establish the first list of ITVs released in 1993 through subsequent revisions, including the latest one

  3. Deriving proper measurement uncertainty from Internal Quality Control data: An impossible mission?

    Science.gov (United States)

    Ceriotti, Ferruccio

    2018-03-30

    Measurement uncertainty (MU) is a "non-negative parameter characterizing the dispersion of the quantity values being attributed to a measurand, based on the information used". In the clinical laboratory the most convenient way to calculate MU is the "top down" approach based on Internal Quality Control (IQC) data. As indicated in the definition, MU depends on the information used for its calculation, so different estimates of MU can be obtained. The most problematic aspect is how to deal with bias: bias is difficult to detect and quantify, and it should be corrected for, with only the uncertainty of the correction itself included in the MU. Several approaches to calculating MU from Internal Quality Control data are presented. The minimum requirement is to use only the intermediate precision data, provided that they include at least 6 months of results obtained with a commutable quality control material at a concentration close to the clinical decision limit. This minimal approach is convenient for all those measurands that are used mainly for monitoring, or for which a reference measurement system does not exist, so that a reference for calculating the bias is lacking. Other formulas are presented and commented on, including the uncertainty of the value of the calibrator; the bias from a commutable certified reference material or from a material specifically prepared for trueness verification; and the bias derived from External Quality Assessment schemes or from the historical mean of the laboratory. MU is an important parameter, but a single, agreed-upon way to calculate it in a clinical laboratory is not yet available.
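    A minimal "top down" calculation, assuming a simple uncertainty budget in which long-term imprecision from IQC, calibrator uncertainty and bias-correction uncertainty combine in quadrature; all numbers below are invented for illustration:

```python
import math

# Components of a hypothetical top-down uncertainty budget, all in % CV.
cv_intermediate = 2.1  # >= 6 months of IQC results near the decision limit
u_calibrator = 0.9     # from the calibrator certificate (assumed value)
u_bias = 0.7           # uncertainty of the bias correction (assumed value)

# Combine in quadrature, then expand with k = 2 for ~95 % coverage.
u_combined = math.sqrt(cv_intermediate**2 + u_calibrator**2 + u_bias**2)
U_expanded = 2 * u_combined
print(f"MU = {U_expanded:.1f} % (k = 2)")
```

The minimal IQC-only estimate corresponds to keeping just the first term; the richer formulas discussed in the abstract add or substitute the other components.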

  4. Optimization of internal contamination monitoring programmes by studying uncertainties linked to dosimetric assessment

    International Nuclear Information System (INIS)

    Davesne, Estelle

    2010-01-01

    To optimise the protection of workers against ionizing radiation, the International Commission on Radiological Protection recommends the use of dose constraints and limits. To verify the compliance of the means of protection with these values when a risk of internal contamination exists, monitoring programmes consisting of periodic bioassay measurements are performed. However, uncertainty in the dose evaluation arises from the variability of the activity measurement and from incomplete knowledge of the exposure conditions. This uncertainty was taken into account by means of classical, Bayesian and possibilistic statistics. The methodology developed was applied to the evaluation of potential exposure during nuclear fuel preparation and mining, and to the analysis of the monitoring programme for workers purifying plutonium at the AREVA NC La Hague reprocessing plant. From the measurement decision threshold, the minimum dose detectable (MDD) by the programme at a given confidence level can be calculated with the OPSCI software. The MDD is shown to be a useful aid in the optimisation of monitoring programmes when seeking a compromise between their sensitivity and their cost. (author)
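    The MDD logic can be sketched in a few lines: an activity at the measurement decision threshold, observed at the end of a monitoring interval, is converted to an intake through the retention (or excretion) fraction m(T), and then to committed dose through a dose coefficient. All values below are invented, and the OPSCI software handles this far more completely:

```python
# Invented inputs for a periodic bioassay programme:
decision_threshold = 0.5  # Bq in the bioassay sample (measurement decision threshold)
m_T = 2.0e-4              # fraction of intake present in the sample at day T
e_inh = 1.1e-5            # Sv per Bq of intake (hypothetical dose coefficient)

intake_min = decision_threshold / m_T  # smallest detectable intake, Bq
mdd = intake_min * e_inh               # minimum detectable dose, Sv
print(f"MDD = {mdd * 1000:.1f} mSv per monitoring interval")
```

Shortening the monitoring interval raises m(T) and so lowers the MDD, but also raises the cost of the programme; that trade-off is exactly the sensitivity-versus-cost compromise discussed above.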

  5. Using internal discharge data in a distributed conceptual model to reduce uncertainty in streamflow simulations

    Science.gov (United States)

    Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.

    2011-12-01

    Distributed hydrological models are important tools in water management as they account for the spatial variability of hydrological data and can produce spatially distributed outputs; they can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem that such models force us to face: a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate and validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte-Carlo simulations, and the region containing the parameter sets considered behavioral according to two different criteria was delimited using the geometric concept of alpha-shapes. The discharge data from five internal sub-basins were used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? And, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
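    The value of the internal gauges can be illustrated with a GLUE-style sketch: of the Monte-Carlo parameter sets that reproduce the outlet discharge, only a subset also reproduces an internal sub-basin, so the behavioral region shrinks. The toy linear "model" below is invented and is not WASMOD:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy two-parameter "model" (invented, not WASMOD): one response for an
# internal sub-basin and one for the outlet.
def simulate(a, b):
    return 2.0 * a, 2.0 * a + 3.0 * b  # (internal discharge, outlet discharge)

obs_internal, obs_outlet = 4.0, 10.0
a, b = rng.uniform(0.0, 5.0, size=(2, 20_000))  # Monte-Carlo parameter sets
q_int, q_out = simulate(a, b)

# Behavioral filtering: outlet criterion only vs. outlet + internal gauge
ok_outlet = np.abs(q_out - obs_outlet) < 1.0
ok_both = ok_outlet & (np.abs(q_int - obs_internal) < 1.0)
print(f"behavioral sets: outlet only {ok_outlet.sum()}, "
      f"with internal gauge {ok_both.sum()}")
```

Because many (a, b) combinations match the outlet equally well, the internal gauge breaks the equifinality and retains only a fraction of the behavioral sets, which is the mechanism by which the paper reduces the outlet-discharge uncertainty.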

  6. A probabilistic approach to quantify the uncertainties in internal dose assessment using response surface and neural network

    International Nuclear Information System (INIS)

    Baek, M.; Lee, S.K.; Lee, U.C.; Kang, C.S.

    1996-01-01

A probabilistic approach is formulated to assess the internal radiation exposure following the intake of radioisotopes. It consists of four steps: (1) screening, (2) quantification of uncertainties, (3) propagation of uncertainties, and (4) analysis of output. The approach has been applied to Pu-induced internal dose assessment, with a multi-compartment dosimetric model used for internal transport. Surrogate models of the original system are constructed using response surfaces and neural networks, and the results of these surrogate models are compared with those of the original model. Each surrogate model approximates the original model well. Uncertainty and sensitivity analyses of the model parameters are carried out in this process. Dominant contributors to the dose in each organ are identified, and the results show that this approach can serve as a good tool for assessing internal radiation exposure
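The surrogate idea in this record, replacing the original dosimetric model with a cheap response surface for uncertainty propagation, can be illustrated with a toy one-compartment model. The dose model, input ranges and quadratic feature set below are hypothetical, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def dose_model(intake, lam, f):
    """Toy one-compartment internal-dose model (a stand-in for the
    multi-compartment biokinetic model of the paper)."""
    t = 365.0  # integration time, days
    return intake * f * (1 - np.exp(-lam * t)) / lam

# Training sample over the uncertain inputs
X = np.column_stack([rng.uniform(100, 1000, 200),   # intake (Bq)
                     rng.uniform(1e-3, 1e-2, 200),  # clearance rate (1/day)
                     rng.uniform(0.1, 0.5, 200)])   # transfer fraction
y = dose_model(*X.T)

def features(X):
    """Full quadratic basis for the response surface."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Propagate input uncertainty through the cheap surrogate, compare to original
Xmc = np.column_stack([rng.uniform(100, 1000, 10000),
                       rng.uniform(1e-3, 1e-2, 10000),
                       rng.uniform(0.1, 0.5, 10000)])
approx = features(Xmc) @ coef
exact = dose_model(*Xmc.T)
rel_err = np.sqrt(np.mean((approx - exact) ** 2)) / exact.mean()
print("relative RMS error of surrogate:", rel_err)
```

A neural-network surrogate would replace the quadratic basis with a trained regressor but play the same role in the propagation step.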

  7. ENTERPRISE OPERATION PLANNING IN THE CONDITIONS OF RISK AND UNCERTAINTY IN THE EXTERNAL AND INTERNAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Titov V. V.

    2017-09-01

Optimizing enterprise activity planning while taking into account the risk and uncertainty of the external and internal environment is a complex scientific and methodological problem, and its solution is important for planning practice; the relevance of this research topic is therefore beyond doubt. Planning is based on a multilevel system of models. At the top level, the achievement of key strategic indicators is ensured by the development and implementation of innovations, mainly related to planning the release of new high-tech products. However, it is at this level that risks and uncertainties have the greatest impact on the planning processes for the development, production and marketing of new products. The scientific literature proposes using stochastic graphs with returns for this purpose, an idea also supported in this work; implementing it, however, requires additional methodological development and quantitative calculation. The coordination of strategic decisions with tactical plans is based on the idea of offsetting, in tactical planning, the economic and other risks associated with the activity of the enterprise by creating stochastic reserves based on the implementation of additional innovations that ensure above-target sales volumes, profits and other indicators of the strategic plan. The organization of operational production management is represented as an iterative, rolling process (reducing risks in production), which is realized taking into account the limitations of tactical control.

  8. International Target Values 2010 for Measurement Uncertainties in Safeguarding Nuclear Materials

    Energy Technology Data Exchange (ETDEWEB)

Zhao, M.; Penkin, M.; Norman, C.; Balsley, S. [IAEA, Vienna (Austria)]; and others

    2012-12-15

This issue of the International Target Values (ITVs) represents the sixth revision, following the first release of such tables issued in 1979 by the ESARDA Working Group on Destructive Analysis (WGDA). The ITVs are uncertainties to be considered in judging the reliability of analytical techniques applied to industrial nuclear and fissile material subject to safeguards verification. The tabulated values represent estimates of the 'state of the practice' which should be achievable under routine measurement conditions. The most recent standard conventions for representing uncertainty have been considered, while maintaining a format that allows comparison with previous releases of the ITVs. The present report explains why target values are needed, how the concept evolved and how the values relate to the operator's and inspector's measurement systems. The ITVs-2010 are intended to be used by plant operators and safeguards organizations as a reference for the quality of measurements achievable in nuclear material accountancy, and for planning purposes. The report suggests that the use of ITVs can be beneficial for statistical inferences regarding the significance of operator-inspector differences whenever valid performance values are not available.

  9. International target values 2010 for achievable measurement uncertainties in nuclear material accountancy

    Energy Technology Data Exchange (ETDEWEB)

    Dias, Fabio C., E-mail: fabio@ird.gov.b [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil); Almeida, Silvio G. de; Renha Junior, Geraldo, E-mail: silvio@abacc.org.b, E-mail: grenha@abacc.org.b [Agencia Brasileiro-Argentina de Contabilidade e Controle de Materiais Nucleares (ABACC), Rio de Janeiro, RJ (Brazil)

    2011-07-01

The International Target Values (ITVs) are reasonable uncertainty estimates that can be used in judging the reliability of measurement techniques applied to industrial nuclear and fissile materials subject to accountancy and/or safeguards verification. In the absence of relevant experimental estimates, ITVs can also be used to select measurement techniques and calculate sample populations during the planning phase of verification activities. It is important to note that ITVs represent estimates of the 'state of the practice' which should be achievable under routine measurement conditions affecting both facility operators and safeguards inspectors, not only in the field but also in the laboratory. Tabulated values cover measurement methods used for the determination of volume or mass of the nuclear material, for its elemental and isotopic assays, and for its sampling. The 2010 edition represents the sixth revision of the International Target Values (ITVs), issued by the International Atomic Energy Agency (IAEA) as a Safeguards Technical Report (STR-368). The first version was released as 'Target Values' in 1979 by the Working Group on Techniques and Standards for Destructive Analysis (WGDA) of the European Safeguards Research and Development Association (ESARDA) and focused on destructive analytical methods. In the latest 2010 revision, international standards in estimating and expressing uncertainties have been considered while maintaining a format that allows comparison with the previous editions of the ITVs. Those standards have usually been applied in QC/QA programmes, as well as in the qualification of methods, techniques and instruments. Representatives of the Brazilian Nuclear Energy Commission (CNEN) and the Brazilian-Argentine Agency for Accounting and Control of Nuclear Materials (ABACC) participated in previous Consultants Group Meetings, from the one convened to establish the first list of ITVs released in 1993 through subsequent revisions.

  10. Routine internal- and external-quality control data in clinical laboratories for estimating measurement and diagnostic uncertainty using GUM principles.

    Science.gov (United States)

    Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar

    2012-05-01

Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This creates important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. Measurement as well as diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L was estimated for creatinine concentrations below 120 μmol/L, and of 10% for concentrations above 120 μmol/L. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. The diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L for creatinine concentrations below 100 μmol/L and 14% for concentrations above 100 μmol/L.
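The two-component Nordtest-style combination described above, within-laboratory reproducibility plus a bias component, expanded with a coverage factor k = 2, can be reproduced in a few lines. All input values below are made-up illustrations, not the creatinine data of the paper:

```python
import math

# Within-laboratory reproducibility from internal QC (relative, %)
u_Rw = 3.0

# Bias component from external QC / proficiency-testing results (%)
bias = 2.0     # observed mean bias
s_bias = 2.5   # SD of the bias estimates across EQA rounds
n = 6          # number of EQA rounds
u_cref = 1.5   # uncertainty of the reference value

# Uncertainty of the bias component (Nordtest-style combination)
u_bias = math.sqrt(bias**2 + (s_bias / math.sqrt(n))**2 + u_cref**2)

# Combined standard uncertainty and expanded uncertainty (k = 2, ~95 %)
u_c = math.sqrt(u_Rw**2 + u_bias**2)
U = 2 * u_c
print(f"u(bias) = {u_bias:.2f} %, combined u = {u_c:.2f} %, expanded U = {U:.1f} %")
```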

  11. Development of an expert system for the taking into account of uncertainties in the monitoring of internal contaminations

    International Nuclear Information System (INIS)

    Davesne, E.; Blanchardon, E.; Casanova, P.; Chojnacki, E.; Paquet, F.

    2010-01-01

Internal contaminations may result from occupational exposure; they can be monitored by anthropo-radiometric and radio-toxicological measurements, which are interpreted in terms of incorporated activity and effective dose by means of biokinetic and dosimetric models. Despite the existence of standards, some uncertainty in the dosimetric interpretation of radio-toxicological measurements may remain. The authors report the development of a software tool (the OPSCI code) that takes into account the uncertainties involved in worker internal dosimetry, the calculation of the minimum detectable dose related to an exposure, and the development of a monitoring programme

  12. International target values 2000 for measurement uncertainties in safeguarding nuclear materials

    International Nuclear Information System (INIS)

    Aigner, H.; Binner, R.; Kuhn, E.

    2001-01-01

The IAEA has prepared a revised and updated version of the International Target Values (ITVs) for uncertainty components in measurements of nuclear material. The ITVs represent uncertainties to be considered in judging the reliability of analytical techniques applied to industrial nuclear and fissile material subject to safeguards verification. The tabulated values represent estimates of the 'state of the practice' which ought to be achievable under routine conditions by adequately equipped, experienced laboratories. The ITVs 2000 are intended to be used by plant operators and safeguards organizations as a reference for the quality of measurements achievable in nuclear material accountancy, and for planning purposes. The IAEA prepared a draft technical report presenting the proposed ITVs 2000, and in April 2000 the chairmen or officers of the panels and organizations listed below were invited to co-author the report and to submit the draft to their panels and organizations for discussion: Euratom Safeguards Inspectorate; ESARDA Working Group on Destructive Analysis; ESARDA Working Group on Non-Destructive Analysis; Institute of Nuclear Material Management; Japanese Expert Group on ITV-2000; ISO Working Group on Analyses in Spent Fuel Reprocessing; ISO Working Group on Analyses in Uranium Fuel Fabrication; ISO Working Group on Analyses in MOX Fuel Fabrication; Agencia Brasileno-Argentina de Contabilidad y Control de Materiales Nucleares (ABACC). Comments from these groups were received and incorporated into the final version of the document, completed in April 2001. The ITVs 2000 represent target standard uncertainties expressing the precision achievable under stipulated conditions. These conditions typically fall into one of the two following categories: 'repeatability conditions', normally encountered during the measurements done within one inspection period; or 'reproducibility conditions', involving additional sources of measurement variability such as

  13. On the rejection of internal and external disturbances in a wind energy conversion system with direct-driven PMSG.

    Science.gov (United States)

    Li, Shengquan; Zhang, Kezhao; Li, Juan; Liu, Chao

    2016-03-01

This paper deals with a critical issue in wind energy conversion systems (WECS) based on a direct-driven permanent magnet synchronous generator (PMSG): the rejection of the lumped disturbance, including system uncertainties in the internal dynamics and unknown external forces. To track the motor speed in real time and capture the maximum power simultaneously, a maximum power point tracking strategy is proposed based on active disturbance rejection control (ADRC) theory. In real applications, system inertia, drive torque and other parameters change over a wide range with variations of the disturbances and wind speeds, which substantially degrades the performance of the WECS. The ADRC design must incorporate the available model information into an extended state observer (ESO) to compensate for the lumped disturbance efficiently. Based on this principle, a model-compensation ADRC is proposed in this paper. A simulation study is conducted to evaluate the performance of the proposed control strategy. It shows that the effect of the lumped disturbance is compensated in a more effective way than with the traditional ADRC approach. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
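The ESO idea at the heart of ADRC, estimating the lumped disturbance as an extended state, can be sketched for a toy first-order plant. The plant, observer gains and disturbance signal below are illustrative assumptions, not the WECS model of the paper:

```python
import numpy as np

# Toy plant: dx/dt = -a*x + b0*u + d(t), with a and d(t) unknown to the observer
a_true, b0, dt = 2.0, 1.0, 1e-3
disturbance = lambda t: 0.5 * np.sin(3 * t)

# Second-order linear ESO: estimates the state x and the lumped
# disturbance f = -a*x + d as an extended state.
wo = 50.0                      # observer bandwidth
beta1, beta2 = 2 * wo, wo**2   # gains placing both observer poles at -wo

x, xhat, fhat = 0.0, 0.0, 0.0
errs = []
for k in range(20000):
    t = k * dt
    u = 1.0                    # constant input for this open-loop test
    # plant step (forward Euler)
    x += dt * (-a_true * x + b0 * u + disturbance(t))
    # ESO step: drive both estimates with the output error
    e = x - xhat
    xhat += dt * (fhat + b0 * u + beta1 * e)
    fhat += dt * (beta2 * e)
    errs.append(abs((-a_true * x + disturbance(t)) - fhat))

final_err = float(np.mean(errs[-1000:]))
print("mean disturbance-estimation error over the last second:", final_err)
```

In a full ADRC loop, `fhat` would be fed back (u = (u0 - fhat)/b0) to cancel the lumped disturbance; the model-compensation variant additionally seeds the ESO with the known part of the dynamics.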

  14. After the Hague, Bonn and Marrakech: uncertainties on the future international market of emission permits

    International Nuclear Information System (INIS)

    Kitous, A.; Criqui, P.; Blanchard, O.

    2002-01-01

The purpose of this article is to present a step-by-step economic assessment of the successive developments of the climate change negotiations, from the Kyoto Protocol in 1997 to the agreement reached in Marrakech during the seventh Conference of the Parties (COP 7) in November 2001. The analysis covers the international market of emission rights, a key mechanism of the Protocol, whose purpose is to facilitate the Parties' compliance with their commitments by introducing flexibility to improve the economic efficiency of emission reduction. However, it now appears that despite the Marrakech agreement of November 2001, the system is weakened by the withdrawal of the USA, decided by President G.W. Bush in March 2001 following COP 6 in The Hague, and by a potential excess of permits due to the economic recession in transition countries since the early nineties ('hot air'). As things stand, the establishment of the market between the countries taking part in the process will undoubtedly require some management of this hot air between the transition countries (Eastern Europe and the former USSR) and the other Annex B Parties still involved in the process. The uncertainties weighing on the future market of emission permits strengthen the strategic significance of implementing effective reduction policies within those regions, and particularly within Europe. (authors)

  15. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    International Nuclear Information System (INIS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-01-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has

  16. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach
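The iterative ensemble Kalman update used in this framework can be sketched on a toy linear problem. The observation operator, dimensions and noise level below are arbitrary stand-ins for the RANS setting, where the parameters would be Reynolds-stress perturbations and the observations sparse velocity data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear "model": observed QoIs y = H @ theta, where theta plays the
# role of the Reynolds-stress perturbation parameters.
H = rng.normal(size=(4, 8))          # sparse observations: 4 obs, 8 params
theta_true = rng.normal(size=8)
obs_err = 0.05
y_obs = H @ theta_true + rng.normal(0, obs_err, size=4)

# Prior ensemble encodes the empirical prior knowledge
N = 100
ens = rng.normal(0.0, 1.0, size=(8, N))

for _ in range(5):                   # iterative ensemble Kalman updates
    Y = H @ ens                      # predicted observations per member
    A = ens - ens.mean(1, keepdims=True)
    Yp = Y - Y.mean(1, keepdims=True)
    C_ty = A @ Yp.T / (N - 1)        # parameter-observation covariance
    C_yy = Yp @ Yp.T / (N - 1) + obs_err**2 * np.eye(4)
    K = C_ty @ np.linalg.inv(C_yy)   # Kalman gain
    perturbed = y_obs[:, None] + rng.normal(0, obs_err, size=(4, N))
    ens = ens + K @ (perturbed - Y)  # perturbed-observation update

post_misfit = float(np.linalg.norm(H @ ens.mean(1) - y_obs))
print("posterior observation misfit:", post_misfit)
```

With fewer observations than parameters the problem is ill-posed, as in the paper; the ensemble prior regularizes it and the posterior spread quantifies the remaining model-form uncertainty.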

  17. ICG: a wiki-driven knowledgebase of internal control genes for RT-qPCR normalization.

    Science.gov (United States)

    Sang, Jian; Wang, Zhennan; Li, Man; Cao, Jiabao; Niu, Guangyi; Xia, Lin; Zou, Dong; Wang, Fan; Xu, Xingjian; Han, Xiaojiao; Fan, Jinqi; Yang, Ye; Zuo, Wanzhu; Zhang, Yang; Zhao, Wenming; Bao, Yiming; Xiao, Jingfa; Hu, Songnian; Hao, Lili; Zhang, Zhang

    2018-01-04

    Real-time quantitative PCR (RT-qPCR) has become a widely used method for accurate expression profiling of targeted mRNA and ncRNA. Selection of appropriate internal control genes for RT-qPCR normalization is an elementary prerequisite for reliable expression measurement. Here, we present ICG (http://icg.big.ac.cn), a wiki-driven knowledgebase for community curation of experimentally validated internal control genes as well as their associated experimental conditions. Unlike extant related databases that focus on qPCR primers in model organisms (mainly human and mouse), ICG features harnessing collective intelligence in community integration of internal control genes for a variety of species. Specifically, it integrates a comprehensive collection of more than 750 internal control genes for 73 animals, 115 plants, 12 fungi and 9 bacteria, and incorporates detailed information on recommended application scenarios corresponding to specific experimental conditions, which, collectively, are of great help for researchers to adopt appropriate internal control genes for their own experiments. Taken together, ICG serves as a publicly editable and open-content encyclopaedia of internal control genes and accordingly bears broad utility for reliable RT-qPCR normalization and gene expression characterization in both model and non-model organisms. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  18. Proceedings of the international symposium on acceleration-driven transmutation systems and Asia ADS network initiative

    International Nuclear Information System (INIS)

    Oigawa, Hiroyuki

    2003-09-01

    An International Symposium on 'Accelerator-Driven Transmutation Systems and Asia ADS Network Initiative' was held on March 24 and 25, 2003 at Gakushi-Kaikan, Tokyo, hosted by Japan Atomic Energy Research Institute, Kyoto University, Osaka University, High Energy Accelerator Research Organization and Tokyo Institute of Technology. The objectives of this symposium are to make participants acquainted with the current status and future plans for research and development (R and D) of ADS in the world and to enhance the initiation of an international collaborative network for ADS in Asia. This report records the papers and the materials of 15 presentations in the symposium. On the first day of the symposium, current activities for R and D of ADS were presented from United States, Europe, Japan, Korea, and China. On the second day, R and D activities in the fields of accelerator and nuclear physics were presented. After these presentations, a panel discussion was organized with regard to the prospective international collaboration and multidisciplinary synergy effect, which are essential to manage various technological issues encountered in R and D stage of ADS. Through the discussion, common understanding was promoted concerning the importance of establishing international network. It was agreed to establish the international network for scientific information exchange among Asian countries including Japan, Korea, China, and Vietnam in view of the future international collaboration in R and D of ADS. (author)

  19. Lattice Boltzmann equation calculation of internal, pressure-driven turbulent flow

    International Nuclear Information System (INIS)

    Hammond, L A; Halliday, I; Care, C M; Stevens, A

    2002-01-01

We describe a mixing-length extension of the lattice Boltzmann approach to the simulation of an incompressible liquid in turbulent flow. The method uses a simple, adaptable closure algorithm to bound the lattice Boltzmann fluid, incorporating a law-of-the-wall. The test application, an internal, pressure-driven and smooth duct flow, recovers correct velocity profiles for Reynolds numbers up to 1.25 × 10⁵. In addition, the Reynolds-number dependence of the friction factor in the smooth-wall branch of the Moody chart is correctly recovered. The method promises a straightforward extension to other curves of the Moody chart and to cylindrical pipe flow
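For reference, the smooth-wall branch of the Moody chart against which such simulations are checked can be evaluated from the Colebrook relation in its smooth-pipe limit. A small fixed-point solver (an illustration, not part of the paper's method):

```python
import math

def colebrook_smooth(re, tol=1e-10):
    """Friction factor on the smooth-wall branch of the Moody chart:
    1/sqrt(f) = -2 log10(2.51 / (Re sqrt(f))), solved by fixed-point iteration."""
    f = 0.02  # initial guess
    for _ in range(100):
        f_new = (-2 * math.log10(2.51 / (re * math.sqrt(f)))) ** -2
        if abs(f_new - f) < tol:
            return f_new
        f = f_new
    return f

for re in (1e4, 1e5, 1.25e5):
    print(f"Re = {re:.2e}  f = {colebrook_smooth(re):.4f}")
```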

  20. Validation and uncertainty quantification of Fuego simulations of calorimeter heating in a wind-driven hydrocarbon pool fire.

    Energy Technology Data Exchange (ETDEWEB)

    Domino, Stefan Paul; Figueroa, Victor G.; Romero, Vicente Jose; Glaze, David Jason; Sherman, Martin P.; Luketa-Hanlin, Anay Josephine

    2009-12-01

The objective of this work is to perform an uncertainty quantification (UQ) and model validation analysis of simulations of tests in the cross-wind test facility (XTF) at Sandia National Laboratories. In these tests, a calorimeter was subjected to a fire and its thermal response was measured via thermocouples. The UQ and validation analysis pertains to the experimental and predicted thermal response of the calorimeter. The calculations were performed using Sierra/Fuego/Syrinx/Calore, an Advanced Simulation and Computing (ASC) code capable of predicting an object's thermal response to a fire environment. Based on the validation results at eight diversely representative thermocouple (TC) locations on the calorimeter, the predicted calorimeter temperatures effectively bound the experimental temperatures. This post-validates Sandia's first integrated use of fire modeling with thermal response modeling and associated uncertainty estimates in an abnormal-thermal QMU analysis.

  1. A Data-Driven Stochastic Reactive Power Optimization Considering Uncertainties in Active Distribution Networks and Decomposition Method

    DEFF Research Database (Denmark)

    Ding, Tao; Yang, Qingrun; Yang, Yongheng

    2018-01-01

    To address the uncertain output of distributed generators (DGs) for reactive power optimization in active distribution networks, the stochastic programming model is widely used. The model is employed to find an optimal control strategy with minimum expected network loss while satisfying all......, in this paper, a data-driven modeling approach is introduced to assume that the probability distribution from the historical data is uncertain within a confidence set. Furthermore, a data-driven stochastic programming model is formulated as a two-stage problem, where the first-stage variables find the optimal...... control for discrete reactive power compensation equipment under the worst probability distribution of the second stage recourse. The second-stage variables are adjusted to uncertain probability distribution. In particular, this two-stage problem has a special structure so that the second-stage problem...
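The worst-case expectation over a confidence set of probability distributions, the core of the data-driven two-stage model described above, can be sketched with a discrete scenario set and an L1 ambiguity ball. The loss function, radius and scenario data below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

# Historical DG output scenarios (hypothetical data, per-unit)
hist = rng.beta(2, 5, size=500)
scenarios, counts = np.unique(np.round(hist, 1), return_counts=True)
p_hat = counts / counts.sum()                  # empirical distribution

def loss(x, s):
    """Stand-in second-stage cost for control x under DG output s."""
    return (x - s) ** 2 + 0.1 * x

def worst_case_cost(x, p_hat, radius=0.1):
    """Worst expected cost over all distributions within an L1 ball
    around the empirical distribution (the 'confidence set'):
    mass moves from the cheapest scenarios to the most expensive one."""
    costs = loss(x, scenarios)
    p = p_hat.copy()
    worst = int(np.argmax(costs))
    budget = radius / 2                        # movable probability mass
    for i in np.argsort(costs):                # take from cheapest first
        if i == worst:
            continue
        take = min(p[i], budget)
        p[i] -= take
        p[worst] += take
        budget -= take
        if budget <= 0:
            break
    return float(p @ costs)

# First stage: choose the control minimizing the worst-case expected cost
grid = np.linspace(0, 1, 101)
best = float(grid[int(np.argmin([worst_case_cost(x, p_hat) for x in grid]))])
print("robust first-stage setting:", best)
```

In the paper the first stage sets discrete compensation equipment and the second stage adjusts to the realized distribution; the greedy mass-shift above is the exact maximizer for an L1 ambiguity set over a finite scenario list.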

  2. Technical Note: Atmospheric CO2 inversions on the mesoscale using data-driven prior uncertainties: methodology and system evaluation

    Science.gov (United States)

    Kountouris, Panagiotis; Gerbig, Christoph; Rödenbeck, Christian; Karstens, Ute; Koch, Thomas Frank; Heimann, Martin

    2018-03-01

Atmospheric inversions are widely used to optimize surface carbon fluxes on a regional scale using information from atmospheric CO2 dry mole fractions. In many studies the prior flux uncertainty applied in the inversion scheme does not directly reflect the true flux uncertainties but is used to regularize the inverse problem. Here, we implement an inversion scheme using the Jena inversion system and apply a prior flux error structure derived from a model-data residual analysis at high spatial and temporal resolution over a full-year period in the European domain. We analyzed the performance of the inversion system with a synthetic experiment, in which the flux constraint is derived following the same residual analysis but applied to the model-model mismatch. The synthetic study showed quite good agreement between posterior and true fluxes on European, country, annual and monthly scales. Posterior monthly, country-aggregated fluxes improved their correlation coefficient with the known truth by 7% relative to the prior estimates, with a mean correlation of 0.92. The SD ratio between posterior and reference was also reduced by 33% compared with that between prior and reference, with a mean value of 1.15. We identified the temporal and spatial scales on which the inversion system maximizes the derived information; monthly temporal scales at around 200 km spatial resolution seem to maximize the information gain.
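The two skill metrics reported above, correlation with the known truth and the SD ratio, are straightforward to compute. The synthetic flux series below are illustrative only, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)

ref = rng.normal(size=48)                  # "true" monthly fluxes (synthetic)
prior = ref + rng.normal(0, 1.0, size=48)  # noisy prior estimate
post = ref + rng.normal(0, 0.4, size=48)   # inversion-improved estimate

def skill(est, ref):
    """Correlation with the reference and SD ratio (est/ref)."""
    r = float(np.corrcoef(est, ref)[0, 1])
    sd_ratio = float(est.std() / ref.std())
    return r, sd_ratio

r_prior, sd_prior = skill(prior, ref)
r_post, sd_post = skill(post, ref)
print(f"prior: r = {r_prior:.2f}, SD ratio = {sd_prior:.2f}")
print(f"post : r = {r_post:.2f}, SD ratio = {sd_post:.2f}")
```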

  3. Management of internal communication in times of uncertainty; Gestion de la comunicacion interna en tiempos de incertidumbre

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez de la Gala, F.

    2014-07-01

Garona has had strong media coverage since 2009. The plant's continuity process is highly controversial, which has generated increased uncertainty for workers and their families and affected motivation. Although internal communication has sought to manage these effects within the company, the speed at which outside information spreads has made this mission complex. The regulatory body has taken an interest in the potential impact on safety culture, which marks a significant difference from other industrial sectors. (Author)

  4. Proceedings of the international symposium on future of accelerator-driven system

    International Nuclear Information System (INIS)

    Sugawara, Takanori

    2012-11-01

The International Symposium on “Future of Accelerator-Driven System” was held on 29 February 2012 at Gakushi-Kaikan, Tokyo, Japan, hosted by the Nuclear Science and Engineering Directorate of JAEA (Japan Atomic Energy Agency) and the J-PARC (Japan Proton Accelerator Research Complex) Center. The objectives of the symposium were to acquaint participants with the current status and future plans for research and development of ADS in the world and to discuss international collaboration on ADS and P and T (Partitioning and Transmutation) technology. About 100 scientists participated in the symposium, from Belgium, China, France, India, Italy, Japan, Korea and Mongolia. In the morning session, current ADS R and D activities in Japan were reported. In the afternoon session, current R and D activities were reported from China, Korea, India, Belgium and the EU. A panel discussion on international collaboration for ADS took place in the final session: two keynote speakers presented their outlooks on the topics, and seven panelists and the audience discussed them. (author)

  5. Coupling ontology driven semantic representation with multilingual natural language generation for tuning international terminologies.

    Science.gov (United States)

    Rassinoux, Anne-Marie; Baud, Robert H; Rodrigues, Jean-Marie; Lovis, Christian; Geissbühler, Antoine

    2007-01-01

    The importance of clinical communication between providers, consumers and others, as well as the requisite for computer interoperability, strengthens the need for sharing common accepted terminologies. Under the directives of the World Health Organization (WHO), an approach is currently being conducted in Australia to adopt a standardized terminology for medical procedures that is intended to become an international reference. In order to achieve such a standard, a collaborative approach is adopted, in line with the successful experiment conducted for the development of the new French coding system CCAM. Different coding centres are involved in setting up a semantic representation of each term using a formal ontological structure expressed through a logic-based representation language. From this language-independent representation, multilingual natural language generation (NLG) is performed to produce noun phrases in various languages that are further compared for consistency with the original terms. Outcomes are presented for the assessment of the International Classification of Health Interventions (ICHI) and its translation into Portuguese. The initial results clearly emphasize the feasibility and cost-effectiveness of the proposed method for handling both a different classification and an additional language. NLG tools, based on ontology driven semantic representation, facilitate the discovery of ambiguous and inconsistent terms, and, as such, should be promoted for establishing coherent international terminologies.

  6. Atmospheric CO2 inversions on the mesoscale using data-driven prior uncertainties: quantification of the European terrestrial CO2 fluxes

    Science.gov (United States)

    Kountouris, Panagiotis; Gerbig, Christoph; Rödenbeck, Christian; Karstens, Ute; Koch, Thomas F.; Heimann, Martin

    2018-03-01

    Optimized biogenic carbon fluxes for Europe were estimated from high-resolution regional-scale inversions, utilizing atmospheric CO2 measurements at 16 stations for the year 2007. Additional sensitivity tests with different data-driven error structures were performed. As the atmospheric network is rather sparse and consequently contains large spatial gaps, we use a priori biospheric fluxes to further constrain the inversions. The biospheric fluxes were simulated by the Vegetation Photosynthesis and Respiration Model (VPRM) at a resolution of 0.1° and optimized against eddy covariance data. Overall we estimate an a priori uncertainty of 0.54 GtC yr-1 related to the poor spatial representation between the biospheric model and the ecosystem sites. The sink estimated from the atmospheric inversions for the area of Europe (as represented in the model domain) ranges between 0.23 and 0.38 GtC yr-1 (0.39 and 0.71 GtC yr-1 up-scaled to geographical Europe). This is within the range of posterior flux uncertainty estimates of previous studies using ground-based observations.
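    Flux estimates of this kind typically come from a Bayesian synthesis inversion, in which prior fluxes are updated with atmospheric observations through a linearized transport operator. The following is a minimal, self-contained sketch of that update step; the matrices, dimensions, and error magnitudes are illustrative only and are not taken from the study above.

```python
import numpy as np

def bayesian_inversion(H, y, x_prior, B, R):
    """One analytical Bayesian update: posterior mean and covariance of fluxes.

    H: transport operator mapping fluxes to concentrations
    B: prior flux error covariance; R: observation error covariance
    """
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    x_hat = x_prior + K @ (y - H @ x_prior)        # posterior mean
    A_post = B - K @ H @ B                         # posterior covariance
    return x_hat, A_post

# Toy setup: 2 flux regions observed by 3 stations (all numbers illustrative)
H = np.array([[1.0, 0.2],
              [0.3, 1.0],
              [0.5, 0.5]])
x_true = np.array([0.3, -0.1])          # "true" fluxes, GtC/yr
y = H @ x_true                          # noise-free synthetic observations
x_prior = np.zeros(2)
B = 0.54 ** 2 * np.eye(2)               # broad prior (cf. the 0.54 GtC/yr above)
R = 0.01 * np.eye(3)                    # observation error covariance
x_hat, A_post = bayesian_inversion(H, y, x_prior, B, R)
```

    With informative observations, the posterior mean moves from the prior toward the true fluxes and the posterior variances shrink below the prior ones, which is exactly the sense in which the atmospheric network "constrains" the biospheric prior.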

  7. Collaborative Project: The problem of bias in defining uncertainty in computationally enabled strategies for data-driven climate model development. Final Technical Report.

    Energy Technology Data Exchange (ETDEWEB)

    Huerta, Gabriel [Univ. of New Mexico, Albuquerque, NM (United States)

    2016-05-10

    The objective of the project is to develop strategies for better representing scientific sensibilities within statistical measures of model skill that can then be used within a Bayesian statistical framework for data-driven climate model development and improved measures of model scientific uncertainty. One of the thorny issues in model evaluation is quantifying the effect of biases on climate projections. While any bias is undesirable, only those biases that affect feedbacks affect scatter in climate projections. The effort at the University of Texas is to analyze previously calculated ensembles of CAM3.1 with perturbed parameters to discover how biases affect projections of global warming. The hypothesis is that compensating errors in the control model can be identified by their effect on a combination of processes, and that developing metrics that are sensitive to dependencies among state variables would provide a way to select versions of climate models that may reduce scatter in climate projections. Gabriel Huerta at the University of New Mexico is responsible for developing statistical methods for evaluating these field dependencies. The UT effort will incorporate these developments into MECS, which is a set of python scripts being developed at the University of Texas for managing the workflow associated with data-driven climate model development over HPC resources. This report reflects the main activities at the University of New Mexico, where the PI (Huerta) and the Postdocs (Nosedal, Hattab and Karki) worked on the project.

  8. Derivation of RCM-driven potential evapotranspiration for hydrological climate change impact analysis in Great Britain: a comparison of methods and associated uncertainty in future projections

    Directory of Open Access Journals (Sweden)

    C. Prudhomme

    2013-04-01

    Potential evapotranspiration (PET) is the water that would be lost by plants through evaporation and transpiration if water were not limited in the soil, and it is commonly used in conceptual hydrological modelling in the calculation of runoff production and hence river discharge. Future changes of PET are likely to be as important as changes in precipitation patterns in determining changes in river flows. However, PET is not calculated routinely by climate models, so it must be derived independently when the impact of climate change on river flow is to be assessed. This paper compares PET estimates from 12 equations of different complexity, driven by the Hadley Centre's HadRM3-Q0 model outputs representative of 1961–1990, with MORECS PET, a product used as reference PET in Great Britain. The results show that the FAO56 version of the Penman–Monteith equation best reproduces the spatial and seasonal variability of MORECS PET across GB when driven by HadRM3-Q0 estimates of relative humidity, total cloud, wind speed and linearly bias-corrected mean surface temperature. This suggests that potential biases in the HadRM3-Q0 climate do not result in significant biases when the physically based FAO56 equation is used. Percentage changes in PET between the 1961–1990 and 2041–2070 time slices were also calculated for each of the 12 PET equations from HadRM3-Q0. Results show a large variation in the magnitude (and sometimes direction) of changes estimated from different PET equations, with the Turc, Jensen–Haise and calibrated Blaney–Criddle methods systematically projecting the largest increases across GB for all months and Priestley–Taylor, Makkink, and Thornthwaite showing the smallest changes. We recommend the use of the FAO56 equation as, when driven by HadRM3-Q0 climate data, it best reproduces the reference MORECS PET across Great Britain for the reference period 1961–1990. Further, the future changes of PET estimated by FAO56 are within
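    The FAO56 Penman–Monteith equation recommended above combines a radiation term and an aerodynamic term. A minimal Python sketch of the daily reference form follows; the input values are illustrative, and the simplified vapour-pressure treatment (using mean temperature and mean relative humidity rather than daily extremes) is an assumption of this sketch, not of the paper.

```python
import math

def fao56_et0(t_mean, u2, rn, g, rh_mean, z=0.0):
    """FAO56 Penman-Monteith reference evapotranspiration, mm/day.

    t_mean: mean air temperature (deg C); u2: 2-m wind speed (m/s);
    rn, g: net radiation and soil heat flux (MJ m-2 day-1);
    rh_mean: mean relative humidity (%); z: station elevation (m).
    """
    es = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))  # sat. vapour pressure, kPa
    ea = es * rh_mean / 100.0                                  # actual vapour pressure, kPa
    delta = 4098.0 * es / (t_mean + 237.3) ** 2                # slope of saturation curve
    p = 101.3 * ((293.0 - 0.0065 * z) / 293.0) ** 5.26         # atmospheric pressure, kPa
    gamma = 0.665e-3 * p                                       # psychrometric constant
    rad = 0.408 * delta * (rn - g)                             # radiation term
    aero = gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea) # aerodynamic term
    return (rad + aero) / (delta + gamma * (1.0 + 0.34 * u2))

et0 = fao56_et0(t_mean=16.9, u2=2.1, rn=13.3, g=0.14, rh_mean=70.0)
```

    For the illustrative inputs this returns a value of a few mm/day; because the aerodynamic term scales with wind speed and vapour pressure deficit, increasing u2 under unsaturated air raises the estimate.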

  9. Parametric instability and wave turbulence driven by tidal excitation of internal waves

    Science.gov (United States)

    Le Reun, Thomas; Favier, Benjamin; Le Bars, Michael

    2018-04-01

    We investigate the stability of stratified fluid layers undergoing homogeneous and periodic tidal deformation. We first introduce a local model which allows us to study velocity and buoyancy fluctuations in a Lagrangian domain periodically stretched and sheared by the tidal base flow. While retaining only the key physical ingredients, such a model efficiently simulates planetary regimes where tidal amplitudes and dissipation are small. With this model, we prove that tidal flows are able to drive parametric subharmonic resonances of internal waves, in a way reminiscent of the elliptical instability in rotating fluids. The growth rates computed via Direct Numerical Simulations (DNS) are in very good agreement with WKB analysis and Floquet theory. We also investigate the turbulence driven by this instability mechanism. With spatio-temporal analysis, we show that it is a weak internal wave turbulence occurring at small Froude and buoyancy Reynolds numbers. When the gap between the excitation and the Brunt–Väisälä frequencies is increased, the frequency spectrum of this wave turbulence displays a -2 power law reminiscent of the high-frequency branch of the Garrett and Munk spectrum (Garrett & Munk 1979) which has been measured in the oceans. In addition, we find that the mixing efficiency is altered compared to what is computed in the context of DNS of stratified turbulence excited at small Froude and large buoyancy Reynolds numbers and is consistent with a superposition of waves.

  10. The same as it never was? Uncertainty and the changing contours of international law

    NARCIS (Netherlands)

    Kessler, Oliver

    2011-01-01

    International law has changed significantly since the end of the Cold War. As long as the international was thought to be populated by sovereign states predominantly, international law was conceived of as a means for peaceful dispute settlement. That is: the reference to state sovereignty not only

  11. The Effects of Data-Driven Learning upon Vocabulary Acquisition for Secondary International School Students in Vietnam

    Science.gov (United States)

    Karras, Jacob Nolen

    2016-01-01

    Within the field of computer assisted language learning (CALL), scant literature exists regarding the effectiveness and practicality for secondary students to utilize data-driven learning (DDL) for vocabulary acquisition. In this study, there were 100 participants, who had a mean age of thirteen years, and were attending an international school in…

  12. Validation and calculation of uncertainties of the method of determination of creatinine in urine in internal dosimetry

    International Nuclear Information System (INIS)

    Sierra Barcedo, I.; Hernandez Gonzalez, C.; Benito Alonso, P.; Lopez Zarza, C.

    2011-01-01

    This paper describes the methodology used to validate the spectrophotometric technique for quantifying creatinine content in urine samples from workers exposed to a risk of internal contamination, and the study of all sources of uncertainty that influence the process. This technique is used to normalize the amount of urine in a sample to the 24-h urinary excretion, as required for dosimetric purposes, and also serves as a criterion for acceptance or rejection of the urine specimens received by the laboratory.

  13. A method for calculating Bayesian uncertainties on internal doses resulting from complex occupational exposures

    International Nuclear Information System (INIS)

    Puncher, M.; Birchall, A.; Bull, R. K.

    2012-01-01

    Estimating uncertainties on doses from bioassay data is of interest in epidemiology studies that estimate cancer risk from occupational exposures to radionuclides. Bayesian methods provide a logical framework to calculate these uncertainties. However, occupational exposures often consist of many intakes, and this can make the Bayesian calculation computationally intractable. This paper describes a novel strategy for increasing the computational speed of the calculation by simplifying the intake pattern to a single composite intake, termed the complex intake regime (CIR). In order to assess whether this approximation is accurate and fast enough for practical purposes, the method is implemented by the Weighted Likelihood Monte Carlo Sampling (WeLMoS) method and evaluated by comparing its performance with a Markov Chain Monte Carlo (MCMC) method. The MCMC method gives the full solution (all intakes are independent), but is very computationally intensive to apply routinely. Posterior distributions of model parameter values, intakes and doses are calculated for a representative sample of plutonium workers from the United Kingdom Atomic Energy cohort using the WeLMoS method with the CIR and the MCMC method. The distributions are in good agreement: posterior means and Q0.025 and Q0.975 quantiles are typically within 20 %. Furthermore, the WeLMoS method using the CIR converges quickly: a typical case history takes around 10-20 min on a fast workstation, whereas the MCMC method took around 12 hours. The advantages and disadvantages of the method are discussed. (authors)
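    The weighted-likelihood idea can be illustrated with a deliberately simplified sketch: a single intake, a one-compartment exponential excretion model standing in for a real biokinetic model, and lognormal measurement errors. Every ingredient here (the retention function, the prior, the error model, all numbers) is an illustrative assumption, not the ICRP biokinetics or the actual WeLMoS implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def retention(t, lam=0.02):
    """Toy one-compartment model: urinary excretion per unit intake, t in days."""
    return np.exp(-lam * t)

# Synthetic bioassay record: three urine measurements with lognormal error
t_obs = np.array([30.0, 120.0, 365.0])
intake_true = 100.0                       # Bq, the "unknown" intake
gsd = 1.2                                 # geometric SD of measurement error
m_obs = intake_true * retention(t_obs) * rng.lognormal(0.0, np.log(gsd), t_obs.size)

# Weighted likelihood sampling: draw candidate intakes from a broad lognormal
# prior, then weight each sample by the likelihood of the bioassay data.
samples = rng.lognormal(np.log(50.0), 1.0, 100_000)
pred = samples[:, None] * retention(t_obs)[None, :]
log_w = -0.5 * np.sum(((np.log(m_obs) - np.log(pred)) / np.log(gsd)) ** 2, axis=1)
w = np.exp(log_w - log_w.max())           # rescale for numerical stability
post_mean = np.sum(w * samples) / np.sum(w)
```

    Because the samples are drawn from the prior, the importance weights reduce to the likelihood alone; the weighted mean then approximates the posterior mean intake, from which a committed dose would follow via dose coefficients.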

  14. Directed International Technological Change and Climate Policy: New Methods for Identifying Robust Policies Under Conditions of Deep Uncertainty

    Science.gov (United States)

    Molina-Perez, Edmundo

    It is widely recognized that international environmental technological change is key to reducing the rapidly rising greenhouse gas emissions of emerging nations. In 2010, the United Nations Framework Convention on Climate Change (UNFCCC) Conference of the Parties (COP) agreed to the creation of the Green Climate Fund (GCF). This new multilateral organization has been created with the collective contributions of COP members, and has been tasked with directing over USD 100 billion per year towards investments that can enhance the development and diffusion of clean energy technologies in both advanced and emerging nations (Helm and Pichler, 2015). The landmark agreement arrived at in COP 21 has reaffirmed the key role that the GCF plays in enabling climate mitigation, as it is now necessary to align large scale climate financing efforts with the long-term goals agreed at Paris 2015. This study argues that, because of the incomplete understanding of the mechanics of international technological change, the multiplicity of policy options and, ultimately, the presence of deep uncertainty about climate and technological change, climate financing institutions such as the GCF require new analytical methods for designing long-term robust investment plans. Motivated by these challenges, this dissertation shows that the application of new analytical methods, such as Robust Decision Making (RDM) and Exploratory Modeling (Lempert, Popper and Bankes, 2003), to the study of international technological change and climate policy provides useful insights that can be used for designing a robust architecture of international technological cooperation for climate change mitigation. For this study I developed an exploratory dynamic integrated assessment model (EDIAM) which is used as the scenario generator in a large computational experiment. The scope of the experimental design considers an ample set of climate and technological scenarios. These scenarios combine five sources of uncertainty

  15. Sepsis in Internal Medicine wards: current knowledge, uncertainties and new approaches for management optimization.

    Science.gov (United States)

    Zaccone, Vincenzo; Tosoni, Alberto; Passaro, Giovanna; Vallone, Carla Vincenza; Impagnatiello, Michele; Li Puma, Domenica Donatella; De Cosmo, Salvatore; Landolfi, Raffaele; Mirijello, Antonio

    2017-11-01

    Sepsis represents a global health problem in terms of morbidity, mortality, and social and economic costs. Although usually managed in Intensive Care Units, sepsis has shown an increased prevalence in Internal Medicine wards in the last decade. This is substantially due to the ageing of the population and to multi-morbidity. These characteristics represent both a risk factor for sepsis and a relative contra-indication for admission to Intensive Care Units. Although there is a lack of literature on the management of sepsis in Internal Medicine, the outcome of these patients seems to be gradually improving. This is due to Internists' increased adherence to guidelines and "bundles". The routine use of the SOFA score helps physicians in the definition of septic patients, even if the optimal score is still to come. Point-of-care ultrasonography, lactate, procalcitonin and beta-D-glucan are of help for treatment optimization. The purpose of this narrative review is to focus on the management of sepsis in Internal Medicine departments, particularly on crucial concepts regarding diagnosis, risk assessment and treatment. Key messages: Sepsis is a life-threatening organ dysfunction caused by a dysregulated host response to infection. The prevalence of sepsis is constantly increasing, affecting more hospital patients than any other disease. At least half of patients affected by sepsis are admitted to Internal Medicine wards. Adherence to guidelines, routine use of clinical and lab scores, and point-of-care ultrasonography are of help for early recognition of septic patients and treatment optimization.

  16. Foxes, hedgehogs, and greenhouse governance: Knowledge, uncertainty, and international policy-making in a warming World

    International Nuclear Information System (INIS)

    Michel, David

    2009-01-01

    Global environmental challenges like greenhouse warming are characterized by profound uncertainties about the workings of complex systems, high stakes as to the costs and benefits of various possible actions, and important differences concerning the values that should shape public choices, confounding ready resolution by conventional decision-making procedures. So-called adaptive or reflexive governance strategies provide policy-makers an alternative framework for tackling the greenhouse problem. Adaptive governance employs deliberate experimentation and continuous learning-by-doing to test and adjust ongoing policy responses. Yet pursuing such approaches poses particular challenges to global climate cooperation. In an increasingly interdependent world, coordinating multiple parties experimentally adopting different climate measures could prove contentious. Unequivocal policy lessons may be difficult to draw and apply. Timely collective revisions to ongoing policies may prove more difficult still to define and agree. Advocates must engage these issues directly and develop means of addressing them if adaptive governance approaches are to allow policy-makers to formulate better strategies for combating climate change. (author)

  17. FRACTURE MECHANICS UNCERTAINTY ANALYSIS IN THE RELIABILITY ASSESSMENT OF THE REACTOR PRESSURE VESSEL: (2D SUBJECTED TO INTERNAL PRESSURE

    Directory of Open Access Journals (Sweden)

    Entin Hartini

    2016-06-01

    The reactor pressure vessel (RPV) is a pressure boundary in the PWR type reactor which serves to confine radioactive material during the chain reaction process. The integrity of the RPV must be guaranteed both in normal operation and in accident conditions. In analyzing the integrity of the RPV, especially the behavior of cracks which can lead to a break of the reactor pressure vessel, a fracture mechanics approach should be taken for the assessment. The uncertainty of the inputs used in the assessment, such as mechanical properties and the physical environment, means that the assessment is not sufficient if it is performed only by a deterministic approach; therefore, an uncertainty approach should be applied. The aim of this study is to analyze the uncertainty of fracture mechanics calculations in evaluating the reliability of the PWR's reactor pressure vessel. The random character of the input quantities was generated using probabilistic principles and theories. The fracture mechanics analysis is solved by the Finite Element Method (FEM) with MSC MARC software, while the uncertainty input analysis is done based on probability density functions with Latin Hypercube Sampling (LHS) using a Python script. The output of MSC MARC is a J-integral value, which is converted into a stress intensity factor (SIF) for evaluating the reliability of the 2D RPV. From the result of the calculation, it can be concluded that the SIF from the probabilistic method reached the limit value of fracture toughness earlier than the SIF from the deterministic method. The SIF generated by the probabilistic method is 105.240 MPa m0.5, while the SIF generated by the deterministic method is 100.876 MPa m0.5. Keywords: Uncertainty analysis, fracture mechanics, LHS, FEM, reactor pressure vessels
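    Latin Hypercube Sampling, as used above, stratifies each input distribution so that even modest sample counts cover the input space evenly. The sketch below propagates two uniformly distributed inputs through the textbook mode-I stress intensity factor K = Y·σ·√(πa); the input ranges and geometry factor are illustrative assumptions, not the paper's RPV model (which obtained SIFs from FEM-computed J-integrals).

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, d):
    """n samples in d dimensions: one point per equal-probability stratum per dim."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n   # stratified uniforms
    for j in range(d):
        rng.shuffle(u[:, j])                               # decorrelate the columns
    return u

def sif(sigma, a, y_geom=1.12):
    """Textbook mode-I stress intensity factor K = Y * sigma * sqrt(pi * a)."""
    return y_geom * sigma * np.sqrt(np.pi * a)

n = 10_000
u = latin_hypercube(n, 2)
sigma = 90.0 + 20.0 * u[:, 0]        # membrane stress, MPa (uniform 90-110)
a = (8.0 + 4.0 * u[:, 1]) * 1e-3     # crack depth, m (uniform 8-12 mm)

k = sif(sigma, a)                    # probabilistic SIF sample, MPa m^0.5
k_det = sif(100.0, 10.0e-3)          # deterministic SIF at mean inputs
k_95 = np.quantile(k, 0.95)          # probabilistic 95th percentile
```

    The upper percentile of the probabilistic SIF exceeds the single deterministic value, which is the mechanism by which a probabilistic assessment reaches the fracture-toughness limit "earlier" than a deterministic one.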

  18. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...
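    To see where calibration uncertainty comes from, it helps to sketch the classical inverse-prediction formula that approaches like the one above aim to improve on: the standard uncertainty of a concentration read off an OLS calibration line has contributions from replicate noise, calibration noise, and leverage. The data below are synthetic and Pb-like only; the unweighted OLS model and all numbers are illustrative assumptions.

```python
import numpy as np

def inverse_prediction(x, y, y0, m=1):
    """OLS calibration y = a + b*x; returns the predicted x0 for a mean signal
    y0 of m replicates, and its standard uncertainty (classical formula)."""
    n = x.size
    b, a = np.polyfit(x, y, 1)                              # slope, intercept
    s = np.sqrt(np.sum((y - (a + b * x)) ** 2) / (n - 2))   # residual SD
    sxx = np.sum((x - x.mean()) ** 2)
    x0 = (y0 - a) / b
    u_x0 = (s / b) * np.sqrt(1.0 / m + 1.0 / n
                             + (y0 - y.mean()) ** 2 / (b ** 2 * sxx))
    return x0, u_x0

# Synthetic Pb-like calibration line (mg/L vs arbitrary intensity units)
x = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
y = 3.0 + 120.0 * x + np.array([0.4, -0.6, 0.3, -0.2, 0.5, -0.4])
x0, u_x0 = inverse_prediction(x, y, y0=250.0)
```

    The leverage term grows as y0 moves away from the centre of the calibration range, which is one reason a single pooled formula can misstate the uncertainty at the extremes.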

  19. Smooth Adaptive Internal Model Control Based on U Model for Nonlinear Systems with Dynamic Uncertainties

    Directory of Open Access Journals (Sweden)

    Li Zhao

    2016-01-01

    Full Text Available An improved smooth adaptive internal model control based on U model control method is presented to simplify modeling structure and parameter identification for a class of uncertain dynamic systems with unknown model parameters and bounded external disturbances. Differing from traditional adaptive methods, the proposed controller can simplify the identification of time-varying parameters in presence of bounded external disturbances. Combining the small gain theorem and the virtual equivalent system theory, learning rate of smooth adaptive internal model controller has been analyzed and the closed-loop virtual equivalent system based on discrete U model has been constructed as well. The convergence of this virtual equivalent system is proved, which further shows the convergence of the complex closed-loop discrete U model system. Finally, simulation and experimental results on a typical nonlinear dynamic system verified the feasibility of the proposed algorithm. The proposed method is shown to have lighter identification burden and higher control accuracy than the traditional adaptive controller.

  20. The MDE Diploma: First International Postgraduate Specialization in Model-Driven Engineering

    Science.gov (United States)

    Cabot, Jordi; Tisi, Massimo

    2011-01-01

    Model-Driven Engineering (MDE) is changing the way we build, operate, and maintain our software-intensive systems. Several projects using MDE practices are reporting significant improvements in quality and performance but, to be able to handle these projects, software engineers need a set of technical and interpersonal skills that are currently…

  1. 46 CFR 32.50-35 - Remote manual shutdown for internal combustion engine driven cargo pump on tank vessels-TB/ALL.

    Science.gov (United States)

    2010-10-01

    Equipment for Cargo Handling, § 32.50-35: Remote manual shutdown for internal combustion engine driven cargo pump on tank vessels—TB/ALL. (a) Any tank vessel which is equipped with an internal combustion engine...

  2. Specifications of the International Atomic Energy Agency's international project on safety assessment driven radioactive waste management solutions

    International Nuclear Information System (INIS)

    Ghannadi, M.; Asgharizadeh, F.; Assadi, M. R.

    2008-01-01

    Radioactive waste is produced in the generation of nuclear power and in the production and use of radioactive materials in industry, research, and medicine. Nuclear waste management facilities need to perform a safety assessment in order to ensure the safety of a facility. Nuclear safety assessment is a structured and systematic way of examining a proposed facility, process, operation or activity. From the nuclear waste management point of view, safety assessment is a process used to evaluate the safety of radioactive waste management and disposal facilities. In this regard, the International Atomic Energy Agency plans to implement an international project with the cooperation of some member states. The Safety Assessment Driving Radioactive Waste Management Solutions project is an international programme of work to examine international approaches to safety assessment in aspects of predisposal radioactive waste management, including waste conditioning and storage. This study describes the rationale, common aspects, scope, objectives, work plan and anticipated outcomes of the project with reference to International Atomic Energy Agency documents, such as the International Atomic Energy Agency Safety Standards, as well as the Safety Assessment Driving Radioactive Waste Management Solutions project reports

  3. Verification of gyrokinetic particle simulation of current-driven instability in fusion plasmas. I. Internal kink mode

    Energy Technology Data Exchange (ETDEWEB)

    McClenaghan, J.; Lin, Z.; Holod, I.; Deng, W.; Wang, Z. [University of California, Irvine, California 92697 (United States)

    2014-12-15

    The gyrokinetic toroidal code (GTC) capability has been extended for simulating internal kink instability with kinetic effects in toroidal geometry. The global simulation domain covers the magnetic axis, which is necessary for simulating current-driven instabilities. GTC simulation in the fluid limit of the kink modes in cylindrical geometry is verified by benchmarking with a magnetohydrodynamic eigenvalue code. Gyrokinetic simulations of the kink modes in the toroidal geometry find that ion kinetic effects significantly reduce the growth rate even when the banana orbit width is much smaller than the radial width of the perturbed current layer at the mode rational surface.

  4. 9th International Bielefeld Conference 2009: Upgrading the eLibrary: Enhanced Information Services Driven by Technology and Economics

    Directory of Open Access Journals (Sweden)

    Almuth Gastinger

    2009-10-01

    This article reports on the 9th International Bielefeld Conference ‘Upgrading the eLibrary: Enhanced Information Services Driven by Technology and Economics’, 3-5 February 2009, in Bielefeld, Germany. The conference focused on future challenges for libraries regarding the development of information services and infrastructures that meet the changing needs of scholarly communication, collaboration (e-science) and publication (open access), as well as new requirements regarding teaching and learning (virtual learning spaces). In addition, attention was paid to economic conditions and the strategic positioning of libraries as a general framework for information services.

  5. DREISS: Using State-Space Models to Infer the Dynamics of Gene Expression Driven by External and Internal Regulatory Networks

    Science.gov (United States)

    Gerstein, Mark

    2016-01-01

    Gene expression is controlled by the combinatorial effects of regulatory factors from different biological subsystems such as general transcription factors (TFs), cellular growth factors and microRNAs. A subsystem’s gene expression may be controlled by its internal regulatory factors, exclusively, or by external subsystems, or by both. It is thus useful to distinguish the degree to which a subsystem is regulated internally or externally–e.g., how non-conserved, species-specific TFs affect the expression of conserved, cross-species genes during evolution. We developed a computational method (DREISS, dreiss.gerteinlab.org) for analyzing the Dynamics of gene expression driven by Regulatory networks, both External and Internal based on State Space models. Given a subsystem, the “state” and “control” in the model refer to its own (internal) and another subsystem’s (external) gene expression levels. The state at a given time is determined by the state and control at a previous time. Because typical time-series data do not have enough samples to fully estimate the model’s parameters, DREISS uses dimensionality reduction, and identifies canonical temporal expression trajectories (e.g., degradation, growth and oscillation) representing the regulatory effects emanating from various subsystems. To demonstrate capabilities of DREISS, we study the regulatory effects of evolutionarily conserved vs. divergent TFs across distant species. In particular, we applied DREISS to the time-series gene expression datasets of C. elegans and D. melanogaster during their embryonic development. We analyzed the expression dynamics of the conserved, orthologous genes (orthologs), seeing the degree to which these can be accounted for by orthologous (internal) versus species-specific (external) TFs. We found that between two species, the orthologs have matched, internally driven expression patterns but very different externally driven ones. This is particularly true for genes with

  6. DREISS: Using State-Space Models to Infer the Dynamics of Gene Expression Driven by External and Internal Regulatory Networks.

    Directory of Open Access Journals (Sweden)

    Daifeng Wang

    2016-10-01

    Gene expression is controlled by the combinatorial effects of regulatory factors from different biological subsystems such as general transcription factors (TFs), cellular growth factors and microRNAs. A subsystem's gene expression may be controlled by its internal regulatory factors, exclusively, or by external subsystems, or by both. It is thus useful to distinguish the degree to which a subsystem is regulated internally or externally, e.g., how non-conserved, species-specific TFs affect the expression of conserved, cross-species genes during evolution. We developed a computational method (DREISS, dreiss.gerteinlab.org) for analyzing the Dynamics of gene expression driven by Regulatory networks, both External and Internal, based on State Space models. Given a subsystem, the "state" and "control" in the model refer to its own (internal) and another subsystem's (external) gene expression levels. The state at a given time is determined by the state and control at a previous time. Because typical time-series data do not have enough samples to fully estimate the model's parameters, DREISS uses dimensionality reduction, and identifies canonical temporal expression trajectories (e.g., degradation, growth and oscillation) representing the regulatory effects emanating from various subsystems. To demonstrate the capabilities of DREISS, we study the regulatory effects of evolutionarily conserved vs. divergent TFs across distant species. In particular, we applied DREISS to the time-series gene expression datasets of C. elegans and D. melanogaster during their embryonic development. We analyzed the expression dynamics of the conserved, orthologous genes (orthologs), seeing the degree to which these can be accounted for by orthologous (internal) versus species-specific (external) TFs. We found that between two species, the orthologs have matched, internally driven expression patterns but very different externally driven ones. This is particularly true for genes with
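    A state-space model of the kind DREISS describes can be written x_{t+1} = A x_t + B u_t, where x is the internal subsystem's (reduced) expression state and u the external subsystem's levels. The sketch below fits A and B by least squares on synthetic, noise-free data; the dimensions and dynamics are invented for illustration and the dimensionality-reduction step DREISS performs first is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy time series: X holds the internal subsystem's expression state,
# U the external subsystem's levels acting as the control input.
T, dx, du = 50, 3, 2
A_true = np.array([[0.9, 0.05, 0.0],
                   [0.0, 0.8,  0.1],
                   [0.0, 0.0,  0.7]])
B_true = rng.normal(0.0, 0.3, (dx, du))
U = rng.normal(0.0, 1.0, (T, du))
X = np.zeros((T + 1, dx))
X[0] = rng.normal(0.0, 1.0, dx)
for t in range(T):
    X[t + 1] = A_true @ X[t] + B_true @ U[t]

# Least-squares fit of x_{t+1} = A x_t + B u_t, separating the internal (A)
# from the external (B) contribution to the dynamics.
Z = np.hstack([X[:-1], U])                        # regressors, shape (T, dx+du)
theta, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)
A_est, B_est = theta[:dx].T, theta[dx:].T
```

    The eigenvalues of the fitted A then classify the internally driven trajectories (real values below 1 give decay, complex pairs give oscillation), which is the spirit of the canonical temporal trajectories the abstract mentions.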

  7. Quantification and Minimization of Uncertainties of Internal Target Volume for Stereotactic Body Radiation Therapy of Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Ge Hong [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Department of Radiation Oncology, Henan Cancer Hospital, the Affiliated Cancer Hospital of Zhengzhou University, Henan (China); Cai Jing; Kelsey, Chris R. [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Yin Fangfang, E-mail: fangfang.yin@duke.edu [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States)

    2013-02-01

    Purpose: To quantify uncertainties in delineating an internal target volume (ITV) and to understand how these uncertainties may be individually minimized for stereotactic body radiation therapy (SBRT) of early stage non-small cell lung cancer (NSCLC). Methods and Materials: Twenty patients with NSCLC who were undergoing SBRT were imaged with free-breathing 3-dimensional computed tomography (3DCT) and 10-phase 4-dimensional CT (4DCT) for delineating the gross tumor volume GTV_3D and ITV_10Phase (ITV3). The maximum intensity projection (MIP) CT was also calculated from the 10-phase 4DCT for contouring ITV_MIP (ITV1). Then, ITV_COMB (ITV2), ITV_10Phase+GTV3D (ITV4), and ITV_10Phase+ITVCOMB (ITV5) were generated by combining ITV_MIP and GTV_3D; ITV_10Phase and GTV_3D; and ITV_10Phase and ITV_COMB, respectively. All 6 volumes (GTV_3D and ITV1 to ITV5) were delineated in the same lung window by the same radiation oncologist. The percentage of volume difference (PVD) between any 2 different volumes was determined and was correlated to the effective tumor diameter (ETD), the tumor motion range R_3D, and the amplitude variability of the recorded breathing signal (v) to assess their volume variations. Results: The mean (range) tumor motion (R_SI, R_AP, R_ML, and R_3D) and breathing variability (v) were 7.6 mm (2-18 mm), 4.0 mm (2-8 mm), 3.3 mm (0-7.5 mm), 9.9 mm (4.1-18.7 mm), and 0.17 (0.07-0.37), respectively. The trend of volume variation was GTV_3D

  8. Ballooning-mirror instability and internally driven Pc 4--5 wave events

    International Nuclear Information System (INIS)

    Cheng, C.Z.; Qian, Q.; Takahashi, K.; Lui, A.T.Y.

    1994-03-01

    A kinetic-MHD field-aligned eigenmode stability analysis of low frequency ballooning-mirror instabilities has been performed for anisotropic pressure plasmas in the magnetosphere. The ballooning mode is mainly a transverse wave driven unstable by the pressure gradient in the bad curvature region. The mirror mode with a dominant compressional magnetic field perturbation is excited when the product of plasma beta and pressure anisotropy (P_perp/P_parallel > 1) is large. From the AMPTE/CCE particle and magnetic field data observed during Pc 4-5 wave events the authors compute the ballooning-mirror instability parameters and perform a correlation study with the theoretical instability threshold. They find that compressional Pc 5 waves approximately satisfy the ballooning-mirror instability condition, and transverse Pc 4-5 waves are probably related to resonant ballooning instabilities with small pressure anisotropy

  9. International Experiences and Frameworks to Support Country-Driven Low-Emissions Development

    Energy Technology Data Exchange (ETDEWEB)

    Benioff, R.; Cochran, J.; Cox, S.

    2012-08-01

    Countries can use low-emission development strategies (LEDS) to advance sustainable development, promote private-sector growth, and reduce greenhouse gas emissions. This paper proposes a framework -- or support infrastructure -- to enable the efficient exchange of LEDS-related knowledge and technical assistance. Under the proposed framework, countries share LEDS-related resources via coordinating forums, 'knowledge platforms,' and networks of experts and investors. The virtual 'knowledge platforms' foster learning by allowing countries to communicate with each other and share technical reports, data, and analysis tools in support of LEDS development. Investing in all elements of the framework in an integrated fashion increases the efficacy of support for country-driven LEDS.

  10. Preface [HD3-2015: International meeting on high-dimensional data-driven science]

    International Nuclear Information System (INIS)

    2016-01-01

    A never-ending series of innovations in measurement technology and evolutions in information and communication technologies have led to the ongoing generation and accumulation of large quantities of high-dimensional data every day. While detailed data-centric approaches have been pursued in respective research fields, situations have been encountered where the same mathematical framework of high-dimensional data analysis can be found in a wide variety of seemingly unrelated research fields, such as estimation on the basis of undersampled Fourier transform in nuclear magnetic resonance spectroscopy in chemistry, in magnetic resonance imaging in medicine, and in astronomical interferometry in astronomy. In such situations, bringing diverse viewpoints together therefore becomes a driving force for the creation of innovative developments in various different research fields. This meeting focuses on “Sparse Modeling” (SpM) as a methodology for creation of innovative developments through the incorporation of a wide variety of viewpoints in various research fields. The objective of this meeting is to offer a forum where researchers with interest in SpM can assemble and exchange information on the latest results and newly established methodologies, and discuss future directions of the interdisciplinary studies for High-Dimensional Data-Driven science (HD³). The meeting was held in Kyoto from 14–17 December 2015. We are pleased to publish 22 papers contributed by invited speakers in this volume of Journal of Physics: Conference Series. We hope that this volume will promote further development of High-Dimensional Data-Driven science. (paper)

  11. Monitoring stress among internal medicine residents: an experience-driven, practical and short measure.

    Science.gov (United States)

    Myszkowski, Nils; Villoing, Barbara; Zenasni, Franck; Jaury, Philippe; Boujut, Emilie

    2017-07-01

    Residents experience severely high levels of stress, depression and burnout, leading to perceived medical errors, as well as to symptoms of impairment, such as chronic anger, cognitive impairment, suicidal behavior and substance abuse. Because research has not yet provided a psychometrically robust population-specific tool to measure the stress level of medicine residents, we aimed to build and validate such a measure. Using an inductive scale development approach, a short, pragmatic measure was built, based on interviews of 17 medicine residents. The Internal Medicine Residency Stress Scale (IMRSS) was then administered in a sample of 259 internal medicine residents (199 females, 60 males, mean age = 25.6) along with the Hospital Anxiety and Depression Scale, Maslach Burnout Inventory, Satisfaction With Life Scale and Ways of Coping Checklist. The IMRSS showed satisfactory internal reliability (Cronbach's α = .86), adequate structural validity, studied through Confirmatory Factor Analysis (χ²/df = 2.51, CFI = .94; SRMR = .037, RMSEA = .076), and good criterion validity: the IMRSS was notably strongly correlated with emotional exhaustion (r = .64). The IMRSS is recommended to quickly and frequently assess and monitor stress among internal medicine residents.
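
    The internal reliability statistic reported in this record, Cronbach's α, can be computed directly from an item-score matrix. A minimal sketch follows; the toy responses are hypothetical, not the actual IMRSS data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy Likert-type responses, 6 respondents x 3 items (hypothetical data):
scores = np.array([
    [3, 4, 3],
    [2, 2, 1],
    [4, 5, 5],
    [3, 3, 4],
    [1, 2, 2],
    [5, 4, 5],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

    Values above roughly .8, like the .86 reported for the IMRSS, are conventionally taken as satisfactory internal reliability.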

  12. Uncertainties in the Antarctic Ice Sheet Contribution to Sea Level Rise: Exploration of Model Response to Errors in Climate Forcing, Boundary Conditions, and Internal Parameters

    Science.gov (United States)

    Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.

    2017-12-01

    The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusion about which parameters and/or forcings have the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
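
    The sampling-based UQ workflow described here (sample inputs within bounds, run a forward model ensemble, examine the spread of a diagnostic) can be sketched generically. The bounds, the toy forward model, and the variable names below are illustrative stand-ins, not the actual ISSM/DAKOTA configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical uncertainty bounds for two inputs highlighted in the abstract:
bounds = {"melt_scale": (0.5, 1.5),      # ice-shelf basal melt multiplier
          "bed_error_m": (-50.0, 50.0)}  # bedrock topography error (m)

def forward_model(melt_scale, bed_error_m):
    """Stand-in for one 100-year forward run; returns a toy mass-balance
    diagnostic (Gt). A real study would launch the ice-sheet model here."""
    return -100.0 * melt_scale + 0.2 * bed_error_m

# Monte Carlo sampling within the bounds; the spread of the diagnostic
# across the ensemble measures how much the input uncertainty matters.
samples = {k: rng.uniform(lo, hi, 1000) for k, (lo, hi) in bounds.items()}
out = forward_model(samples["melt_scale"], samples["bed_error_m"])
print(f"ensemble mean = {out.mean():.1f} Gt, 2-sigma spread = {2 * out.std():.1f} Gt")
```

    Comparing the output spread obtained when varying one input at a time is the simplest way to rank which uncertainty source dominates, which is the kind of prioritization the study reports.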

  13. How Effective Have Thirty Years of Internationally Driven Conservation and Development Efforts Been in Madagascar?

    Science.gov (United States)

    Wilmé, Lucienne; Mercier, Jean-Roger; Camara, Christian; Lowry, Porter P.

    2016-01-01

    Conservation and development are intricately linked. The international donor community has long provided aid to tropical countries in an effort to alleviate poverty and conserve biodiversity. While hundreds of millions of dollars have been invested in over 500 environmental-based projects in Madagascar during the period covered by a series of National Environmental Action Plans (1993–2008) and the protected areas network has expanded threefold, deforestation remains unchecked and none of the eight Millennium Development Goals (MDGs) established for 2000–2015 were likely to be met. Efforts to achieve sustainable development had failed to reduce poverty or deliver progress toward any of the MDGs. Cross-sectorial policy adjustments are needed that (i) enable and catalyze Madagascar’s capacities rather than deepening dependency on external actors such as the World Bank, the International Monetary Fund and donor countries, and that (ii) deliver improvements to the livelihoods and wellbeing of the country’s rural poor. PMID:27532499

  14. How Effective Have Thirty Years of Internationally Driven Conservation and Development Efforts Been in Madagascar?

    Science.gov (United States)

    Waeber, Patrick O; Wilmé, Lucienne; Mercier, Jean-Roger; Camara, Christian; Lowry, Porter P

    2016-01-01

    Conservation and development are intricately linked. The international donor community has long provided aid to tropical countries in an effort to alleviate poverty and conserve biodiversity. While hundreds of millions of dollars have been invested in over 500 environmental-based projects in Madagascar during the period covered by a series of National Environmental Action Plans (1993-2008) and the protected areas network has expanded threefold, deforestation remains unchecked and none of the eight Millennium Development Goals (MDGs) established for 2000-2015 were likely to be met. Efforts to achieve sustainable development had failed to reduce poverty or deliver progress toward any of the MDGs. Cross-sectorial policy adjustments are needed that (i) enable and catalyze Madagascar's capacities rather than deepening dependency on external actors such as the World Bank, the International Monetary Fund and donor countries, and that (ii) deliver improvements to the livelihoods and wellbeing of the country's rural poor.

  15. Customer driven marketing strategy of LIC international in Bahrain: a product specific study

    OpenAIRE

    Pillai, Rajasekharan; Rao, M S; Thampy, Jaik; Peter, Jerrin

    2011-01-01

    Abstract Marketing of service products requires a slightly different strategy owing to the idiosyncratic nature of service items. The present study explores the customer-oriented marketing strategy of LIC International in the Kingdom of Bahrain. The approach of the study was exploratory, and personal interviews were conducted as the major input source for the research. The company has been following a marketing strategy in the study area different from the conventional approach in...

  16. Quantifying Carbon Financial Risk in the International Greenhouse Gas Market: An Application Using Remotely-Sensed Data to Align Scientific Uncertainty with Financial Decisions

    Science.gov (United States)

    Hultman, N. E.

    2002-12-01

    A common complaint about environmental policy is that regulations inadequately reflect scientific uncertainty and scientific consensus. While the causes of this phenomenon are complex and hard to discern, we know that corporations are the primary implementers of environmental regulations; therefore, focusing on how policy relates scientific knowledge to corporate decisions can provide valuable insights. Within the context of the developing international market for greenhouse gas emissions, I examine how corporations would apply finance theory in their investment decisions for carbon abatement projects. Using remotely-sensed, ecosystem-scale carbon flux measurements, I show how to determine how much of the financial risk of carbon is diversifiable. I also discuss alternative, scientifically sound methods for hedging the non-diversifiable risks in carbon abatement projects. In providing a quantitative common language for scientific and corporate uncertainties, the concept of carbon financial risk provides an opportunity for expanding communication between these elements essential to successful climate policy.

  17. Qualification and application of nuclear reactor accident analysis code with the capability of internal assessment of uncertainty; Qualificacao e aplicacao de codigo de acidentes de reatores nucleares com capacidade interna de avaliacao de incerteza

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Ronaldo Celem

    2001-10-15

    This thesis presents an independent qualification of the CIAU code ('Code with the capability of Internal Assessment of Uncertainty'), which is part of the internal uncertainty evaluation process with a thermal-hydraulic system code on a realistic basis. This is done by combining the uncertainty methodology UMAE ('Uncertainty Methodology based on Accuracy Extrapolation') with the RELAP5/Mod3.2 code. This allows associating uncertainty band estimates with the results obtained by the realistic calculation of the code, meeting licensing requirements of safety analysis. The independent qualification is supported by simulations with RELAP5/Mod3.2 related to accident condition tests of the LOBI experimental facility and to an event which occurred in the Angra 1 nuclear power plant, by comparison with measured results and by establishing uncertainty bands on calculated time trends of safety parameters. These bands have indeed enveloped the measured trends. Results from this independent qualification of CIAU have made it possible to ascertain the adequate application of a systematic realistic code procedure to analyse accidents with uncertainties incorporated in the results, although there is an evident need to extend the uncertainty database. It has been verified that use of the code with this internal assessment of uncertainty is feasible in the design and licensing stages of an NPP. (author)

  18. PROCEEDINGS OF THE INTERNATIONAL WORKSHOP ON UNCERTAINTY, SENSITIVITY, AND PARAMETER ESTIMATION FOR MULTIMEDIA ENVIRONMENTAL MODELING. EPA/600/R-04/117, NUREG/CP-0187, ERDC SR-04-2.

    Science.gov (United States)

    An International Workshop on Uncertainty, Sensitivity, and Parameter Estimation for Multimedia Environmental Modeling was held August 19–21, 2003, at the U.S. Nuclear Regulatory Commission Headquarters in Rockville, Maryland, USA. The workshop was organized and convened by the Fe...

  19. Beyond the International Linear Collider Driven by FEL with Energy Recovery at 5-10TeV

    CERN Document Server

    Hajima, R

    2005-01-01

    The international linear collider (ILC) at the extreme high-energy frontier provides the best hope for scientists to probe the finest structure of matter and its origin, and perhaps even the origin of the Universe. The technology it employs is based on superconducting RF technology. This technology may usher in a new era for the development of superconducting accelerator technology. On the other hand, the gradient that is allowed in such an accelerator is limited. If one wishes something beyond this after one learns the physics at such high energies (~0.5 TeV), and utilizing such technology, one may need a new way to employ the superconducting technology in providing high-gradient compact accelerators. Inspired by a former work on 5-TeV colliders based on solid-state terawatt lasers [1], we explore 5-10 TeV linear colliders driven by free-electron lasers equipped with an energy-recovery system. A preliminary design study suggests that a 5-10 TeV collider with a luminosity of 10^34 can be realized by multi-s...

  20. The modulation of EEG variability between internally- and externally-driven cognitive states varies with maturation and task performance.

    Directory of Open Access Journals (Sweden)

    Jessie M H Szostakiwskyj

    Full Text Available Increasing evidence suggests that brain signal variability is an important measure of brain function reflecting information processing capacity and functional integrity. In this study, we examined how maturation from childhood to adulthood affects the magnitude and spatial extent of state-to-state transitions in brain signal variability, and how this relates to cognitive performance. We looked at variability changes between resting-state and task (a symbol-matching task with three levels of difficulty), and within trial (fixation, post-stimulus, and post-response). We calculated variability with multiscale entropy (MSE), and additionally examined spectral power density (SPD), from electroencephalography (EEG) in children aged 8-14 and in adults aged 18-33. Our results suggest that maturation is characterized by increased local information processing (higher MSE at fine temporal scales) and decreased long-range interactions with other neural populations (lower MSE at coarse temporal scales). Children show MSE changes that are similar in magnitude, but greater in spatial extent, when transitioning between internally- and externally-driven brain states. Additionally, we found that in children, greater changes in task difficulty were associated with greater magnitude of modulation in MSE. Our results suggest that the interplay between maturational and state-to-state changes in brain signal variability manifests across different spatial and temporal scales and influences information processing capacity in the brain.
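
    The variability measure used in this record, multiscale entropy, is sample entropy computed on successively coarse-grained copies of a signal. A minimal Costa-style sketch follows; parameter choices (m = 2, r = 0.15 × SD, scales 1-5) are common defaults, not necessarily those of the study.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that
    template sequences matching for m points (Chebyshev distance <= r)
    also match for m + 1 points. Self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    def match_count(mm):
        n = len(x) - mm + 1
        templates = np.lib.stride_tricks.sliding_window_view(x, mm)
        total = 0
        for i in range(n - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += int(np.sum(d <= r))
        return total
    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, scales=range(1, 6), m=2):
    """Coarse-grain by non-overlapping means at each scale, keeping the
    tolerance r fixed at 0.15 x SD of the original series, then compute
    sample entropy of each coarse-grained series."""
    x = np.asarray(x, dtype=float)
    r = 0.15 * x.std()
    curves = []
    for s in scales:
        n = len(x) // s
        coarse = x[: n * s].reshape(n, s).mean(axis=1)
        curves.append(sample_entropy(coarse, m, r))
    return curves

# Demo on white noise (entropy should fall with scale for uncorrelated noise)
rng = np.random.default_rng(0)
print([round(v, 2) for v in multiscale_entropy(rng.standard_normal(2000))])
```

    The shape of this curve is what distinguishes signals: uncorrelated noise loses entropy at coarse scales, while signals with long-range structure hold or gain entropy there, which is why the study reads fine-scale versus coarse-scale MSE as local versus long-range processing.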

  1. An adaptive control algorithm for optimization of intensity modulated radiotherapy considering uncertainties in beam profiles, patient set-up and internal organ motion

    International Nuclear Information System (INIS)

    Loef, Johan; Lind, Bengt K.; Brahme, Anders

    1998-01-01

    A new general beam optimization algorithm for inverse treatment planning is presented. It utilizes a new formulation of the probability to achieve complication-free tumour control. The new formulation explicitly describes the dependence of the treatment outcome on the incident fluence distribution, the patient geometry, the radiobiological properties of the patient and the fractionation schedule. In order to account for both measured and non-measured positioning uncertainties, the algorithm is based on a combination of dynamic and stochastic optimization techniques. Because of the difficulty in measuring all aspects of the intra- and interfractional variations in the patient geometry, such as internal organ displacements and deformations, these uncertainties are primarily accounted for in the treatment planning process by intensity modulation using stochastic optimization. The information about the deviations from the nominal fluence profiles and the nominal position of the patient relative to the beam that is obtained by portal imaging during treatment delivery, is used in a feedback loop to automatically adjust the profiles and the location of the patient for all subsequent treatments. Based on the treatment delivered in previous fractions, the algorithm furnishes optimal corrections for the remaining dose delivery both with regard to the fluence profile and its position relative to the patient. By dynamically refining the beam configuration from fraction to fraction, the algorithm generates an optimal sequence of treatments that very effectively reduces the influence of systematic and random set-up uncertainties to minimize and almost eliminate their overall effect on the treatment. Computer simulations have shown that the present algorithm leads to a significant increase in the probability of uncomplicated tumour control compared with the simple classical approach of adding fixed set-up margins to the internal target volume. (author)

  2. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." (Journal of Applied Statistics) The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  3. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimation of measurement uncertainty is often not trivial. Several strategies that have been developed for this purpose are briefly described in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.
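
    The standard way to estimate a combined measurement uncertainty, per the GUM, is to propagate the standard uncertainties of uncorrelated inputs as a root sum of squares. A minimal sketch; the budget components and values below are illustrative, not from the chapter.

```python
import math

def combined_standard_uncertainty(budget):
    """GUM law of propagation for uncorrelated inputs: root sum of
    squares of (sensitivity coefficient x standard uncertainty)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in budget))

# Hypothetical uncertainty budget for a concentration result in mg/L:
budget = [
    (1.0, 0.12),  # repeatability
    (1.0, 0.08),  # purity/calibration of the reference standard
    (1.0, 0.05),  # volumetric steps
]
u_c = combined_standard_uncertainty(budget)
U = 2.0 * u_c  # expanded uncertainty, coverage factor k = 2 (approx. 95%)
print(f"u_c = {u_c:.3f} mg/L, U = {U:.3f} mg/L")
```

    For compliance assessment, a result is unambiguously conforming only when the measured value plus or minus U lies entirely inside the specification limit.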

  4. Single-Bunch Instability Driven by the Electron Cloud Effect in the Positron Damping Ring of the International Linear Collider

    International Nuclear Information System (INIS)

    Pivi, Mauro; Raubenheimer, Tor O.; Ghalam, Ali; Harkay, Katherine; Ohmi, Kazuhito; Wanzenberg, Rainer; Wolski, Andrzej; Zimmermann, Frank

    2005-01-01

    Collective instabilities caused by the formation of an electron cloud (EC) are a potential limitation to the performances of the damping rings for a future linear collider. In this paper, we present recent simulation results for the electron cloud build-up in damping rings of different circumferences and discuss the single-bunch instabilities driven by the electron cloud

  5. Awe, uncertainty, and agency detection.

    Science.gov (United States)

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

    Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events.
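
    The mediational analysis this record relies on is typically quantified as the a*b indirect effect with a bootstrap confidence interval. A generic simple-mediation sketch follows on synthetic data; the variable names and coefficients are illustrative, not the study's actual measures or results.

```python
import numpy as np

def indirect_effect(x, m, y, n_boot=2000, seed=0):
    """Bootstrap CI for the a*b indirect effect in simple mediation:
    path a from x -> m (OLS slope), path b from m -> y controlling
    for x. Returns the point estimate and a 95% percentile CI."""
    rng = np.random.default_rng(seed)
    x, m, y = (np.asarray(v, dtype=float) for v in (x, m, y))
    n = len(x)

    def ab(idx):
        xi, mi, yi = x[idx], m[idx], y[idx]
        a = np.polyfit(xi, mi, 1)[0]                       # m ~ x
        design = np.column_stack([np.ones(len(idx)), mi, xi])
        b = np.linalg.lstsq(design, yi, rcond=None)[0][1]  # y ~ m + x
        return a * b

    point = ab(np.arange(n))
    boots = [ab(rng.integers(0, n, n)) for _ in range(n_boot)]
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return point, (lo, hi)

# Synthetic data with a true indirect effect of 0.8 * 0.7 = 0.56:
rng = np.random.default_rng(42)
awe = rng.standard_normal(300)
tolerance = 0.8 * awe + rng.standard_normal(300) * 0.5
belief = 0.7 * tolerance + 0.1 * awe + rng.standard_normal(300) * 0.5
est, (lo, hi) = indirect_effect(awe, tolerance, belief)
print(f"indirect effect = {est:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

    Mediation is inferred when the bootstrap CI for a*b excludes zero, which is the criterion behind the claim that tolerance for uncertainty carried the awe effects.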

  6. Outcome-driven thresholds for home blood pressure measurement: international database of home blood pressure in relation to cardiovascular outcome.

    Science.gov (United States)

    Niiranen, Teemu J; Asayama, Kei; Thijs, Lutgarde; Johansson, Jouni K; Ohkubo, Takayoshi; Kikuya, Masahiro; Boggia, José; Hozawa, Atsushi; Sandoya, Edgardo; Stergiou, George S; Tsuji, Ichiro; Jula, Antti M; Imai, Yutaka; Staessen, Jan A

    2013-01-01

    The lack of outcome-driven operational thresholds limits the clinical application of home blood pressure (BP) measurement. Our objective was to determine an outcome-driven reference frame for home BP measurement. We measured home and clinic BP in 6470 participants (mean age, 59.3 years; 56.9% women; 22.4% on antihypertensive treatment) recruited in Ohasama, Japan (n=2520); Montevideo, Uruguay (n=399); Tsurugaya, Japan (n=811); Didima, Greece (n=665); and nationwide in Finland (n=2075). In multivariable-adjusted analyses of individual subject data, we determined home BP thresholds, which yielded 10-year cardiovascular risks similar to those associated with stages 1 (120/80 mm Hg) and 2 (130/85 mm Hg) prehypertension, and stages 1 (140/90 mm Hg) and 2 (160/100 mm Hg) hypertension on clinic measurement. During 8.3 years of follow-up (median), 716 cardiovascular end points, 294 cardiovascular deaths, 393 strokes, and 336 cardiac events occurred in the whole cohort; in untreated participants these numbers were 414, 158, 225, and 194, respectively. In the whole cohort, outcome-driven systolic/diastolic thresholds for the home BP corresponding with stages 1 and 2 prehypertension and stages 1 and 2 hypertension were 121.4/77.7, 127.4/79.9, 133.4/82.2, and 145.4/86.8 mm Hg; in 5018 untreated participants, these thresholds were 118.5/76.9, 125.2/79.7, 131.9/82.4, and 145.3/87.9 mm Hg, respectively. Rounded thresholds for stages 1 and 2 prehypertension and stages 1 and 2 hypertension amounted to 120/75, 125/80, 130/85, and 145/90 mm Hg, respectively. Population-based outcome-driven thresholds for home BP are slightly lower than those currently proposed in hypertension guidelines. Our current findings could inform guidelines and help clinicians in diagnosing and managing patients.

  7. Redesign of a pilot international online course on accelerator driven systems for nuclear transmutation to implement a massive open online course

    Energy Technology Data Exchange (ETDEWEB)

    Alonso-Ramos, M.; Fernandez-Luna, A. J.; Gonzalez-Romero, E. M.; Sanchez-Elvira, A.; Castro, M.; Ogando, F.; Sanz, J.; Martin, S.

    2014-07-01

    In April 2013, a full-distance international pilot course on ADS (Accelerator Driven Systems) for advanced nuclear waste transmutation was taught by UNED-CIEMAT within the FP7 ENEN-III project. The experience ran with 10 trainees from the project, using the UNED virtual learning platform aLF. Video classes, web conferences and recorded simulations of case studies were the main learning materials. Asynchronous and synchronous communication tools were used for tutoring purposes, and a final examination for online submission and a final survey were included. (Author)

  8. Redesign of a pilot international online course on accelerator driven systems for nuclear transmutation to implement a massive open online course

    International Nuclear Information System (INIS)

    Alonso-Ramos, M.; Fernandez-Luna, A. J.; Gonzalez-Romero, E. M.; Sanchez-Elvira, A.; Castro, M.; Ogando, F.; Sanz, J.; Martin, S.

    2014-01-01

    In April 2013, a full-distance international pilot course on ADS (Accelerator Driven Systems) for advanced nuclear waste transmutation was taught by UNED-CIEMAT within the FP7 ENEN-III project. The experience ran with 10 trainees from the project, using the UNED virtual learning platform aLF. Video classes, web conferences and recorded simulations of case studies were the main learning materials. Asynchronous and synchronous communication tools were used for tutoring purposes, and a final examination for online submission and a final survey were included. (Author)

  9. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  10. Benchmarking observational uncertainties for hydrology (Invited)

    Science.gov (United States)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, to understand its information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), error characteristics measured (e.g. standard error, confidence bounds) and error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale, flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular when doing comparative/regionalisation and multi-objective analysis. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data-users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has

  11. Delphi-RAND consensus of the Spanish Society of Internal Medicine on the controversies in anticoagulant therapy and prophylaxis in medical diseases. INTROMBIN Project (Uncertainty in thromboprophylaxis in internal medicine).

    Science.gov (United States)

    Ruiz-Ruiz, F; Medrano, F J; Navarro-Puerto, M A; Rodríguez-Torres, P; Romero-Alonso, A; Santos-Lozano, J M; Alonso-Ortiz Del Rio, C; Varela-Aguilar, J M; Calderón, E J; Marín-León, I

    2018-05-21

    The aim of this study was to determine the opinion of internists on the management of anticoagulation and thromboembolism prophylaxis in complex clinical scenarios in which the risk-benefit ratio of surgery is narrow, and to develop a consensus document on the use of anticoagulant drugs in this patient group. To this end, we identified by consensus the clinical areas of greatest uncertainty, created a survey with 20 scenarios laid out in 40 clinical questions, and reviewed the specific literature. The survey was distributed among the internists of the Spanish Society of Internal Medicine (SEMI) and was completed by 290 of its members. The consensus process was implemented by adapting the Delphi-RAND appropriateness method in an anonymous, double-round process that enabled an expert panel to identify the areas of agreement and uncertainty. In our case, we also provided the survey results to the panel, a methodological innovation that helps supply additional information on standard clinical practice. The result of the process is a set of 19 recommendations formulated by SEMI experts, which helps establish guidelines for action on anticoagulant therapy in complex scenarios (high risk of or active haemorrhage, short life expectancy, coexistence of antiplatelet therapy, or comorbidities such as kidney disease and liver disease), which are not uncommon in standard clinical practice. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  12. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  13. Exploratory studies into strategies to enhance innovation-driven international competitiveness in a port context : Toward ambidextrous ports

    NARCIS (Netherlands)

    R.M.A. Hollen (Rick)

    2015-01-01

    Research has highlighted that firms competing in dynamic environments have to balance exploitative (efficiency-directed) activities with explorative (innovation-directed) ones in order to remain internationally competitive. In economically advanced countries, whose competitiveness is…

  14. Technology and Components of Accelerator-driven Systems. Second International Workshop Proceedings, Nantes, France, 21-23 May 2013

    International Nuclear Information System (INIS)

    2015-01-01

    The accelerator-driven system (ADS) is a potential transmutation system option as part of partitioning and transmutation strategies for radioactive waste in advanced nuclear fuel cycles. Following the success of the workshop series on the utilisation and reliability of High Power Proton Accelerators (HPPA), the scope of this new workshop series on Technology and Components of Accelerator-driven Systems has been extended to cover subcritical systems as well as the use of neutron sources. The workshop organised by the OECD Nuclear Energy Agency provided experts with a forum to present and discuss state-of-the-art developments in the field of ADS and neutron sources. A total of 40 papers were presented during the oral and poster sessions. Four technical sessions were organised addressing ADS experiments and test facilities, accelerators, simulation, safety, data and neutron sources, which provided an opportunity to present the status of projects such as the MYRRHA facility, the MEGAPIE target, the FREYA and GUINEVERE experiments, the KIPT neutron source, and the FAIR linac. These proceedings include all the papers presented at the workshop.

  15. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  16. Elevated Body Mass Index is Associated with Increased Integration and Reduced Cohesion of Sensory-Driven and Internally Guided Resting-State Functional Brain Networks.

    Science.gov (United States)

    Doucet, Gaelle E; Rasgon, Natalie; McEwen, Bruce S; Micali, Nadia; Frangou, Sophia

    2018-03-01

    Elevated body mass index (BMI) is associated with increased multi-morbidity and mortality. The investigation of the relationship between BMI and brain organization has the potential to provide new insights relevant to clinical and policy strategies for weight control. Here, we quantified the association between increasing BMI and the functional organization of resting-state brain networks in a sample of 496 healthy individuals that were studied as part of the Human Connectome Project. We demonstrated that higher BMI was associated with changes in the functional connectivity of the default-mode network (DMN), central executive network (CEN), sensorimotor network (SMN), visual network (VN), and their constituent modules. In siblings discordant for obesity, we showed that person-specific factors contributing to obesity are linked to reduced cohesiveness of the sensory networks (SMN and VN). We conclude that higher BMI is associated with widespread alterations in brain networks that balance sensory-driven (SMN, VN) and internally guided (DMN, CEN) states which may augment sensory-driven behavior leading to overeating and subsequent weight gain. Our results provide a neurobiological context for understanding the association between BMI and brain functional organization while accounting for familial and person-specific influences. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models....... This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands...... in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...
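    The forecasting mechanism described in this abstract (using realized demand in one destination to predict demand in a correlated, untested one) can be illustrated with jointly normal demands. The bivariate-normal assumption and the numbers are ours, for illustration, and are not necessarily the paper's specification:

    ```python
    import math

    def forecast_demand(mu1, mu2, sigma1, sigma2, rho, d1):
        """Conditional mean and standard deviation of demand in untested
        destination 2, given realized demand d1 in destination 1, when
        the two demands are jointly normal with correlation rho."""
        mean = mu2 + rho * (sigma2 / sigma1) * (d1 - mu1)
        std = sigma2 * math.sqrt(1 - rho ** 2)
        return mean, std

    # A high realization at home shifts the forecast abroad upward, and
    # the correlation shrinks the residual demand uncertainty.
    mean, std = forecast_demand(10, 10, 2, 2, 0.8, 14)
    print(round(mean, 3), round(std, 3))  # 13.2 1.2
    ```

    The value of waiting in the model comes from exactly this update: observing one destination lowers the variance faced in the next.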

  18. Proceedings of 14th international workshop on Asian network for accelerator-driven system and nuclear transmutation technology (ADS-NTT 2016)

    International Nuclear Information System (INIS)

    Pyeon, Cheol Ho

    2016-09-01

    The proceedings describe the current status of research and development (R and D) on accelerator-driven systems (ADS) and nuclear transmutation techniques (NTT), including nuclear data, accelerator techniques, Pb-Bi targets, fuel technologies and reactor physics, in East Asian countries: China, Korea and Japan. The proceedings also include all presentation materials presented at 'the 14th International Workshop on Asian Network for ADS and NTT (ADS-NTT 2016)' held at Mito, Japan on 5th September, 2016. The objective of this workshop is to make actual progress in ADS R and D, especially in East Asian countries as well as in European countries, through sharing mutual interests and exchanging information with each other. The report is composed of the following items: Presentation materials: ADS-NTT 2016. (author)

  19. Proceedings of 12th international workshop on Asian network for accelerator-driven system and nuclear transmutation technology (ADS+NTT 2014)

    International Nuclear Information System (INIS)

    Pyeon, Cheol Ho

    2015-01-01

    The proceedings describe the current status of research and development (R and D) on accelerator-driven systems (ADS) and nuclear transmutation techniques (NTT), including nuclear data, accelerator techniques, Pb-Bi targets, fuel technologies and reactor physics, in East Asian countries: China, Japan and Korea. The proceedings also include all presentation materials presented at 'the 12th International Workshop on Asian Network for ADS and NTT (ADS+NTT 2014)' held at the Institute of Nuclear Energy and Safety Technology, Chinese Academy of Sciences, Hefei, China on 15th and 16th December, 2014. The objective of this workshop is to make actual progress in ADS R and D, especially in East Asian countries as well as in European countries, through sharing mutual interests and exchanging information with each other. The report is composed of the following items: Presentation materials: ADS+NTT 2014. (author)

  20. Proceedings of 11th international workshop on Asian network for accelerator-driven system and nuclear transmutation technology (ADS+NTT 2013)

    International Nuclear Information System (INIS)

    Pyeon, Cheol Ho

    2014-01-01

    The proceedings describe the current status of research and development (R and D) on accelerator-driven systems (ADS) and nuclear transmutation techniques (NTT), including nuclear data, accelerator techniques, Pb-Bi targets, fuel technologies and reactor physics, in East Asian countries: Korea, China and Japan. The proceedings also include all presentation materials presented at 'the 11th International Workshop on Asian Network for ADS and NTT (ADS+NTT 2013)' held at Seoul National University, Seoul, Korea on 12th and 13th December, 2013. The objective of this workshop is to make actual progress in ADS R and D, especially in East Asian countries as well as in European countries, through sharing mutual interests and exchanging information with each other. The report is composed of the following items: Presentation materials: ADS+NTT 2013. (author)

  1. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
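    For a concrete instance of an entropic uncertainty relation of the kind surveyed here, the Maassen-Uffink bound for a qubit measured in two mutually unbiased bases can be checked numerically. The specific state and bases below are arbitrary choices for illustration, not taken from the paper:

    ```python
    import math

    def shannon(probs):
        """Shannon entropy in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Qubit state |psi> = cos(t)|0> + sin(t)|1>, measured in the Z basis
    # {|0>, |1>} and the X basis {|+>, |->} (mutually unbiased bases).
    t = math.pi / 8
    a, b = math.cos(t), math.sin(t)
    p_z = [a ** 2, b ** 2]
    p_plus = ((a + b) / math.sqrt(2)) ** 2
    p_x = [p_plus, 1 - p_plus]

    h_z, h_x = shannon(p_z), shannon(p_x)
    # Maassen-Uffink relation: H(Z) + H(X) >= -log2(max overlap) = 1 bit
    # for mutually unbiased bases of a qubit.
    print(h_z + h_x >= 1 - 1e-12)  # True
    ```

    The joint quantity H(Z) + H(X) is one example of the measure-dependent joint uncertainties whose common operational essence the paper sets out to formalize.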

  2. Validation and calculation of uncertainties of the method of determination of creatinine in urine in internal dosimetry; Validacion y calculo de incertidumbres del metodo de determinacion de creatinina en orina en dosimetria interna

    Energy Technology Data Exchange (ETDEWEB)

    Sierra Barcedo, I.; Hernandez Gonzalez, C.; Benito Alonso, P.; Lopez Zarza, C.

    2011-07-01

    This paper describes the methodology used to validate the spectrophotometric technique for quantifying the creatinine content of urine samples from workers exposed to a risk of internal contamination, together with a study of all the sources of uncertainty that influence the process. The technique is used to normalize the amount of urine to the 24-h urinary excretion, as needed for dosimetric purposes, and also serves as a criterion for accepting or rejecting the urine specimens received by the laboratory.

  3. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid whose study will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric models of Bennu (including Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) with OCAM and OVIRS during the Detailed Survey mission phase. The models developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present an analysis of the errors of the photometric corrections. Based on our test data sets, we find: 1. The model uncertainties are correct only when calculated with the full covariance matrix, because the parameters are highly correlated. 2. No parameter dominates the error in any model. 3. The model error and the data error contribute comparably to the final correction error. 4. We tested the uncertainty module on simulated and real data sets and found that model performance depends on data coverage and data quality; these tests gave us a better understanding of how the different models behave in different cases. 5. The Lommel-Seeliger model is more reliable than the others, perhaps because the simulated data are based on it; the test on real data (SPDIF) also shows a slight advantage for Lommel-Seeliger. ROLO is not reliable for calculating the Bond albedo, the uncertainty of the McEwen model is large in most cases, and Akimov behaves unphysically on the SOPIE 1 data. 6. Lommel-Seeliger is therefore the best default choice, a conclusion based mainly on our tests on the SOPIE data and IPDIF.
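    The finding that parameter correlations must be kept when propagating model uncertainties can be illustrated with a generic first-order propagation of a parameter covariance to a model prediction. The covariance and Jacobian below are made up for illustration and are unrelated to the actual Bennu fits:

    ```python
    import numpy as np

    def prediction_sigma(jacobian, cov):
        """1-sigma uncertainty of y = f(p) by linear (first-order)
        propagation of the parameter covariance: var(y) = J C J^T."""
        j = np.asarray(jacobian, dtype=float)
        return float(np.sqrt(j @ np.asarray(cov, dtype=float) @ j))

    # Two strongly anti-correlated parameters; keeping only the diagonal
    # (variances alone) badly overstates the uncertainty of their sum.
    cov = np.array([[0.040, -0.038],
                    [-0.038, 0.040]])
    jac = np.array([1.0, 1.0])            # dy/dp for y = p1 + p2
    full = prediction_sigma(jac, cov)
    diag_only = prediction_sigma(jac, np.diag(np.diag(cov)))
    print(f"{full:.4f} vs {diag_only:.4f}")  # 0.0632 vs 0.2828
    ```

    With positively correlated parameters the bias runs the other way, so dropping the off-diagonal terms can either overstate or understate the correction error.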

  4. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
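    The adjoint approach the study compares against statistical sampling can be sketched for a linear model Ax = b: a single solve with the transposed matrix yields the sensitivities of a response R = c^T x to every entry of b and A at once, where sampling or finite differences would need one solve per parameter. The small 2x2 system below is our own toy illustration, not one of the report's problems:

    ```python
    import numpy as np

    def adjoint_sensitivities(A, b, c):
        """For the linear model A x = b and response R = c^T x, one
        adjoint solve A^T lam = c gives every sensitivity at once:
          dR/db_i  = lam_i
          dR/dA_ij = -lam_i * x_j
        """
        x = np.linalg.solve(A, b)
        lam = np.linalg.solve(A.T, c)
        return x, lam, -np.outer(lam, x)

    A = np.array([[4.0, 1.0], [2.0, 3.0]])
    b = np.array([1.0, 2.0])
    c = np.array([1.0, 0.0])

    x, lam, dRdA = adjoint_sensitivities(A, b, c)

    # Spot-check dR/dA[0,0] against a finite difference.
    eps = 1e-6
    Ap = A.copy()
    Ap[0, 0] += eps
    fd = (c @ np.linalg.solve(Ap, b) - c @ x) / eps
    print(abs(fd - dRdA[0, 0]) < 1e-6)  # True
    ```

    This one-adjoint-solve economy is what makes the adjoint method attractive for codes with large numbers of input parameters, as the abstract notes.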

  5. But science is international! Finding time and space to encourage intercultural learning in a content-driven physiology unit.

    Science.gov (United States)

    Etherington, Sarah J

    2014-06-01

    Internationalization of the curriculum is central to the strategic direction of many modern universities and has widespread benefits for student learning. However, these clear aspirations for internationalization of the curriculum have not been widely translated into more internationalized course content and teaching methods in the classroom, particularly in scientific disciplines. This study addressed one major challenge to promoting intercultural competence among undergraduate science students: finding time to scaffold such learning within the context of content-heavy, time-poor units. Small changes to enhance global and intercultural awareness were incorporated into existing assessments and teaching activities within a second-year biomedical physiology unit. Interventions were designed to start a conversation about global and intercultural perspectives on physiology, to embed the development of global awareness into the assessment and to promote cultural exchanges through peer interactions. In student surveys, 40% of domestic and 60% of international student respondents articulated specific learning about interactions in cross-cultural groups resulting from unit activities. Many students also identified specific examples of how cultural beliefs would impact on the place of biomedical physiology within the global community. In addition, staff observed more widespread benefits for student engagement and learning. It is concluded that a significant development of intercultural awareness and a more global perspective on scientific understanding can be supported among undergraduates with relatively modest, easy to implement adaptations to course content.

  6. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine whether a standard is adequate for its intended use, or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined, or is determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing them into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Organization for Standardization (ISO) concepts for combining systematic and random uncertainties as published in its Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper.
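    The subgrouping idea, combining relative uncertainties in quadrature for multiplicative steps and absolute uncertainties for additive steps, can be sketched as follows. The weighing and dilution figures are invented for illustration and are not from the paper:

    ```python
    import math

    def combine_relative(*rel):
        """Combine independent relative standard uncertainties in
        quadrature (valid for products and quotients, e.g.
        mass = concentration * volume)."""
        return math.sqrt(sum(r * r for r in rel))

    def combine_absolute(*abs_u):
        """Combine independent absolute standard uncertainties in
        quadrature (valid for sums and differences, e.g.
        net mass = gross - tare)."""
        return math.sqrt(sum(u * u for u in abs_u))

    # Standard prepared by weighing (0.05 g on a 100 g aliquot) followed
    # by a dilution step with 0.1% relative uncertainty:
    rel_total = combine_relative(0.05 / 100, 0.001)
    print(f"{rel_total * 100:.3f}% relative standard uncertainty")
    ```

    Keeping the two subgroups separate until the end is what lets the whole propagation live in a spreadsheet without any partial derivatives.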

  7. Fast ion stabilization of the ion temperature gradient driven modes in the Joint European Torus hybrid-scenario plasmas: a trigger mechanism for internal transport barrier formation

    Energy Technology Data Exchange (ETDEWEB)

    Romanelli, M; Zocco, A [Euratom/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxon, OX14 3DB (United Kingdom); Crisanti, F, E-mail: Michele.Romanelli@ccfe.ac.u [Associazione Euratom-ENEA sulla Fusione, C.R. Frascati, Frascati (Italy)

    2010-04-15

    Understanding and modelling turbulent transport in thermonuclear fusion plasmas are crucial for designing and optimizing the operational scenarios of future fusion reactors. In this context, plasmas exhibiting state transitions, such as the formation of an internal transport barrier (ITB), are particularly interesting since they can shed light on transport physics and offer the opportunity to test different turbulence suppression models. In this paper, we focus on the modelling of ITB formation in the Joint European Torus (JET) [1] hybrid-scenario plasmas, where, due to the monotonic safety factor profile, magnetic shear stabilization cannot be invoked to explain the transition. The turbulence suppression mechanism investigated here relies on the increase in the plasma pressure gradient in the presence of a minority of energetic ions. Microstability analysis of the ion temperature gradient driven modes (ITG) in the presence of a fast-hydrogen minority shows that energetic ions accelerated by the ion cyclotron resonance heating (ICRH) system (hydrogen, n_H,fast/n_D,thermal up to 10%, T_H,fast/T_D,thermal up to 30) can increase the pressure gradient enough to stabilize the ITG modes driven by the gradient of the thermal ions (deuterium). Numerical analysis shows that, by increasing the temperature of the energetic ions, electrostatic ITG modes are gradually replaced by nearly electrostatic modes with tearing parity at progressively longer wavelengths. The growth rate of the microtearing modes is found to be lower than that of the ITG modes and comparable to the local E x B velocity shearing rate. The above mechanism is proposed as a possible trigger for the formation of ITBs in this type of discharge.

  8. Operational Flexibility Responses to Environmental Uncertainties

    OpenAIRE

    Miller, Kent D.

    1994-01-01

    This study develops and tests a behavioral model of organizational changes in operational flexibility. Regression results using an international data set provide strong support for the general proposition that uncertainties associated with different environmental components--political, government policy, macroeconomic, competitive, input and product demand uncertainties--have different implications for firm internal, locational, and supplier flexibility. Slack acts as a buffer attenuating, a...

  9. Measurement of buoyancy driven convection and microaccelerations on board International Space Station with the use of convection sensor Dacon-M

    Science.gov (United States)

    Putin, Gennady; Belyaev, Mikhail; Babushkin, Igor; Glukhov, Alexander; Zilberman, Evgeny; Maksimova, Marina; Ivanov, Alexander; Sazonov, Viktor; Nikitin, Sergey; Zavalishin, Denis; Polezhaev, Vadim

    The system for studying buoyancy-driven convection and low-frequency microaccelerations aboard spacecraft is described. The system consists of: 1. a facility for experimentation on a spaceship - the convection sensor and electronic equipment for apparatus control and for the acquisition and processing of relevant information; 2. a facility for ground-based laboratory modeling of various fluid motion mechanisms in application to the orbital flight environment; 3. a system for computer simulations of convection processes in the fluid cell of a sensor, using the data on microaccelerations obtained by accelerometers and other devices aboard the orbital station. The arrangement and functioning of the sensor and control hardware are explained. The results of terrestrial experiments performed in order to determine the sensitivity of the sensor are described. The results of experiments carried out in 2008-2011 with the “DACON-M” apparatus in different modules of the Russian Segment of the International Space Station and for various regimes of Station activity are reported. Experimental data recorded by the “DACON-M” apparatus have been compared with calculations of acceleration components based on telemetry information about the orientation of the Station.

  10. MR Imaging of the Internal Auditory Canal and Inner Ear at 3T: Comparison between 3D Driven Equilibrium and 3D Balanced Fast Field Echo Sequences

    Energy Technology Data Exchange (ETDEWEB)

    Byun, Jun Soo; Kim, Hyung Jin; Yim, Yoo Jeong; Kim, Sung Tae; Jeon, Pyoung; Kim, Keon Ha [Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of); Kim, Sam Soo; Jeon, Yong Hwan; Lee, Ji Won [Kangwon National University College of Medicine, Chuncheon (Korea, Republic of)

    2008-06-15

    To compare the use of 3D driven equilibrium (DRIVE) imaging with 3D balanced fast field echo (bFFE) imaging in the assessment of the anatomic structures of the internal auditory canal (IAC) and inner ear at 3 Tesla (T). Thirty ears of 15 subjects (7 men and 8 women; age range, 22-71 years; average age, 50 years) without evidence of ear problems were examined on a whole-body 3T MR scanner with both 3D DRIVE and 3D bFFE sequences by using an 8-channel sensitivity encoding (SENSE) head coil. Two neuroradiologists reviewed both MR images with particular attention to the visibility of the anatomic structures, including four branches of the cranial nerves within the IAC and the anatomic structures of the cochlea, vestibule, and three semicircular canals. Although both techniques provided images of relatively good quality, the 3D DRIVE sequence was somewhat superior to the 3D bFFE sequence. The discrepancies were more prominent for the basal turn of the cochlea, vestibule, and all semicircular canals, and were thought to be attributable to the greater magnetic susceptibility artifacts inherent to gradient-echo techniques such as bFFE. Because of its higher image quality and lesser susceptibility artifacts, we highly recommend 3D DRIVE imaging as the MR imaging method of choice for the IAC and inner ear.

  11. Uncertainties in Nuclear Proliferation Modeling

    International Nuclear Information System (INIS)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok

    2015-01-01

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the possibility of providing warning for the international community to prevent nuclear proliferation activities. However, there are still large debates about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling works. This paper analyzes the uncertainties in the past approaches and suggests future work in view of proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models, and fundamental problems in modeling will remain even if more advanced modeling methods are developed. Before starting to develop fancy models based on the hypothesis of time-dependent proliferation determinants, using graph theory, etc., it is important to analyze the uncertainty of current models in order to solve the fundamental problems of nuclear proliferation modeling. The uncertainty from different codings of proliferation history is small. More serious problems stem from the limited analysis methods and from correlation among the variables. Problems in regression analysis and survival analysis cause huge uncertainties even when using the same dataset, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge developed in qualitative nuclear proliferation studies.

  12. The Uncertainties of Risk Management

    DEFF Research Database (Denmark)

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged... for expanding risk management. More generally, such uncertainties relate to the professional identities and responsibilities of operational managers as defined by the framing devices. Originality/value – The paper offers three contributions to the extant literature: first, it shows how risk management itself...

  13. Internal Light Source-Driven Photoelectrochemical 3D-rGO/Cellulose Device Based on Cascade DNA Amplification Strategy Integrating Target Analog Chain and DNA Mimic Enzyme.

    Science.gov (United States)

    Lan, Feifei; Liang, Linlin; Zhang, Yan; Li, Li; Ren, Na; Yan, Mei; Ge, Shenguang; Yu, Jinghua

    2017-11-01

    In this work, a chemiluminescence-driven collapsible greeting card-like photoelectrochemical lab-on-paper device (GPECD) with a hollow channel was demonstrated, in which a target-triggered cascade DNA amplification strategy was ingeniously introduced. The GPECD had the functions of reagent storage and signal collection, and changes of its configuration could control the fluidic path, the reaction time and alterations in electrical connectivity. In addition, three-dimensional reduced graphene oxide decorated with Au flowers was grown in situ on the paper cellulose fibers to achieve excellent conductivity and biocompatibility. The cascade DNA amplification strategy refers to the cyclic formation of the target analog chain and its triggering of the hybridization chain reaction (HCR), leading to the formation of numerous hemin/G-quadruplex DNA mimic enzymes in the presence of hemin. Subjected to the catalysis of hemin/G-quadruplex, strong chemiluminescence of the luminol-H2O2 system was obtained, which was then used as an internal light source to excite the photoactive materials, realizing a simplification of the instrument. In this analysis process, thrombin served as the proof-of-concept target, and the concentration of the target was converted into a DNA signal output through the specific aptamer-protein recognition and target analog chain recycling. The target analog chain was produced in quantity in the presence of the target, which further triggered abundant HCR and introduced hemin/G-quadruplex into the system. The photocurrent signal was obtained after the nitrogen-doped carbon dot-sensitized ZnO was stimulated by the chemiluminescence. The proposed GPECD exhibited excellent specificity and sensitivity toward thrombin, with a detection limit of 16.7 fM. This judiciously engineered GPECD paves a luciferous way for detecting other proteins present in trace amounts in bioanalysis and clinical biomedicine.

  14. A Guideline for Successful Calibration and Uncertainty Analysis for Soil and Water Assessment: A Review of Papers from the 2016 International SWAT Conference

    Directory of Open Access Journals (Sweden)

    Karim C. Abbaspour

    2017-12-01

    Application of integrated hydrological models to manage a watershed's water resources is increasingly finding its way into decision-making processes. The Soil and Water Assessment Tool (SWAT) is a multi-process model integrating hydrology, ecology, agriculture, and water quality. SWAT is a continuation of nearly 40 years of modeling efforts conducted by the United States Department of Agriculture (USDA) Agricultural Research Service (ARS). A large number of SWAT-related papers have appeared in ISI journals, building a world-wide consensus around the model's stability and usefulness. The current issue is a collection of the latest research using SWAT as the modeling tool. Most models must undergo calibration/validation and uncertainty analysis. Unfortunately, these subjects are not formally taught in most universities, and students are often left to their own resources to calibrate their models. In this paper, we focus on calibration and uncertainty analysis, highlighting some serious issues in the calibration of distributed models. A protocol for calibration is also presented to guide users toward better modeling results. Finally, a summary of the papers published in this special issue is provided in the Appendix.
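    A very common objective function in the kind of SWAT calibration protocol discussed here is the Nash-Sutcliffe efficiency. A minimal sketch, with invented observed and simulated streamflow values purely for illustration:

    ```python
    def nash_sutcliffe(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the
        model predicts no better than the mean of the observations,
        and negative values mean worse than the mean."""
        mean_obs = sum(observed) / len(observed)
        num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
        den = sum((o - mean_obs) ** 2 for o in observed)
        return 1 - num / den

    obs = [3.0, 5.0, 9.0, 6.0, 4.0]   # invented daily flows
    sim = [2.8, 5.4, 8.5, 6.3, 4.1]
    print(round(nash_sutcliffe(obs, sim), 3))  # 0.974
    ```

    In a calibration loop, parameter sets are sampled, the model is run for each, and sets are retained or rejected according to a threshold on an objective such as this one.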

  15. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  16. Chapter 3: Traceability and uncertainty

    International Nuclear Information System (INIS)

    McEwen, Malcolm

    2014-01-01

    Chapter 3 presents: an introduction; Traceability (measurement standards, the role of the Bureau International des Poids et Mesures, secondary standards laboratories, documentary standards, and traceability as process review); Uncertainty (Example 1 - measurement, M_raw (SSD); Example 2 - calibration data, N_D,w (60Co), k_Q; Example 3 - correction factor, P_TP); and a conclusion.

  17. International training program: 3D S.UN.COP - Scaling, uncertainty and 3D thermal-hydraulics/neutron-kinetics coupled codes seminar

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Bajs, T.; Reventos, F.

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for the analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts using them. Code user training and qualification is an effective means of reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best-estimate codes. In other words, the program aims at contributing towards solving the problem of the user effect. The 3D S.UN.COP 2005 (Scaling, Uncertainty and 3D COuPled code calculations) seminar was organized by the University of Pisa and the University of Zagreb as a follow-up to the proposal to the IAEA for a Permanent Training Course for System Code Users (D'Auria, 1998). It was recognized that such a course represents both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The seminar-training was successfully held with the participation of 19 persons coming from 9 countries and 14 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 15 scientists were involved in the organization of the seminar, presenting theoretical aspects of the proposed methodologies and holding the training and the final examination. A certificate (LA Code User grade) was released.

  18. Uncertainty analysis comes to integrated assessment models for climate change…and conversely

    NARCIS (Netherlands)

    Cooke, R.M.

    2012-01-01

    This article traces the development of uncertainty analysis through three generations punctuated by large methodology investments in the nuclear sector. Driven by a very high perceived legitimation burden, these investments aimed at strengthening the scientific basis of uncertainty quantification.

  19. Decay heat uncertainty quantification of MYRRHA

    Directory of Open Access Journals (Sweden)

    Fiorito Luca

    2017-01-01

    MYRRHA is a lead-bismuth cooled, MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the propagation of nuclear data uncertainties and covariances to the MYRRHA decay heat. Radioactive decay data, independent fission yield and cross section uncertainties/covariances were propagated using two nuclear data sampling codes, namely NUDUNA and SANDY. According to the results, 238U cross sections and fission yield data are the largest contributors to the MYRRHA decay heat uncertainty. The calculated uncertainty values are deemed acceptable from the safety point of view as they are well within the available regulatory limits.
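The sampling-based propagation described in this record can be illustrated with a minimal sketch: draw perturbed nuclear-data sets from a covariance matrix, run each through the response model, and take statistics of the outputs. The response function, nominal values and covariances below are invented toy stand-ins, not MYRRHA data or the actual NUDUNA/SANDY workflow.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for a decay-heat calculation: the response is a linear
# combination of three uncertain nuclear-data inputs (a cross section,
# a fission yield, a decay constant). All values are invented.
nominal = np.array([1.0, 0.06, 0.2])       # nominal input values
rel_unc = np.array([0.02, 0.05, 0.01])     # relative standard deviations
cov = np.diag((rel_unc * nominal) ** 2)    # absolute covariance (diagonal)

def decay_heat(x):
    # Placeholder response surface standing in for the burnup code.
    return 50.0 * x[0] + 300.0 * x[1] + 10.0 * x[2]

# Draw perturbed nuclear-data sets and propagate each one through the model,
# mimicking what sampling codes such as NUDUNA and SANDY do with real libraries.
samples = rng.multivariate_normal(nominal, cov, size=5000)
heats = np.array([decay_heat(s) for s in samples])

print(f"decay heat = {heats.mean():.2f} +/- {heats.std(ddof=1):.2f} (arbitrary units)")
```

With real libraries the covariance matrix is far from diagonal, which is exactly why sampling codes consume evaluated covariance files rather than independent per-parameter spreads.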

  20. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy, or bias, and precision, or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report, and recommendations are made for improving measurement uncertainties.
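The standard way to combine independent error components into a single uncertainty estimate is a root-sum-square (quadrature) combination, as in the GUM approach. The component names and values in this sketch are hypothetical, not taken from the paper.

```python
import math

# Root-sum-square combination of independent standard uncertainties
# (GUM-style). Component names and values are hypothetical.
components = {
    "calibration standard": 0.10,   # standard uncertainty, measurand units
    "repeatability":        0.25,
    "matrix effect":        0.15,
}

u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2.0 * u_combined       # coverage factor k = 2 (~95 % confidence)

print(f"combined standard uncertainty u_c = {u_combined:.3f}")
print(f"expanded uncertainty U (k=2) = {U_expanded:.3f}")
```

Note how the largest component dominates the quadrature sum: reducing a minor component barely changes the combined uncertainty, which is why measurement control programs target the dominant error source first.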

  1. International training program in support of safety analysis. 3D S.UN.COP-scaling uncertainty and 3D thermal-hydraulics/neutron-kinetics coupled codes seminars

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Bajs, Tomislav; Reventos, Francesc; Hassan, Yassin

    2007-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up to the proposal to the IAEA for the Permanent Training Course for System Code Users. Six seminars have been held at the University of Pisa (2003, 2004), at The Pennsylvania State University (2004), at the University of Zagreb (2005), at the School of Industrial Engineering of Barcelona (January-February 2006) and in Buenos Aires, Argentina (October 2006), the last of which was requested by ARN (Autoridad Regulatoria Nuclear), NA-SA (Nucleoelectrica Argentina S.A) and CNEA (Comision Nacional de Energia Atomica). It was recognized that such courses represented both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2006 in Barcelona was successfully held with the attendance of 33

  2. International Training Program in Support of Safety Analysis: 3D S.UN.COP-Scaling, Uncertainty and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes Seminars

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Bajs, Tomislav; Reventos, Francesc

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up to the proposal to the IAEA for the Permanent Training Course for System Code Users [1]. Five seminars have been held at the University of Pisa (2003, 2004), at The Pennsylvania State University (2004), at the University of Zagreb (2005) and at the School of Industrial Engineering of Barcelona (2006). It was recognized that such courses represented both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2006 was successfully held with the attendance of 33 participants coming from 18 countries and 28 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 30 scientists (coming from 13 countries and 23 different institutions) were

  3. International Training Program: 3D S. Un. Cop - Scaling, Uncertainty and 3D Thermal-Hydraulics/Neutron-Kinetics Coupled Codes Seminar

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Bajs, T.; Reventos, F.

    2006-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers and vendors, nuclear fuel companies, research organizations, consulting companies, and technical support organizations. The computer code user represents a source of uncertainty that can influence the results of system code calculations. This influence is commonly known as the 'user effect' and stems from the limitations embedded in the codes as well as from the limited capability of the analysts to use the codes. Code user training and qualification is an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In other words, the program aims at contributing towards solving the problem of user effect. The 3D S.UN.COP (Scaling, Uncertainty and 3D COuPled code calculations) seminars have been organized as a follow-up to the proposal to the IAEA for the Permanent Training Course for System Code Users (D'Auria, 1998). Four seminars have been held at the University of Pisa (2003, 2004), at The Pennsylvania State University (2004) and at the University of Zagreb (2005). It was recognized that such courses represented both a source of continuing education for current code users and a means for current code users to enter the formal training structure of a proposed 'permanent' stepwise approach to user training. The 3D S.UN.COP 2005 was successfully held with the participation of 19 persons coming from 9 countries and 14 different institutions (universities, vendors, national laboratories and regulatory bodies). More than 15 scientists were involved in the organization of the seminar, presenting theoretical aspects of the proposed methodologies and

  4. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  5. International benchmark for coupled codes and uncertainty analysis in modelling: switching-Off of one of the four operating main circulation pumps at nominal reactor power at NPP Kalinin unit 3

    International Nuclear Information System (INIS)

    Tereshonok, V. A.; Nikonov, S. P.; Lizorkin, M. P.; Velkov, K; Pautz, A.; Ivanov, V.

    2008-01-01

    The paper briefly describes the specification of an international NEA/OECD benchmark based on measured plant data. During the commissioning tests for nominal power at NPP Kalinin Unit 3, numerous measurements of neutronic and thermal-hydraulic parameters were carried out in the reactor pressure vessel and in the primary and secondary circuits. One of the measured data sets for the transient 'Switching-off of one Main Circulation Pump (MCP) at nominal power' has been chosen for validation of coupled thermal-hydraulic and neutron-kinetic system codes and, additionally, for performing uncertainty analyses as a part of the NEA/OECD Uncertainty Analysis in Modeling Benchmark. The benchmark is open to all countries and institutions. The experimental data and the final specification with the cross section libraries will be provided to the participants by NEA/OECD only after official declaration of real participation in the benchmark and delivery of the simulated results of the transient for comparison. (Author)

  6. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    2007-01-01

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems

  7. Uncertainty in Twenty-First-Century CMIP5 Sea Level Projections

    Science.gov (United States)

    Little, Christopher M.; Horton, Radley M.; Kopp, Robert E.; Oppenheimer, Michael; Yip, Stan

    2015-01-01

    The representative concentration pathway (RCP) simulations included in phase 5 of the Coupled Model Intercomparison Project (CMIP5) quantify the response of the climate system to different natural and anthropogenic forcing scenarios. These simulations differ because of 1) forcing, 2) the representation of the climate system in atmosphere-ocean general circulation models (AOGCMs), and 3) the presence of unforced (internal) variability. Global and local sea level rise projections derived from these simulations, and the emergence of distinct responses to the four RCPs depend on the relative magnitude of these sources of uncertainty at different lead times. Here, the uncertainty in CMIP5 projections of sea level is partitioned at global and local scales, using a 164-member ensemble of twenty-first-century simulations. Local projections at New York City (NYSL) are highlighted. The partition between model uncertainty, scenario uncertainty, and internal variability in global mean sea level (GMSL) is qualitatively consistent with that of surface air temperature, with model uncertainty dominant for most of the twenty-first century. Locally, model uncertainty is dominant through 2100, with maxima in the North Atlantic and the Arctic Ocean. The model spread is driven largely by 4 of the 16 AOGCMs in the ensemble; these models exhibit outlying behavior in all RCPs and in both GMSL and NYSL. The magnitude of internal variability varies widely by location and across models, leading to differences of several decades in the local emergence of RCPs. The AOGCM spread, and its sensitivity to model exclusion and/or weighting, has important implications for sea level assessments, especially if a local risk management approach is utilized.
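The partition of projection spread into model uncertainty, scenario uncertainty and internal variability can be sketched on a synthetic ensemble as follows. The ensemble dimensions and magnitudes are invented for illustration; the actual CMIP5 analysis is considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sea level rise ensemble at one lead time:
# 16 models x 4 scenarios x 10 realizations (all magnitudes invented).
n_mod, n_scen, n_real = 16, 4, 10
model_offset = rng.normal(0.0, 0.10, size=(n_mod, 1, 1))          # model spread
scenario_mean = np.array([0.30, 0.45, 0.60, 0.80]).reshape(1, n_scen, 1)
internal = rng.normal(0.0, 0.03, size=(n_mod, n_scen, n_real))    # unforced noise
slr = scenario_mean + model_offset + internal

# Simple additive variance partition in the spirit of lead-time uncertainty budgets:
var_internal = slr.var(axis=2, ddof=1).mean()               # within model/scenario
var_model = slr.mean(axis=2).var(axis=0, ddof=1).mean()     # across models
var_scenario = slr.mean(axis=2).mean(axis=0).var(ddof=1)    # across scenarios

total = var_internal + var_model + var_scenario
for name, v in [("model", var_model), ("scenario", var_scenario),
                ("internal", var_internal)]:
    print(f"{name:9s}: {100.0 * v / total:5.1f} % of partitioned variance")
```

Because the synthetic offsets are additive and independent, the three components recover the generating spreads; in real ensembles the components interact and the partition depends on lead time and location, as the abstract emphasizes.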

  8. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    Science.gov (United States)

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory, 'recognizing and responding to uncertainty', characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and the influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  9. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty

  10. Managing project risks and uncertainties

    Directory of Open Access Journals (Sweden)

    Mike Mentis

    2015-01-01

    This article considers threats to a project slipping on budget, schedule and fit-for-purpose. Threat is used here as the collective for risks (quantifiable bad things that can happen) and uncertainties (poorly or not quantifiable bad possible events). Based on experience with projects in developing countries, this review considers that (a) project slippage is due to uncertainties rather than risks, (b) while the eventuation of some bad things is beyond control, managed execution and oversight are still the primary means of keeping within budget, on time and fit-for-purpose, (c) improving project delivery is less about bigger and more complex and more about coordinated focus, effectiveness and developing thought-out heuristics, and (d) projects take longer and cost more partly because threat identification is inaccurate, the scope of identified threats is too narrow, and the threat assessment product is not integrated into overall project decision-making and execution. Almost by definition, what is poorly known is likely to cause problems. Yet it is not just the unquantifiability and intangibility of uncertainties causing project slippage, but that they are insufficiently taken into account in project planning and execution that causes budget and time overruns. Improving project performance requires purpose-driven and managed deployment of scarce seasoned professionals. This can be aided with independent oversight by deeply experienced panelists who contribute technical insights and can potentially show that diligence is seen to be done.

  11. Meteorological and small scale internal ecosystem variability characterize the uncertainty of ecosystem level responses to elevated CO2. Insights from the Duke Forest FACE experiment

    Science.gov (United States)

    Paschalis, A.; Katul, G. G.; Fatichi, S.; Palmroth, S.; Way, D.

    2017-12-01

    One of the open questions in climate change research is the pathway by which elevated atmospheric CO2 concentration impacts the biogeochemical and hydrological cycles at the ecosystem scale. This impact leads to significant changes in long-term carbon stocks and the potential of ecosystems to sequester CO2, partially mitigating anthropogenic emissions. While the significance of elevated atmospheric CO2 concentration on instantaneous leaf-level processes such as photosynthesis and transpiration is rarely disputed, its integrated effect at the ecosystem level and at long time scales remains a subject of debate. This debate has taken on some urgency, as illustrated by differences arising between ecosystem modelling studies and data-model comparisons using Free Air CO2 Enrichment (FACE) sites around the world. Inherent leaf-to-leaf variability in gas exchange rates can generate such inconsistencies. This inherent variability arises from the combined effect of meteorological "temporal" variability and the "spatial" variability of the biochemical parameters regulating vegetation carbon uptake. This combined variability leads to a non-straightforward scaling of ecosystem fluxes from the leaf to ecosystems. To illustrate this scaling behaviour, we used 10 years of leaf gas exchange measurements collected at the Duke Forest FACE experiment. The internal variability of the ecosystem parameters is first quantified and then combined with three different leaf-scale stomatal conductance models and an ecosystem model. The main results are: (a) variability of the leaf level fluxes is dependent on both the meteorological drivers and differences in leaf age, position within the canopy, nitrogen and CO2 fertilization, which can be accommodated in model parameters; (b) meteorological variability plays the dominant role at short temporal scales while parameter variability is significant at longer temporal scales; (c) leaf level results do not necessarily translate to similar ecosystem

  12. Drivers And Uncertainties Of Increasing Global Water Scarcity

    Science.gov (United States)

    Scherer, L.; Pfister, S.

    2015-12-01

    Water scarcity threatens ecosystems and human health and hampers economic development. It generally depends on the ratio of water consumption to availability. We calculated global, spatially explicit water stress indices (WSIs), which describe the vulnerability to additional water consumption on a scale from 0 (low) to 1 (high), and compared them for the decades 1981-1990 and 2001-2010. Input data are obtained from a multi-model ensemble at a resolution of 0.5 degrees. The variability among the models was used to run 1000 Monte Carlo simulations (Latin hypercube sampling) and to subsequently estimate the uncertainties of the WSIs. Globally, a trend of increasing water scarcity can be observed; however, the uncertainties are large. The probability that this trend is actually occurring is as low as 53%. The increase in WSIs is driven by higher water use rather than by lower water availability: water availability is only 40% likely to decrease, whereas water consumption is 67% likely to increase. Independent of the trend, we are already living under water-scarce conditions, which is reflected in a consumption-weighted average of monthly WSIs of 0.51 in the recent decade. Its coefficient of variation of 0.8 points to the high uncertainties involved, which might still hide poor model performance where all models consistently over- or underestimate water availability or use. Especially in arid areas, models generally overestimate availability. Although we do not traverse the planetary boundary of freshwater use, as global water availability is sufficient, local water scarcity might be high. Therefore the regionalized assessment of WSIs under uncertainty helps to focus on specific regions to optimise water consumption. These global results can also help to raise awareness of water scarcity and to suggest relevant measures, such as more water-efficient technologies, to international companies, which have to deal with complex and distributed supply chains (e.g. in food production).
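The combination of a consumption-to-availability ratio with Latin hypercube Monte Carlo sampling can be sketched as below. The lognormal spreads for one grid cell and the logistic mapping onto a 0-1 index are illustrative assumptions, not the authors' exact WSI definition or ensemble data.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)
inv_cdf = np.vectorize(NormalDist().inv_cdf)  # standard normal quantile function

def latin_hypercube(n, dims, rng):
    # One stratified uniform draw per equal-probability bin, per dimension,
    # with each column independently shuffled.
    u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n
    for d in range(dims):
        rng.shuffle(u[:, d])
    return u

# Hypothetical lognormal spreads, for one grid cell, of annual water
# consumption and availability across a model ensemble (values invented).
n = 1000
u = latin_hypercube(n, 2, rng)
consumption = np.exp(np.log(4.0) + 0.30 * inv_cdf(u[:, 0]))
availability = np.exp(np.log(10.0) + 0.40 * inv_cdf(u[:, 1]))

# Map the consumption-to-availability ratio onto a 0-1 stress index with a
# logistic curve; the curve parameters here are illustrative only.
ratio = consumption / availability
wsi = 1.0 / (1.0 + np.exp(-6.4 * ratio + 2.72))

print(f"median WSI = {np.median(wsi):.2f}, P(WSI > 0.5) = {(wsi > 0.5).mean():.2f}")
```

Latin hypercube sampling stratifies each input marginal, so far fewer samples are needed than with plain Monte Carlo to cover the input space evenly, which matters when each "sample" is a global hydrological model run.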

  13. International

    International Nuclear Information System (INIS)

    Anon.

    1997-01-01

    This rubric reports on 10 short notes about international economical facts about nuclear power: Electricite de France (EdF) and its assistance and management contracts with Eastern Europe countries (Poland, Hungary, Bulgaria); Transnuclear Inc. company (a 100% Cogema daughter company) acquired the US Vectra Technologies company; the construction of the Khumo nuclear power plant in Northern Korea plays in favour of the reconciliation between Northern and Southern Korea; the delivery of two VVER 1000 Russian reactors to China; the enforcement of the cooperation agreement between Euratom and Argentina; Japan requested for the financing of a Russian fast breeder reactor; Russia has planned to sell a floating barge-type nuclear power plant to Indonesia; the control of the Swedish reactor vessels of Sydkraft AB company committed to Tractebel (Belgium); the renewal of the nuclear cooperation agreement between Swiss and USA; the call for bids from the Turkish TEAS electric power company for the building of the Akkuyu nuclear power plant answered by three candidates: Atomic Energy of Canada Limited (AECL), Westinghouse (US) and the French-German NPI company. (J.S.)

  14. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
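The contrast the guide draws between series approximations and Monte Carlo propagation of input uncertainty can be illustrated on a toy model; the model and input distributions below are invented for the comparison.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model y = x1^2 + 3*x2 with independent, normally distributed inputs.
# Compare a first-order series approximation of the output uncertainty
# with brute-force Monte Carlo propagation.
def f(x1, x2):
    return x1 ** 2 + 3.0 * x2

mu = np.array([2.0, 1.0])       # input means
sigma = np.array([0.1, 0.2])    # input standard deviations

# First-order propagation: Var(y) ~ (df/dx1)^2 s1^2 + (df/dx2)^2 s2^2,
# with the derivatives evaluated at the input means.
dfdx = np.array([2.0 * mu[0], 3.0])
std_series = np.sqrt(np.sum((dfdx * sigma) ** 2))

# Monte Carlo propagation of the same input uncertainties.
x = rng.normal(mu, sigma, size=(100_000, 2))
std_mc = f(x[:, 0], x[:, 1]).std(ddof=1)

print(f"series std = {std_series:.3f}, Monte Carlo std = {std_mc:.3f}")
```

For this mildly nonlinear model the two estimates agree closely; the gap grows as input spreads widen relative to the model's curvature, which is when Monte Carlo (or exact calculation, where available) becomes the safer choice.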

  15. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  16. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    A growing body of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  17. Decay heat uncertainty quantification of MYRRHA

    OpenAIRE

    Fiorito Luca; Buss Oliver; Hoefer Axel; Stankovskiy Alexey; Eynde Gert Van den

    2017-01-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay hea...

  18. Uncertainties in radioecological assessment models

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables
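The stochastic procedure the abstract recommends, translating uncertain parameter estimates into a distribution of predicted values and ranking parameters by their relative contribution, can be sketched as follows. The multiplicative dose-like model, parameter names and lognormal spreads are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample uncertain transfer parameters, build a distribution of a dose-like
# prediction, then rank parameters by rank (Spearman) correlation with the
# output. Model structure and spreads are hypothetical.
n = 2000
params = {
    "transfer_factor": rng.lognormal(mean=0.0, sigma=0.5, size=n),
    "intake_rate": rng.lognormal(mean=0.0, sigma=0.2, size=n),
    "dose_coefficient": rng.lognormal(mean=0.0, sigma=0.1, size=n),
}
dose = (params["transfer_factor"] * params["intake_rate"]
        * params["dose_coefficient"])

def spearman(x, y):
    # Rank correlation via double argsort (valid for samples without ties).
    rx, ry = x.argsort().argsort(), y.argsort().argsort()
    return np.corrcoef(rx, ry)[0, 1]

ranking = sorted(((spearman(v, dose), k) for k, v in params.items()),
                 reverse=True)
for rho, name in ranking:
    print(f"{name:16s} rho = {rho:.2f}")
```

As expected, the parameter with the widest spread dominates the output uncertainty, which is the kind of ranking that directs effort toward site-specific estimation of the most influential parameters.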

  19. Summary of existing uncertainty methods

    International Nuclear Information System (INIS)

    Glaeser, Horst

    2013-01-01

    A summary of the existing and most widely used uncertainty methods is presented, and their main features are compared. One of these methods is the order statistics method based on Wilks' formula. It is applied in safety research as well as in licensing. This method was first proposed by GRS for use in deterministic safety analysis, and is now used by many organisations world-wide. Its advantage is that the number of potentially uncertain input and output parameters is not limited to a small number. Such a limitation was necessary for the first demonstration of the Code Scaling, Applicability and Uncertainty (CSAU) method by the United States Nuclear Regulatory Commission (USNRC). They did not apply Wilks' formula in their statistical method propagating input uncertainties to obtain the uncertainty of a single output variable, such as peak cladding temperature. A Phenomena Identification and Ranking Table (PIRT) was set up in order to limit the number of uncertain input parameters and, consequently, the number of calculations to be performed. Another purpose of such a PIRT process is to identify the most important physical phenomena which a computer code should be able to calculate. The validation of the code should be focused on the identified phenomena. Response surfaces are used in some applications, replacing the computer code for performing a high number of calculations. The second well-known uncertainty method is the Uncertainty Methodology Based on Accuracy Extrapolation (UMAE) and its follow-up, the Code with the Capability of Internal Assessment of Uncertainty (CIAU), developed by the University of Pisa. Unlike the statistical approaches, the CIAU compares experimental data with calculation results. It does not consider uncertain input parameters, and is therefore highly dependent on the experimental database. The accuracy gained from the comparison between experimental data and calculated results is extrapolated to obtain the uncertainty of the system code predictions
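The order-statistics result behind Wilks' formula fixes the number of code runs independently of the number of uncertain inputs: for a one-sided 95%/95% tolerance limit, the smallest n with 1 − 0.95ⁿ ≥ 0.95 is 59. A small illustrative sketch (not any organisation's implementation):

```python
def wilks_n_one_sided(coverage=0.95, confidence=0.95):
    # Smallest n with 1 - coverage**n >= confidence: the largest of n
    # sampled outputs then bounds the coverage-quantile with the stated
    # confidence (first-order, one-sided Wilks formula).
    n = 1
    while 1.0 - coverage**n < confidence:
        n += 1
    return n

def wilks_n_two_sided(coverage=0.95, confidence=0.95):
    # Two-sided first-order case: smallest n with
    # 1 - g**n - n*(1-g)*g**(n-1) >= confidence.
    g = coverage
    n = 2
    while 1.0 - g**n - n * (1.0 - g) * g ** (n - 1) < confidence:
        n += 1
    return n

print(wilks_n_one_sided())  # 59 code runs for a 95%/95% one-sided limit
print(wilks_n_two_sided())  # 93 code runs for the two-sided case
```

Because only n enters the formula, adding further uncertain inputs does not increase the required number of calculations, which is the advantage noted above.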

  20. Background and Qualification of Uncertainty Methods

    International Nuclear Information System (INIS)

    D'Auria, F.; Petruzzi, A.

    2008-01-01

    The evaluation of uncertainty constitutes the necessary supplement to Best Estimate calculations performed to understand accident scenarios in water cooled nuclear reactors. The need arises from the imperfection of computational tools on the one hand and, on the other, from the interest in using such tools to obtain more precise evaluations of safety margins. The paper reviews the salient features of two independent approaches for estimating uncertainties associated with predictions of complex system codes. Namely, the propagation of code input error and the propagation of the calculation output error constitute the keywords for identifying the methods of current interest for industrial applications. With the developed methods, uncertainty bands can be derived (both upper and lower) for any desired quantity of the transient of interest. In the second case, the uncertainty method is coupled with the thermal-hydraulic code to obtain the Code with capability of Internal Assessment of Uncertainty, whose features are discussed in more detail.

  1. Uncertainty analysis of a low flow model for the Rhine River

    NARCIS (Netherlands)

    Demirel, M.C.; Booij, Martijn J.

    2011-01-01

    It is widely recognized that hydrological models are subject to parameter uncertainty. However, little attention has been paid so far to the uncertainty in parameters of the data-driven models like weights in neural networks. This study aims at applying a structured uncertainty analysis to a

  2. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  3. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  4. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.
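As a concrete illustration of one of the contested calculi, Dempster's rule of combination fuses two belief-mass assignments over subsets of a hypothesis set. The two sensors and their masses below are invented purely for illustration:

```python
from itertools import product

def dempster_combine(m1, m2):
    # Dempster's rule: multiply masses, intersect focal sets, and
    # renormalize by 1 - K, where K is the mass landing on conflict
    # (empty intersections).
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two hypothetical sensors assigning belief mass over {rain, sun};
# mass on the full set {rain, sun} represents "don't know".
m1 = {frozenset({"rain"}): 0.6, frozenset({"rain", "sun"}): 0.4}
m2 = {frozenset({"rain"}): 0.7, frozenset({"rain", "sun"}): 0.3}
fused = dempster_combine(m1, m2)
print(fused[frozenset({"rain"})])
```

Unlike a single probability, the combined structure keeps a separate mass for ignorance, which is one of the points of contention the volume discusses.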

  5. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  6. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
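The Monte Carlo approach described can be sketched as follows: perturb the flow record with an assumed rating-curve error, recompute the signature for each realization, and read off percentile bounds. The synthetic data, the 10% multiplicative error, and the choice of a Q95 low-flow signature below are illustrative assumptions, not values from the study:

```python
import math
import random

random.seed(1)

# Synthetic daily flow record (m3/s); a real study would use observations.
flows = [5.0 + 4.0 * abs(math.sin(0.017 * d)) for d in range(365)]

def q95_low_flow(series):
    # Signature: the flow exceeded 95% of the time (5th percentile),
    # a common low-flow index.
    s = sorted(series)
    return s[int(0.05 * len(s))]

# Monte Carlo: perturb each flow value with a multiplicative rating-curve
# error (+/-10% at one standard deviation, an illustrative assumption).
samples = []
for _ in range(2000):
    perturbed = [q * random.gauss(1.0, 0.10) for q in flows]
    samples.append(q95_low_flow(perturbed))
samples.sort()
lo, hi = samples[int(0.025 * len(samples))], samples[int(0.975 * len(samples))]
print(f"Q95 = {q95_low_flow(flows):.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```

The same loop applies to any signature; only the function computed on each perturbed series changes.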

  7. Uncertainty in social dilemmas

    NARCIS (Netherlands)

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size

  8. Uncertainty and Climate Change

    OpenAIRE

    Berliner, L. Mark

    2003-01-01

    Anthropogenic, or human-induced, climate change is a critical issue in science and in the affairs of humankind. Though the target of substantial research, the conclusions of climate change studies remain subject to numerous uncertainties. This article presents a very brief review of the basic arguments regarding anthropogenic climate change with particular emphasis on uncertainty.

  9. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig
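One simple deterministic scheme of the kind alluded to is first-order sensitivity propagation, which needs only a pair of model runs per input instead of a large random sample. The toy response function below is hypothetical, not the report's method:

```python
import math

def deterministic_uncertainty(f, x, sigmas, h=1e-6):
    # First-order propagation: sigma_y^2 = sum_i (df/dx_i)^2 * sigma_i^2,
    # with derivatives from central finite differences -- two runs per
    # input rather than thousands of random samples.
    y0 = f(x)
    var = 0.0
    for i, si in enumerate(sigmas):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        dfdx = (f(xp) - f(xm)) / (2.0 * h)
        var += (dfdx * si) ** 2
    return y0, math.sqrt(var)

# Hypothetical repository response: release rate ~ k * c / d (toy model).
def release_rate(x):
    return x[0] * x[1] / x[2]

y, sy = deterministic_uncertainty(release_rate, [2.0, 3.0, 4.0], [0.1, 0.2, 0.2])
print(y, round(sy, 4))
```

The approximation is only first-order accurate, so it is a sketch of the cost saving, not a substitute for the report's full method.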

  10. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss on the new methodologies developed around assessing uncertainty. About 20 papers were presented and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances or multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers

  11. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
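Conditional majorization generalizes classical (unconditional) majorization, which can be checked directly: p majorizes q when every partial sum of p's probabilities, sorted in descending order, dominates the corresponding partial sum for q. A minimal sketch of the classical check:

```python
def majorizes(p, q):
    # True if the sorted-descending partial sums of p dominate those of q
    # (classical majorization; the conditional version in the paper extends
    # this partial order to the memory-assisted setting).
    p, q = sorted(p, reverse=True), sorted(q, reverse=True)
    sp = sq = 0.0
    for a, b in zip(p, q):
        sp += a
        sq += b
        if sp < sq - 1e-12:
            return False
    return True

# A more peaked (less uncertain) distribution majorizes a flatter one.
print(majorizes([0.7, 0.2, 0.1], [0.5, 0.3, 0.2]))  # True
```

Monotones in the paper's sense are exactly the functions that respect this partial order, decreasing (or increasing) along it.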

  12. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  13. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  14. The internal propagation of fusion flame with the strong shock of a laser driven plasma block for advanced nuclear fuel ignition

    International Nuclear Information System (INIS)

    Malekynia, B.; Razavipour, S. S.

    2013-01-01

    An accelerated skin layer may be used to ignite solid state fuels. Detailed analyses were clarified by solving the hydrodynamic equations for nonlinear force driven plasma block ignition. In this paper, the complementary mechanisms are included for the advanced fuel ignition: external factors such as lasers, compression, shock waves, and sparks. The other category is created within the plasma fusion as reheating of an alpha particle, the Bremsstrahlung absorption, expansion, conduction, and shock waves generated by explosions. With the new condition for the control of shock waves, the spherical deuterium-tritium fuel density should be increased to 75 times that of the solid state. The threshold ignition energy flux density for advanced fuel ignition may be obtained using temperature equations, including the ones for the density profile obtained through the continuity equation and the expansion velocity for the r ≠ 0 layers. These thresholds are significantly reduced in comparison with the ignition thresholds at x = 0 for solid advanced fuels. The quantum correction for the collision frequency is applied in the case of the delay in ion heating. Under the shock wave condition, the spherical proton-boron and proton-lithium fuel densities should be increased to densities 120 and 180 times that of the solid state. These plasma compressions are achieved through a longer duration laser pulse or X-ray. (physics of gases, plasmas, and electric discharges)

  15. Results of a survey regarding irradiation of internal mammary chain in patients with breast cancer: practice is culture driven rather than evidence based.

    Science.gov (United States)

    Taghian, Alphonse; Jagsi, Reshma; Makris, Andreas; Goldberg, Saveli; Ceilley, Elizabeth; Grignon, Laurent; Powell, Simon

    2004-11-01

    To examine the self-reported practice patterns of radiation oncologists in North America and Europe regarding radiotherapy to the internal mammary lymph node chain (IMC) in breast cancer patients. A survey questionnaire was sent in 2001 to physician members of the American Society for Therapeutic Radiology and Oncology and European Society for Therapeutic Radiology and Oncology regarding their management of breast cancer. Respondents were asked whether they would treat the IMC in several clinical scenarios. A total of 435 responses were obtained from European and 702 responses from North American radiation oncologists. Respondents were increasingly likely to report IMC irradiation in scenarios with greater axillary involvement. Responses varied widely among different European regions, the United States, and Canada, reflecting substantial variation in attitudes regarding treatment of the IMC. The international patterns of variation mirror the divergent conclusions of studies conducted in the different regions, indicating that physicians may rely preferentially on evidence from local studies when making difficult treatment decisions. These variations in self-reported practice patterns indicate the need for greater data in this area, particularly from international cooperative trials. The cultural predispositions documented in this study are important to recognize, because they may continue to affect physician attitudes and practices, even as greater evidence accumulates.

  16. The Importance of Uncertainty and Sensitivity Analysis in Process-based Models of Carbon and Nitrogen Cycling in Terrestrial Ecosystems with Particular Emphasis on Forest Ecosystems — Selected Papers from a Workshop Organized by the International Society for Ecological Modelling (ISEM) at the Third Biennal Meeting of the International Environmental Modelling and Software Society (IEMSS) in Burlington, Vermont, USA, August 9-13, 2006

    Science.gov (United States)

    Larocque, Guy R.; Bhatti, Jagtar S.; Liu, Jinxun; Ascough, James C.; Gordon, Andrew M.

    2008-01-01

    Many process-based models of carbon (C) and nitrogen (N) cycles have been developed for terrestrial ecosystems, including forest ecosystems. They address many basic issues of ecosystem structure and functioning, such as the role of internal feedback in ecosystem dynamics. The critical factor in these phenomena is scale, as these processes operate at scales from the minute (e.g. particulate pollution impacts on trees and other organisms) to the global (e.g. climate change). Research efforts remain important to improve the capability of such models to better represent the dynamics of terrestrial ecosystems, including the C, nutrient (e.g. N) and water cycles. Existing models are sufficiently well advanced to help decision makers develop sustainable management policies and planning of terrestrial ecosystems, as they make realistic predictions when used appropriately. However, decision makers must be aware of their limitations by having the opportunity to evaluate the uncertainty associated with process-based models (Smith and Heath, 2001; Allen et al., 2004). The variation in scale of issues currently being addressed by modelling efforts makes the evaluation of uncertainty a daunting task.

  17. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
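The idea of propagating covariant fit-parameter uncertainties can be sketched with a toy two-parameter profile fit: sample correlated parameters from an assumed covariance matrix (via its Cholesky factor) and collect the distribution of a derived quantity. All numbers, the covariance, and the gradient formula below are illustrative assumptions, not OMFIT code:

```python
import math
import random

random.seed(3)

# Hypothetical fit parameters of a tanh-like pedestal profile: height h and
# width w, with an assumed (invented) covariance between them.
h0, w0 = 1.0, 0.05
cov = [[0.01**2, -0.3 * 0.01 * 0.005],
       [-0.3 * 0.01 * 0.005, 0.005**2]]

# 2x2 Cholesky factor for sampling correlated Gaussian parameters.
l11 = math.sqrt(cov[0][0])
l21 = cov[1][0] / l11
l22 = math.sqrt(cov[1][1] - l21**2)

def peak_gradient(h, w):
    # Derived quantity: peak profile gradient ~ h / (2w) for a tanh shape.
    return h / (2.0 * w)

samples = []
for _ in range(5000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    samples.append(peak_gradient(h0 + l11 * z1, w0 + l21 * z1 + l22 * z2))

mean = sum(samples) / len(samples)
std = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))
print(f"peak gradient = {mean:.1f} +/- {std:.1f}")
```

This Monte Carlo sampling of the parameter covariance is the alternative the abstract says is compared against linear propagation of the covariant fit uncertainties.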

  18. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    , and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing...... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  19. On uncertainty in information and ignorance in knowledge

    Science.gov (United States)

    Ayyub, Bilal M.

    2010-05-01

    This paper provides an overview of working definitions of knowledge, ignorance, information and uncertainty and summarises formalised philosophical and mathematical framework for their analyses. It provides a comparative examination of the generalised information theory and the generalised theory of uncertainty. It summarises foundational bases for assessing the reliability of knowledge constructed as a collective set of justified true beliefs. It discusses system complexity for ancestor simulation potentials. It offers value-driven communication means of knowledge and contrarian knowledge using memes and memetics.

  20. BEPU methods and combining of uncertainties

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2004-01-01

    After approval of the revised rule on the acceptance of emergency core cooling system (ECCS) performance in 1988, there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. The Code Scaling, Applicability and Uncertainty (CSAU) evaluation method was developed and demonstrated for a large-break (LB) LOCA in a pressurized water reactor. Later, several new best estimate plus uncertainty (BEPU) methods were developed around the world. The purpose of the paper is to identify and compare the statistical approaches of BEPU methods and present their important plant and licensing applications. The study showed that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted approach. The existing BEPU methods seem mature enough, while future research may focus on codes with internal assessment of uncertainty. (author)

  1. Information-theoretic approach to uncertainty importance

    International Nuclear Information System (INIS)

    Park, C.K.; Bari, R.A.

    1985-01-01

    A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory in which entropy is a measure of uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the two exponents of the entropies. For the log-normal and log-uniform distributions the importance measure is comprised of the median (central tendency) and of the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, then a different rank order would result than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results
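For lognormal distributions the measure has a closed form: the exponent of the entropy is proportional to the median times the logarithmic width, so with equal medians the relative importance reduces to the ratio of the log error factors. A sketch, assuming the common PRA convention that the error factor is defined at the 95th percentile:

```python
import math

def lognormal_entropy(median, error_factor):
    # For a lognormal, mu = ln(median); sigma = ln(EF)/1.645 when the
    # error factor EF is defined at the 95th percentile (assumed here).
    mu = math.log(median)
    sigma = math.log(error_factor) / 1.645
    # Differential entropy of the lognormal distribution.
    return mu + math.log(sigma * math.sqrt(2.0 * math.pi * math.e))

def relative_importance(med1, ef1, med2, ef2):
    # Ratio of the exponents of the entropies of the two distributions.
    return math.exp(lognormal_entropy(med1, ef1)) / math.exp(lognormal_entropy(med2, ef2))

# Two hypothetical accident sequences: equal median frequency, different EF.
r = relative_importance(1e-5, 10.0, 1e-5, 3.0)
print(round(r, 3))  # ln(10)/ln(3): spread alone drives the ranking here
```

With equal medians the ranking is driven entirely by the error factors, which is exactly the situation where entropy-based ranking diverges from ranking by central tendency alone.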

  2. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
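Replicated Latin hypercube sampling builds on the basic LHS construction: one stratified draw per equal-probability interval in each dimension, with the strata shuffled independently across dimensions. A minimal sketch (not the paper's code):

```python
import random

random.seed(0)

def latin_hypercube(n_samples, n_dims):
    # One uniform draw inside each of n_samples equal-probability strata
    # per dimension; strata are shuffled independently across dimensions
    # so every dimension is evenly covered.
    cols = []
    for _ in range(n_dims):
        strata = list(range(n_samples))
        random.shuffle(strata)
        cols.append([(s + random.random()) / n_samples for s in strata])
    return list(zip(*cols))  # n_samples points in the unit hypercube

points = latin_hypercube(10, 2)
print(points[0])
```

Replication then means repeating this construction with fresh shuffles, which is what allows the variance-ratio importance indicators to be estimated.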

  3. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    Limperopoulos, G.J.

    1995-01-01

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes Option Pricing Formula. CAPM gives a linear positive relationship between expected rate of return and risk, but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two additional explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs
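The Black-Scholes valuation mentioned can be sketched directly; in the real-options reading, S stands for the present value of the project's cash flows and K for the investment cost. The numbers below are illustrative, not taken from the report:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    # European call value; as a real option, S is the present value of the
    # project's cash flows, K the investment cost, sigma the project
    # volatility, T the time the decision can be deferred.
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Illustrative numbers only: value of the option to defer an oil investment.
value = black_scholes_call(S=100.0, K=100.0, r=0.05, sigma=0.30, T=2.0)
print(round(value, 2))
```

Note that the option value grows with volatility, which is how flexibility converts uncertainty into project value in this framework.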

  4. Uncertainties and climatic change

    International Nuclear Information System (INIS)

    De Gier, A.M.; Opschoor, J.B.; Van de Donk, W.B.H.J.; Hooimeijer, P.; Jepma, J.; Lelieveld, J.; Oerlemans, J.; Petersen, A.

    2008-01-01

    Which processes in the climate system are misunderstood? How are scientists dealing with uncertainty about climate change? What will be done with the conclusions of the recently published synthesis report of the IPCC? These and other questions were answered during the meeting 'Uncertainties and climate change' that was held on Monday 26 November 2007 at the KNAW in Amsterdam. This report is a compilation of all the presentations and provides some conclusions resulting from the discussions during this meeting.

  5. Mechanics and uncertainty

    CERN Document Server

    Lemaire, Maurice

    2014-01-01

    Science is a quest for certainty, but lack of certainty is the driving force behind all of its endeavors. This book, specifically, examines the uncertainty of technological and industrial science. Uncertainty and Mechanics studies the concepts of mechanical design in an uncertain setting and explains engineering techniques for inventing cost-effective products. Though it references practical applications, this is a book about ideas and potential advances in mechanical science.

  6. Uncertainty: lotteries and risk

    OpenAIRE

    Ávalos, Eloy

    2011-01-01

    In this paper we develop the theory of uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of lottery to formulate the axioms of the individual's preferences, and its representation through the utility function von Neumann - Morgenstern. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and finally the measures of risk aversion with monetary lotteries.

  7. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take the exact geometry into account in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  8. Justification for recommended uncertainties

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

    The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors that were assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: underestimation of the correlations existing between the results of different measurements; the presence of unrecognized systematic uncertainties in the experimental data, which can lead to biases in the evaluated data as well as to underestimations of the resulting uncertainties; and the fact that uncertainties for correlated data cannot be characterized by percentage uncertainties or variances alone. Covariances between the evaluated value at 0.2 MeV and other points obtained in model (RAC R matrix and PADE2 analytical expansion) and non-model (GMA) fits of the 6 Li(n,t) TEST1 data and the correlation coefficients are presented, and covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix) as obtained in EDA and RAC R matrix fits of the data available for reactions that pass through the formation of the 7 Li system are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: percentage uncertainties of the evaluated cross section for the 6 Li(n,t) reaction and for the 235 U(n,f) reaction; estimation given by CSEWG experts; GMA result with full GMA database, including experimental data for the 6 Li(n,t), 6 Li(n,n) and 6 Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; EDA and RAC R matrix results, respectively. Uncertainties of absolute and 252 Cf fission spectrum averaged cross section measurements, and deviations between measured and evaluated values for 235 U(n,f) cross-sections in the neutron energy range 1

  9. Production of hydrogen driven from biomass waste to power Remote areas away from the electric grid utilizing fuel cells and internal combustion engines vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Tawfik, Hazem [Farmingdale State College, NY (United States)

    2017-03-10

    Recent concerns over the security and reliability of the world’s energy supply have caused a shift toward the research and development of renewable sources. A leading renewable source has been found in the biomass gasification of biological materials derived from organic matter such as wood chips, forest debris, and farm waste, which are found in abundance in the USA. Accordingly, there is very strong interest worldwide in the development of new technologies that provide an in-depth understanding of this economically viable energy source. This work aims to couple biomass gasification with fuel cell systems as well as Internal Combustion Engines (ICE) to achieve high energy efficiency, clean environmental performance, and near-zero greenhouse gas emissions. Biomass gasification is a process that produces synthesis gas (syngas), containing 19% hydrogen and 20% carbon monoxide, from inexpensive organic waste. This project's main goal is to provide cost-effective energy to the public utilizing remote farms’ waste and landfill recycling areas.

  10. Corporate income taxation uncertainty and foreign direct investment

    OpenAIRE

    Zagler, Martin; Zanzottera, Cristiana

    2012-01-01

    This paper analyzes the effects of legal uncertainty around corporate income taxation on foreign direct investment (FDI). Legal uncertainty can take many forms: double tax agreements, different types of legal systems and corruption. We test the effect of legal uncertainty on foreign direct investment with an international panel. We find that an increase in the ratio of the statutory corporate income tax rate of the destination relative to the source country exhibits a negati...

  11. Dealing with exploration uncertainties

    International Nuclear Information System (INIS)

    Capen, E.

    1992-01-01

    Exploration for oil and gas should fulfill the most adventurous in their quest for excitement and surprise. This paper tries to cover that tall order. The authors will touch on the magnitude of the uncertainty (which is far greater than in most other businesses), the effects of not knowing target sizes very well, how to build uncertainty into analyses naturally, how to tie reserves and chance estimates to economics, and how to look at the portfolio effect of an exploration program. With no apologies, the authors will be using a different language for some readers - the language of uncertainty, which means probability and statistics. These tools allow one to combine largely subjective exploration information with the more analytical data from the engineering and economic side

  12. Uncertainty in artificial intelligence

    CERN Document Server

    Levitt, TS; Lemmer, JF; Shachter, RD

    1990-01-01

    Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i

  13. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  14. Information-Driven Inspections

    International Nuclear Information System (INIS)

    Laughter, Mark D.; Whitaker, J. Michael; Lockwood, Dunbar

    2010-01-01

    New uranium enrichment capacity is being built worldwide in response to perceived shortfalls in future supply. To meet increasing safeguards responsibilities with limited resources, the nonproliferation community is exploring next-generation concepts to increase the effectiveness and efficiency of safeguards, such as advanced technologies to enable unattended monitoring of nuclear material. These include attribute measurement technologies, data authentication tools, and transmission and security methods. However, there are several conceptual issues with how such data would be used to improve the ability of a safeguards inspectorate such as the International Atomic Energy Agency (IAEA) to reach better safeguards conclusions regarding the activities of a State. The IAEA is pursuing the implementation of information-driven safeguards, whereby all available sources of information are used to make the application of safeguards more effective and efficient. Data from continuous, unattended monitoring systems can be used to optimize on-site inspection scheduling and activities at declared facilities, resulting in fewer, better inspections. Such information-driven inspections are the logical evolution of inspection planning - making use of all available information to enhance scheduled and randomized inspections. Data collection and analysis approaches for unattended monitoring systems can be designed to protect sensitive information while enabling information-driven inspections. A number of such inspections within a predetermined range could reduce inspection frequency while providing an equal or greater level of deterrence against illicit activity, all while meeting operator and technology holder requirements and reducing inspector and operator burden. 
Three options for using unattended monitoring data to determine an information-driven inspection schedule are to (1) send all unattended monitoring data off-site, which will require advances in data analysis techniques to
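    The deterrence argument above, that a number of randomized inspections within a predetermined range can match a fixed schedule, can be sketched with a toy detection-probability calculation. The parameters below are illustrative assumptions, not IAEA figures: if each monitoring period is independently selected for inspection with probability p, an illicit activity that is detectable over k periods is caught with probability 1 - (1 - p)^k.

```python
def detection_probability(p, k):
    """Chance that at least one of k vulnerable periods is inspected,
    when each period is independently inspected with probability p."""
    return 1.0 - (1.0 - p) ** k

def min_inspection_rate(k, target, step=0.001):
    """Smallest per-period inspection probability meeting the detection target."""
    p = 0.0
    while detection_probability(p, k) < target and p < 1.0:
        p += step
    return round(p, 3)

# Hypothetical numbers: a diversion detectable over 6 unattended-monitoring
# periods, with a 95% required detection probability.
p = min_inspection_rate(k=6, target=0.95)
print(p)  # 0.394
```

Under these invented numbers, randomly inspecting roughly 4 periods in 10 achieves the 95% detection goal, without the predictability of a fixed schedule.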

  15. A rigorous treatment of uncertainty quantification for Silicon damage metrics

    International Nuclear Information System (INIS)

    Griffin, P.

    2016-01-01

    This report summarizes the contributions made by Sandia National Laboratories in support of the International Atomic Energy Agency (IAEA) Nuclear Data Section (NDS) Technical Meeting (TM) on Nuclear Reaction Data and Uncertainties for Radiation Damage. This work focused on a rigorous treatment of the uncertainties affecting the characterization of the displacement damage seen in silicon semiconductors. (author)

  16. A hierarchical approach to multi-project planning under uncertainty

    NARCIS (Netherlands)

    Leus, R.; Wullink, Gerhard; Hans, Elias W.; Herroelen, W.

    2004-01-01

    We survey several viewpoints on the management of the planning complexity of multi-project organisations under uncertainty. A positioning framework is proposed to distinguish between different types of project-driven organisations, which is meant to aid project management in the choice between the

  17. A hierarchical approach to multi-project planning under uncertainty

    NARCIS (Netherlands)

    Hans, Elias W.; Herroelen, W.; Wullink, Gerhard; Leus, R.

    2007-01-01

    We survey several viewpoints on the management of the planning complexity of multi-project organisations under uncertainty. Based on these viewpoints we propose a positioning framework to distinguish between different types of project-driven organisations. This framework is meant to aid project

  18. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called ''conservative'' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the ''reasonable assurance'' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  19. Robustness of dynamic systems with parameter uncertainties

    CERN Document Server

    Balemi, S; Truöl, W

    1992-01-01

    Robust Control is one of the fastest growing and most promising areas of research today. In many practical systems there exist uncertainties which have to be considered in the analysis and design of control systems. In the last decade, methods such as H∞- and ℓ1-optimal control were developed for dealing with dynamic systems with unstructured uncertainties. For systems with parameter uncertainties, the seminal paper of V. L. Kharitonov has triggered a large amount of very promising research. An international workshop dealing with all aspects of robust control was successfully organized by S. P. Bhattacharyya and L. H. Keel in San Antonio, Texas, USA in March 1991. We organized the second international workshop in this area in Ascona, Switzerland in April 1992. However, this second workshop was restricted to robust control of dynamic systems with parameter uncertainties, with the objective of concentrating on some aspects of robust control. This book contains a collection of papers presented at the International W...

  20. Uncertainties in Transport Project Evaluation: Editorial

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Nielsen, Otto Anker

    2015-01-01

    This special issue of the European Journal of Transport Infrastructure Research (EJTIR), containing five scientific papers, is the result of an open call for papers at the 1st International Conference on Uncertainties in Transport Project Evaluation that took place at the Technical University of Denmark, September 2013. The conference was held under the auspices of the project ‘Uncertainties in transport project evaluation’ (UNITE), a research project (2009-2014) financed by the Danish Strategic Research Agency. UNITE was coordinated by the Department of Transport...

  1. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    International Nuclear Information System (INIS)

    Kirchner, G.; Peterson, R.

    1996-11-01

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test the user's influence on model predictions on a more systematic basis than had been done before. The main goals were as follows: to compare differences between predictions from different people, all using the same model and the same scenario description, with the statistical uncertainties calculated by the model; to investigate the main reasons for different interpretations by users; and to create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn, some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk, probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the
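    The 95% confidence intervals discussed here come from propagating user-supplied parameter distributions through the food chain model. That step can be mimicked with a minimal Monte Carlo sketch for a toy deposition-pasture-milk chain; the model form, parameter ranges, and units below are invented for illustration, and different users choosing different distributions would obtain different intervals.

```python
import random
random.seed(1)

def milk_concentration(deposition, interception, transfer_to_milk):
    """Toy food-chain model: pasture contamination = deposition times an
    interception fraction; milk = pasture times a feed-to-milk transfer."""
    return deposition * interception * transfer_to_milk

# Hypothetical parameter distributions one user might choose (uniform ranges
# here for simplicity; a real assessment would justify each distribution).
samples = []
for _ in range(10000):
    dep = random.uniform(800.0, 1200.0)    # deposition, Bq/m^2
    f_int = random.uniform(0.2, 0.6)       # interception fraction
    f_m = random.uniform(0.002, 0.006)     # transfer coefficient, d/L
    samples.append(milk_concentration(dep, f_int, f_m))

samples.sort()
lower, upper = samples[249], samples[9749]   # empirical 95% interval
print(lower < sum(samples) / len(samples) < upper)  # True
```

Comparing such intervals across users, as the Working Group did, reveals how much of the spread is due to user interpretation rather than the model itself.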

  2. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Kirchner, G. [Univ. of Bremen (Germany)]; Peterson, R. [AECL, Chalk River, ON (Canada)] [and others]

    1996-11-01

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test the user's influence on model predictions on a more systematic basis than had been done before. The main goals were as follows: to compare differences between predictions from different people, all using the same model and the same scenario description, with the statistical uncertainties calculated by the model; to investigate the main reasons for different interpretations by users; and to create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn, some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: For any set of predictions, the variation in best estimates was greater than one order of magnitude. Often the range increased from deposition to pasture to milk, probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  3. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  4. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling

  5. Risks, uncertainty, vagueness

    International Nuclear Information System (INIS)

    Haefele, W.; Renn, O.; Erdmann, G.

    1990-01-01

    The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability especially in the context of political decision finding. (DG) [de

  6. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  7. Different approaches to overcome uncertainties of production systems

    Science.gov (United States)

    Azizi, Amir; Sorooshian, Shahryar

    2015-05-01

    This study presents a comprehensive review of the understanding of uncertainty and the current approaches that have been proposed to handle uncertainties in production systems. The paper classifies the proposed approaches into 11 groups, drawing on 114 scholarly papers from various international journals. It adds the latest findings to the current body of knowledge on production uncertainties and thus serves researchers and practitioners as an easy reference in this area. The review also provides an excellent starting point for further studies on how to deal with the uncertainties of production systems.

  8. Do oil shocks predict economic policy uncertainty?

    Science.gov (United States)

    Rehman, Mobeen Ur

    2018-05-01

    Oil price fluctuations have an influential role in the global economic policies of developed as well as emerging countries. I investigate the role of international oil prices, disintegrated into structural (i) oil supply shocks, (ii) aggregate demand shocks and (iii) oil market specific demand shocks, based on the work of Kilian (2009), using a structural VAR framework on the economic policy uncertainty of the sampled markets. Economic policy uncertainty, due to its non-linear behavior, is modeled in a regime-switching framework with the disintegrated structural oil shocks. Our results highlight that Indian, Spanish and Japanese economic policy uncertainty responds to global oil price shocks, whereas aggregate demand shocks fail to induce any change. Oil-specific demand shocks are significant only for China and India in the high-volatility state.
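    Kilian-style identification recovers the three structural shocks from reduced-form VAR residuals via a recursive (Cholesky) ordering: with the residual covariance factored as Sigma = L L^T, the structural shocks e solve L e = u for each period's residual vector u. A minimal pure-Python sketch of that step, with an invented residual covariance matrix, might look like:

```python
import math

def cholesky(a):
    """Lower-triangular L with L L^T = a, for a small SPD matrix."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(a[i][i] - s) if i == j else (a[i][j] - s) / L[j][j]
    return L

def forward_solve(L, u):
    """Solve L e = u for e: recovers structural shocks from residuals."""
    e = []
    for i in range(len(u)):
        e.append((u[i] - sum(L[i][k] * e[k] for k in range(i))) / L[i][i])
    return e

# Hypothetical residual covariance for (oil supply, aggregate demand,
# oil-specific demand) in Kilian's recursive ordering; not estimated data.
sigma = [[1.0, 0.3, 0.2],
         [0.3, 1.0, 0.4],
         [0.2, 0.4, 1.0]]
L = cholesky(sigma)
shocks = forward_solve(L, [0.5, -0.2, 0.9])  # one period's residual vector
print(len(shocks))  # 3
```

The recursive ordering is what lets supply shocks affect all three variables contemporaneously while oil-specific demand shocks affect only the last; in practice one would estimate the VAR and the regime-switching model with a statistics package rather than by hand.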

  9. Zimbabwe: Internally or Externally Driven Meltdown

    Science.gov (United States)

    2010-06-01

    ...in front of parliament” were removed by “riot police us[ing] dogs, batons and tear gas.” Though the civil society strikes and protests in this period... Mugabe recognized ZAPU’s unwillingness to be muzzled in its opposition to his policies. He then moved to marginalize ZAPU, even in its own home... similar response to other protests, with police breaking up any protests using dogs, batons, or clubs as necessary to disperse protesters whether they

  10. The Uncertainty estimation of Alanine/ESR dosimetry

    International Nuclear Information System (INIS)

    Kim, Bo Rum; An, Jin Hee; Choi, Hoon; Kim, Young Ki

    2008-01-01

    Machinery, tools, cables, etc. in a nuclear power plant operate in a very severe environment. Measuring the actual dose is needed to extend the life expectancy of this machinery, tooling, and cabling. Therefore, we estimated the (gamma ray) dose at Wolsong nuclear power division 1 over three years using dose estimation technology. The dose estimation technology was secured by ESR (Electron Spin Resonance) dose estimation using regression analysis. We estimated uncertainty to secure the reliability of the results; the uncertainty estimate makes it possible to judge the reliability of the measurement results. The uncertainty estimation followed the unified international guide, GUM (Guide to the Expression of Uncertainty in Measurement), published by the International Organization for Standardization (ISO) in 1993. In this study the uncertainties of the e-scan and EMX ESR instruments were evaluated and compared. Based on these results, it will improve the reliability of measurement
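    A minimal sketch of the GUM recipe referred to here: combine independent uncertainty components in quadrature (the law of propagation of uncertainty), then multiply by a coverage factor k = 2 for an approximately 95% expanded uncertainty. The budget entries below are invented for illustration, not the actual e-scan/EMX figures.

```python
import math

def combined_standard_uncertainty(components):
    """GUM law of propagation for independent inputs: quadrature sum of
    (sensitivity coefficient * standard uncertainty) terms."""
    return math.sqrt(sum((c * u) ** 2 for c, u in components))

# Hypothetical ESR dose-estimate budget: (sensitivity, standard uncertainty)
components = [
    (1.0, 0.8),   # calibration curve fit (Gy)
    (1.0, 0.6),   # spectrometer repeatability (Gy)
    (1.0, 0.3),   # dosimeter positioning (Gy)
]
u_c = combined_standard_uncertainty(components)
U = 2.0 * u_c   # expanded uncertainty, coverage factor k = 2 (~95 %)
print(round(u_c, 3), round(U, 3))  # 1.044 2.088
```

Note how the largest component dominates the quadrature sum: halving the 0.3 Gy term would barely change the result, which is why uncertainty budgets focus effort on the biggest contributors.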

  11. Implicit knowledge of visual uncertainty guides decisions with asymmetric outcomes

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2008-01-01

    ...under conditions of uncertainty. Here we show that human observers performing a simple visual choice task under an externally imposed loss function approach the optimal strategy, as defined by Bayesian probability and decision theory (Berger, 1985; Cox, 1961). In concert with earlier work, this suggests that observers possess a model of their internal uncertainty and can utilize this model in the neural computations that underlie their behavior (Knill & Pouget, 2004). In our experiment, optimal behavior requires that observers integrate the loss function with an estimate of their internal uncertainty rather... ...are pre-existing, widespread, and can be propagated to decision-making areas of the brain.
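    The optimal strategy under an asymmetric loss function can be sketched numerically: given Gaussian internal uncertainty, the expected-loss-minimizing choice shifts away from the posterior mean toward the cheaper error (for asymmetric linear loss, to a quantile of the posterior). The loss costs and posterior parameters below are illustrative assumptions, not the paper's fitted values.

```python
import random
random.seed(0)

def expected_loss(choice, samples, cost_under, cost_over):
    """Mean asymmetric linear loss of a choice over posterior samples."""
    total = 0.0
    for s in samples:
        total += cost_under * (s - choice) if s > choice else cost_over * (choice - s)
    return total / len(samples)

# Observer's internal uncertainty about the stimulus: a Gaussian posterior
# (mean 0, sd 1 -- invented for demonstration).
samples = [random.gauss(0.0, 1.0) for _ in range(5000)]

# Overshooting costs 3x more than undershooting, so the optimal choice
# shifts below the posterior mean (toward the 25th percentile).
grid = [i / 40.0 for i in range(-80, 81)]
best = min(grid, key=lambda c: expected_loss(c, samples, cost_under=1.0, cost_over=3.0))
print(best < 0.0)  # True
```

An observer who ignored their uncertainty and simply reported the posterior mean would incur a higher expected loss here, which is the behavioral signature the experiment tests for.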

  12. Uncertainty in adaptive capacity

    International Nuclear Information System (INIS)

    Neil Adger, W.; Vincent, K.

    2005-01-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. (authors)

  13. Uncertainties about climate

    International Nuclear Information System (INIS)

    Laval, Katia; Laval, Guy

    2013-01-01

    Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by the opponents of the global warming theory to call into question the estimations of its future consequences. Is it legitimate to predict the future using past climate data (well documented up to 100,000 years BP) or the climates of other planets, taking into account the imprecision of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model such a huge and interwoven system, for which any exact description has become impossible? Why do water and precipitation play such an important role in local and global forecasts, and how should they be treated? This book, written by two physicists, answers these delicate questions simply, in order to give anyone the possibility to form their own opinion about global warming and the need to act rapidly

  14. Uncertainties in projecting climate-change impacts in marine ecosystems

    DEFF Research Database (Denmark)

    Payne, Mark; Barange, Manuel; Cheung, William W. L.

    2016-01-01

    Projections of the impacts of climate change on marine ecosystems are a key prerequisite for the planning of adaptation strategies, yet they are inevitably associated with uncertainty. Identifying, quantifying, and communicating this uncertainty is key to both evaluating the risk associated with a projection and building confidence in its robustness. We review how uncertainties in such projections are handled in marine science. We employ an approach developed in climate modelling by breaking uncertainty down into (i) structural (model) uncertainty, (ii) initialization and internal variability... and highlight the opportunities and challenges associated with doing a better job. We find that even within a relatively small field such as marine science, there are substantial differences between subdisciplines in the degree of attention given to each type of uncertainty. We find that initialization...
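    The decomposition into structural (model) uncertainty and internal variability can be illustrated with the law of total variance on a toy ensemble: the variance of the model means gives the structural component, and the mean within-model variance gives the internal-variability component. The ensemble values below are invented for illustration.

```python
def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Hypothetical ensemble: rows = models (structural uncertainty),
# columns = initial-condition members (internal variability).
ensemble = [
    [2.1, 2.3, 1.9],   # model A projections of, say, biomass change (%)
    [3.0, 3.2, 2.8],   # model B
    [1.2, 1.0, 1.4],   # model C
]

model_means = [mean(row) for row in ensemble]
between_model = variance(model_means)                     # structural component
within_model = mean([variance(row) for row in ensemble])  # internal variability
total = variance([x for row in ensemble for x in row])
print(abs(total - (between_model + within_model)) < 1e-9)  # True
```

In this invented ensemble the structural component dominates, which would suggest that adding models, not initial-condition members, is the better investment; the reverse pattern would argue for larger initial-condition ensembles.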

  15. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general publication, known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from the actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration

  16. The uncertainty principle

    International Nuclear Information System (INIS)

    Martens, Hans.

    1991-01-01

    The subject of this thesis is the uncertainty principle (UP). The UP is one of the most characteristic points of difference between quantum and classical mechanics. The starting point of this thesis is the work of Niels Bohr, which is both discussed and analyzed. For the discussion of the different aspects of the UP, the formalism of Davies and Ludwig is used instead of the more commonly used formalism of von Neumann and Dirac. (author). 214 refs.; 23 figs

  17. Uncertainty in artificial intelligence

    CERN Document Server

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

    This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty.A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language und

  18. Decision Making Under Uncertainty

    Science.gov (United States)

    2010-11-01

    A sound approach to rational decision making requires a decision maker to establish decision objectives, identify alternatives, and evaluate those...often violate the axioms of rationality when making decisions under uncertainty. The systematic description of such observations may lead to the...which leads to “anchoring” on the initial value. The fact that individuals have been shown to deviate from rationality when making decisions

  19. Economic uncertainty principle?

    OpenAIRE

    Alexander Harin

    2006-01-01

    The economic principle of (hidden) uncertainty is presented. New probability formulas are offered. Examples of solutions of three types of fundamental problems are reviewed.

  20. Citizen Candidates Under Uncertainty

    OpenAIRE

    Eguia, Jon X.

    2005-01-01

    In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we te...

  1. Understanding Climate Uncertainty with an Ocean Focus

    Science.gov (United States)

    Tokmakian, R. T.

    2009-12-01

    Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation, and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and methods to quantify the reliability of a simulation. If quantitative uncertainty questions of the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007]. The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in

  2. Integrating uncertainties for climate change mitigation

    Science.gov (United States)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping global average temperature increase to below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known, but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, uncertainties resulting from our incomplete knowledge about how the climate system precisely reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), which technologies will be available (technological uncertainty and choices), when we choose to start acting globally on climate change (political choices), and how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by

  3. Calibration Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
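
    The squared-difference formulation described in the abstract can be sketched as follows. This is a minimal illustration only: the one-parameter linear model y = theta * x, the data values, and all function names are hypothetical stand-ins, not the report's actual model or code.

```python
# Minimal sketch of deterministic model calibration: find the parameter
# theta minimizing the squared difference between model output and data.
# The linear model y = theta * x is a hypothetical stand-in.

def model(theta, x):
    return [theta * xi for xi in x]

def sum_squared_error(theta, x, y_obs):
    # Objective: squared difference between predicted and observed data.
    return sum((m - y) ** 2 for m, y in zip(model(theta, x), y_obs))

def calibrate(x, y_obs):
    # Closed-form least-squares solution for this one-parameter model.
    return sum(xi * yi for xi, yi in zip(x, y_obs)) / sum(xi * xi for xi in x)

x = [1.0, 2.0, 3.0]
y_obs = [2.1, 3.9, 6.2]  # noisy observations of roughly y = 2x
theta_hat = calibrate(x, y_obs)
```

    Note that this treats the model as the exact ("true") representation of reality; the CUU approach discussed in the report instead accounts for error in both the model and the data.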

  4. Participation under Uncertainty

    International Nuclear Information System (INIS)

    Boudourides, Moses A.

    2003-01-01

    This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate for policy studies are various theories of social complexity. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an increasing uncertainty. But there are different categories and models of uncertainties implying a variety of configurations of policy processes. A particular role in all of them is played by expertise whose democratization is an often-claimed imperative nowadays. Moreover, we discuss how different participatory arrangements are shaped into instruments of policy-making and framing regulatory processes. As participation necessitates and triggers deliberation, we proceed to examine the role and the barriers of deliberativeness. Finally, we conclude by referring to some critical views about the ultimate assumptions of recent European policy frameworks and the conceptions of civic participation and politicization that they invoke

  5. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median as well as their ratios. The report concludes that provisionally, due to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean for comparison of the estimated doses with acceptance criteria. In any case, the results obtained through Uncertainty Analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site
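
    The Monte Carlo workflow the abstract describes (sample inputs spanning orders of magnitude, run the model, characterize the output by mean, median and a high percentile) can be sketched as below. The dose model, parameter names and ranges are invented for illustration and are not taken from the paper.

```python
import math
import random
import statistics

# Sketch of Monte Carlo uncertainty propagation: sample input parameters
# from wide (log-uniform) distributions spanning orders of magnitude, run
# a simple hypothetical dose model for each sample, and characterize the
# output by its mean, median, and 90th percentile.
random.seed(42)

def sample_log_uniform(lo, hi):
    # Parameters ranging over orders of magnitude are sampled on a log scale.
    return 10 ** random.uniform(math.log10(lo), math.log10(hi))

def dose_model(k, q):
    # Hypothetical response: dose proportional to release rate over retention.
    return q / k

doses = []
for _ in range(10000):
    k = sample_log_uniform(1e2, 1e4)    # retention factor (illustrative)
    q = sample_log_uniform(1e-3, 1e-1)  # release rate (illustrative)
    doses.append(dose_model(k, q))

mean_dose = statistics.mean(doses)
median_dose = statistics.median(doses)
p90_dose = sorted(doses)[int(0.9 * len(doses))]
```

    For such right-skewed outputs the mean exceeds the median, which is one reason robust statistics like the 90th percentile are attractive for comparison against acceptance criteria.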

  6. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
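
    The contrast between derivative-based and sampling-based propagation can be illustrated with a first-order sketch. The model f(a, b) = a * b**2, the nominal values, and the standard deviations below are assumptions for illustration, not the borehole model of the paper (and first-order propagation is only the simplest member of the DUA family described there).

```python
import random
import statistics

# Compare two ways of propagating input uncertainty through a nonlinear
# model: (1) deterministically, using analytic derivative (sensitivity)
# information, and (2) statistically, by Monte Carlo sampling.

def f(a, b):
    # Hypothetical nonlinear model standing in for the borehole-flow model.
    return a * b ** 2

# Nominal input values and standard deviations (assumed, for illustration).
a0, b0 = 2.0, 3.0
sig_a, sig_b = 0.1, 0.1

# (1) Deterministic first-order propagation from sensitivities:
# df/da = b**2 and df/db = 2*a*b at the nominal point.
dfda = b0 ** 2
dfdb = 2 * a0 * b0
sigma_det = (dfda ** 2 * sig_a ** 2 + dfdb ** 2 * sig_b ** 2) ** 0.5

# (2) Statistical propagation by sampling the inputs.
random.seed(0)
samples = [f(random.gauss(a0, sig_a), random.gauss(b0, sig_b))
           for _ in range(20000)]
sigma_mc = statistics.stdev(samples)
```

    The deterministic estimate needs only the derivatives at the nominal point (a handful of model runs), whereas the sampling estimate needs thousands of runs to converge, which is the efficiency argument the abstract makes.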

  7. Acquired experience on organizing 3D S.UN.COP: international course to support nuclear license by user training in the areas of scaling, uncertainty, and 3D thermal-hydraulics/neutron-kinetics coupled codes

    Energy Technology Data Exchange (ETDEWEB)

    Petruzzi, Alessandro; D'Auria, Francesco [University of Pisa, San Piero a Grado (Italy). Nuclear Research Group San Piero a Grado (GRNSPG); Galetti, Regina, E-mail: regina@cnen.gov.b [National Commission for Nuclear Energy (CNEN), Rio de Janeiro, RJ (Brazil); Bajs, Tomislav [University of Zagreb (Croatia). Fac. of Electrical Engineering and Computing. Dept. of Power Systems; Reventos, Francesc [Technical University of Catalonia, Barcelona (Spain). Dept. of Physics and Nuclear Engineering

    2011-07-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers, vendors, and research organizations. The computer code user represents a source of uncertainty that may significantly affect the results of system code calculations. Code user training and qualification represent an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes the experience in applying a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In addition, this paper presents the organization and the main features of the 3D S.UN.COP (scaling, uncertainty, and 3D coupled code calculations) seminars during which particular emphasis is given to practical applications in connection with the licensing process of best estimate plus uncertainty methodologies, showing the designer, utility and regulatory approaches. (author)

  8. Acquired experience on organizing 3D S.UN.COP: international course to support nuclear license by user training in the areas of scaling, uncertainty, and 3D thermal-hydraulics/neutron-kinetics coupled codes

    International Nuclear Information System (INIS)

    Petruzzi, Alessandro; D'Auria, Francesco; Galetti, Regina; Bajs, Tomislav; Reventos, Francesc

    2011-01-01

    Thermal-hydraulic system computer codes are extensively used worldwide for analysis of nuclear facilities by utilities, regulatory bodies, nuclear power plant designers, vendors, and research organizations. The computer code user represents a source of uncertainty that may significantly affect the results of system code calculations. Code user training and qualification represent an effective means for reducing the variation of results caused by the application of the codes by different users. This paper describes the experience in applying a systematic approach to training code users who, upon completion of the training, should be able to perform calculations making the best possible use of the capabilities of best estimate codes. In addition, this paper presents the organization and the main features of the 3D S.UN.COP (scaling, uncertainty, and 3D coupled code calculations) seminars during which particular emphasis is given to practical applications in connection with the licensing process of best estimate plus uncertainty methodologies, showing the designer, utility and regulatory approaches. (author)

  9. Regime-dependent forecast uncertainty of convective precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Keil, Christian; Craig, George C. [Muenchen Univ. (Germany). Meteorologisches Inst.

    2011-04-15

    Forecast uncertainty of convective precipitation is influenced by all scales, but in different ways in different meteorological situations. Forecasts of the high resolution ensemble prediction system COSMO-DE-EPS of Deutscher Wetterdienst (DWD) are used to examine the dominant sources of uncertainty of convective precipitation. A validation with radar data using traditional as well as spatial verification measures highlights differences in precipitation forecast performance in differing weather regimes. When the forecast uncertainty can primarily be associated with local, small-scale processes individual members run with the same variation of the physical parameterisation driven by different global models outperform all other ensemble members. In contrast when the precipitation is governed by the large-scale flow all ensemble members perform similarly. Application of the convective adjustment time scale confirms this separation and shows a regime-dependent forecast uncertainty of convective precipitation. (orig.)

  10. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the

  11. Uncertainty analysis in the task of individual monitoring data

    International Nuclear Information System (INIS)

    Molokanov, A.; Badjin, V.; Gasteva, G.; Antipin, E.

    2003-01-01

    Assessment of internal doses is an essential component of individual monitoring programmes for workers and consists of two stages: individual monitoring measurements and interpretation of the monitoring data in terms of annual intake and/or annual internal dose. The overall uncertainty in assessed dose is a combination of the uncertainties in these stages. An algorithm and a computer code were developed for estimating the uncertainty in the assessment of internal dose in the task of individual monitoring data interpretation. Two main influencing factors are analysed in this paper: the unknown time of the exposure and variability of bioassay measurements. The aim of this analysis is to show that the algorithm is applicable in designing an individual monitoring programme for workers so as to guarantee that the individual dose calculated from individual monitoring measurements does not exceed a required limit with a certain confidence probability. (author)

  12. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principle and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example which he generalised, treating the variability uncertainty by probability theory and the lack-of-knowledge uncertainty by fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiable and illegitimate precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability and lack of knowledge increases as the problem becomes more complex in terms of number of parameters or time steps, and that it is necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory
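
    The distinction the abstract draws can be sketched in a few lines: a "variability" input is sampled probabilistically, while a "lack-of-knowledge" input is carried through as an interval (the simplest fuzzy set, a single alpha-cut), so the output is a random interval rather than a single random number. The model y = a + b and all numbers below are hypothetical.

```python
import random
import statistics

# Two-method propagation sketch: probability theory for variability,
# interval arithmetic (a one-level fuzzy set) for lack of knowledge.
random.seed(1)

b_lo, b_hi = 0.5, 1.5  # ignorance about b: only bounds are known

low_results, high_results = [], []
for _ in range(5000):
    a = random.gauss(10.0, 1.0)    # variability of a, sampled probabilistically
    low_results.append(a + b_lo)   # lower envelope of y = a + b
    high_results.append(a + b_hi)  # upper envelope of y = a + b

# The mean of y is only bracketed, not pinned to a single value:
mean_lo = statistics.mean(low_results)
mean_hi = statistics.mean(high_results)
```

    Collapsing the interval on b into a single distribution would yield one precise mean, which is exactly the kind of unjustifiably precise answer the author cautions against.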

  13. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  14. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  15. Managing uncertainty in potential supplier identification

    OpenAIRE

    Ye, Yun; Jankovic, Marija; Okudan Kremer, Gül E.; Bocquet, Jean-Claude

    2014-01-01

    International audience; As a benefit of modularization of complex systems, Original Equipment Manufacturers (OEMs) can choose suppliers in a less constricted way when faced with new or evolving requirements. Before a supplier is selected for each module, a group of potential suppliers should be identified in order to control the uncertainty along with other performance measures of the new system development. In modular design, because suppliers are more involved in system development, the pot...

  16. Uncertainty and the de Finetti tables

    OpenAIRE

    Baratgin, Jean; Over, David; Politzer, Guy

    2013-01-01

    International audience; The new paradigm in the psychology of reasoning adopts a Bayesian, or probabilistic, model for studying human reasoning. Contrary to the traditional binary approach based on truth functional logic, with its binary values of truth and falsity, a third value that represents uncertainty can be introduced in the new paradigm. A variety of three-valued truth table systems are available in the formal literature, including one proposed by de Finetti. We examine the descripti...

  17. Traceability and uncertainty estimation in coordinate metrology

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Savio, Enrico; De Chiffre, Leonardo

    2001-01-01

    National and international standards have defined performance verification procedures for coordinate measuring machines (CMMs) that typically involve their ability to measure calibrated lengths and to a certain extent form. It is recognised that, without further analysis or testing, these results...... are required. Depending on the requirements for uncertainty level, different approaches may be adopted to achieve traceability. Especially in the case of complex measurement situations and workpieces the procedures are not trivial. This paper discusses the establishment of traceability in coordinate metrology...

  18. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    NARCIS (Netherlands)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if

  19. Investment and uncertainty

    DEFF Research Database (Denmark)

    Greasley, David; Madsen, Jakob B.

    2006-01-01

    A severe collapse of fixed capital formation distinguished the onset of the Great Depression from other investment downturns between the world wars. Using a model estimated for the years 1890-2000, we show that the expected profitability of capital measured by Tobin's q, and the uncertainty...... surrounding expected profits indicated by share price volatility, were the chief influences on investment levels, and that heightened share price volatility played the dominant role in the crucial investment collapse in 1930. Investment did not simply follow the downward course of income at the onset...

  20. Optimization under Uncertainty

    KAUST Repository

    Lopez, Rafael H.

    2016-01-06

    The goal of this poster is to present the main approaches to optimization of engineering systems in the presence of uncertainties. We begin by giving an insight into robust optimization. Next, we detail how to deal with probabilistic constraints in optimization, the so-called reliability-based design. Subsequently, we present the risk optimization approach, which includes the expected costs of failure in the objective function. After the basic description of each approach is given, the projects developed by CORE are presented. Finally, the main current topic of research of CORE is described.

  1. Optimizing production under uncertainty

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses potential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept...... of state-contingent production functions and a definition of inputs including both sort of input, activity and allocation technology. It also analyses production decisions where production is combined with trading in state-contingent claims such as insurance contracts. The final part discusses

  2. Commonplaces and social uncertainty

    DEFF Research Database (Denmark)

    Lassen, Inger

    2008-01-01

    This article explores the concept of uncertainty in four focus group discussions about genetically modified food. In the discussions, members of the general public interact with food biotechnology scientists while negotiating their attitudes towards genetic engineering. Their discussions offer...... an example of risk discourse in which the use of commonplaces seems to be a central feature (Myers 2004: 81). My analyses support earlier findings that commonplaces serve important interactional purposes (Barton 1999) and that they are used for mitigating disagreement, for closing topics and for facilitating...

  3. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus

  4. Mathematical Analysis of Uncertainty

    Directory of Open Access Journals (Sweden)

    Angel GARRIDO

    2016-01-01

    Classical Logic showed early its insufficiencies for solving AI problems. The introduction of Fuzzy Logic aims at this problem. There has been research in the conventional Rough direction alone or in the Fuzzy direction alone and, more recently, attempts to combine both into Fuzzy Rough Sets or Rough Fuzzy Sets. We analyse some new and powerful tools in the study of uncertainty, such as Probabilistic Graphical Models, Chain Graphs, Bayesian Networks, and Markov Networks, integrating our knowledge of graphs and probability.
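
    Of the tools the abstract lists, a Bayesian network is the easiest to show in miniature. Below is a two-node network (A -> B) with made-up illustration probabilities; inference is done by direct enumeration of Bayes' rule.

```python
# Minimal two-node Bayesian network A -> B with hypothetical probabilities.

p_a = 0.3                              # prior P(A = true)
p_b_given_a = {True: 0.9, False: 0.2}  # conditional table P(B = true | A)

# Marginal P(B = true), summing over the states of A.
p_b = p_a * p_b_given_a[True] + (1 - p_a) * p_b_given_a[False]

# Posterior P(A = true | B = true) by Bayes' rule.
p_a_given_b = p_a * p_b_given_a[True] / p_b
```

    Observing B raises the belief in A above its prior, which is the basic update that larger networks repeat over many variables.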

  5. Nuclear data sensitivity/uncertainty analysis for XT-ADS

    International Nuclear Information System (INIS)

    Sugawara, Takanori; Sarotto, Massimo; Stankovskiy, Alexey; Van den Eynde, Gert

    2011-01-01

    Highlights: → The sensitivity and uncertainty analyses were performed to comprehend the reliability of the XT-ADS neutronic design. → The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% by the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. → When the target accuracy of 0.3%Δk for the criticality was considered, the uncertainties did not satisfy it. → To achieve this accuracy, the uncertainties should be improved by experiments under an adequate condition. - Abstract: The XT-ADS, an accelerator-driven system for an experimental demonstration, has been investigated in the framework of IP EUROTRANS FP6 project. In this study, the sensitivity and uncertainty analyses were performed to comprehend the reliability of the XT-ADS neutronic design. For the sensitivity analysis, it was found that the sensitivity coefficients were significantly different by changing the geometry models and calculation codes. For the uncertainty analysis, it was confirmed that the uncertainties deduced from the covariance data varied significantly by changing them. The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% by the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. When the target accuracy of 0.3%Δk for the criticality was considered, the uncertainties did not satisfy it. To achieve this accuracy, the uncertainties should be improved by experiments under an adequate condition.

  6. Investment, regulation, and uncertainty

    Science.gov (United States)

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

  7. Probabilistic Mass Growth Uncertainties

    Science.gov (United States)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  8. Embracing uncertainty in applied ecology.

    Science.gov (United States)

    Milner-Gulland, E J; Shea, K

    2017-12-01

    Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications. Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.

  9. Oil price uncertainty in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States)]; Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)]

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  10. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
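The contrast drawn here between interval analysis and probability for epistemic uncertainty can be illustrated with a toy margin calculation. This is only a sketch with invented numbers, not one of the notional QMU examples from the presentation: the interval treatment propagates bounds alone, while the probabilistic treatment imposes a uniform distribution on the same bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical performance model: margin = capacity - load.
def margin(capacity, load):
    return capacity - load

# Epistemic knowledge: capacity lies somewhere in [8, 10], load in [5, 7].
# Interval analysis propagates only the bounds, asserting no distribution.
cap_lo, cap_hi = 8.0, 10.0
load_lo, load_hi = 5.0, 7.0
margin_lo = margin(cap_lo, load_hi)   # worst case
margin_hi = margin(cap_hi, load_lo)   # best case

# A probabilistic treatment adds structure the intervals never claimed:
# here a uniform distribution over each interval, sampled by Monte Carlo.
cap = rng.uniform(cap_lo, cap_hi, 100_000)
load = rng.uniform(load_lo, load_hi, 100_000)
m = margin(cap, load)

print(f"interval bound on margin: [{margin_lo}, {margin_hi}]")
print(f"sampled margin range:     [{m.min():.2f}, {m.max():.2f}]")
```

The interval result states only that the margin lies in [1, 5]; the probabilistic version additionally assigns (assumed) likelihoods within that range, which is exactly the extra commitment the alternative representations try to avoid.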

  11. Optimizing integrated airport surface and terminal airspace operations under uncertainty

    Science.gov (United States)

    Bosson, Christabelle S.

    In airports and surrounding terminal airspaces, the integration of surface, arrival and departure scheduling and routing have the potential to improve the operations efficiency. Moreover, because both the airport surface and the terminal airspace are often altered by random perturbations, the consideration of uncertainty in flight schedules is crucial to improve the design of robust flight schedules. Previous research mainly focused on independently solving arrival scheduling problems, departure scheduling problems and surface management scheduling problems and most of the developed models are deterministic. This dissertation presents an alternate method to model the integrated operations by using a machine job-shop scheduling formulation. A multistage stochastic programming approach is chosen to formulate the problem in the presence of uncertainty and candidate solutions are obtained by solving sample average approximation problems with finite sample size. The developed mixed-integer-linear-programming algorithm-based scheduler is capable of computing optimal aircraft schedules and routings that reflect the integration of air and ground operations. The assembled methodology is applied to a Los Angeles case study. To show the benefits of integrated operations over First-Come-First-Served, a preliminary proof-of-concept is conducted for a set of fourteen aircraft evolving under deterministic conditions in a model of the Los Angeles International Airport surface and surrounding terminal areas. Using historical data, a representative 30-minute traffic schedule and aircraft mix scenario is constructed. The results of the Los Angeles application show that the integration of air and ground operations and the use of a time-based separation strategy enable both significant surface and air time savings. The solution computed by the optimization provides a more efficient routing and scheduling than the First-Come-First-Served solution. 
Additionally, a data-driven analysis is
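The sample average approximation approach used in this dissertation can be sketched on a deliberately tiny, hypothetical one-flight example. The cost function, penalty weight, slot time, and taxi-time distribution below are all invented for illustration; the idea is only that the expectation in the stochastic objective is replaced by a mean over a finite scenario sample, and the resulting deterministic surrogate is optimised.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy decision: choose a pushback time t (min) for one departure.
# Taxi time xi is random; cost = runway waiting if early, penalised delay if late.
def cost(t, xi, slot=30.0):
    arrival = t + xi
    return np.where(arrival <= slot, slot - arrival, 5.0 * (arrival - slot))

# Sample average approximation: replace E[cost] by the mean over a finite
# sample of taxi-time scenarios, then optimise the deterministic surrogate.
xi = rng.normal(15.0, 3.0, 10_000)          # scenario sample
candidates = np.linspace(0.0, 30.0, 301)
saa_obj = [cost(t, xi).mean() for t in candidates]
t_best = candidates[int(np.argmin(saa_obj))]
print(f"SAA-optimal pushback time ~ {t_best:.1f} min")
```

With the 5:1 lateness penalty, the optimiser pushes back early enough that roughly 5/6 of the scenarios arrive before the slot, which is the usual critical-ratio behaviour of such piecewise-linear costs.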

  12. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. 
The FIDUCEO project (www.fiduceo.eu) is
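The scale-dependence described above, where error effects negligible per pixel dominate large-scale averages, can be made concrete with a hypothetical split of a per-datum uncertainty into an independent and a fully correlated component (the numeric values are illustrative only, not from any actual CDR):

```python
import math

# Per-datum standard uncertainty split into an independent (random) effect
# and a fully correlated (systematic) effect; values are invented.
u_indep = 0.30   # K, averages down as 1/sqrt(N)
u_corr  = 0.05   # K, common to all pixels, does not average down

def uncertainty_of_mean(n):
    # Standard propagation for the mean of n data sharing one common effect.
    return math.sqrt(u_indep**2 / n + u_corr**2)

for n in (1, 100, 10_000):
    print(n, round(uncertainty_of_mean(n), 4))
```

For a single pixel the independent effect dominates, but the uncertainty of the mean floors at the correlated component as n grows; without error covariance information in the FCDR, that floor cannot be computed.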

  13. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous and different interpretations are used in the literature. Recently a renewed interest has appeared to reinterpret and reformulate the precise meaning of Heisenberg's principle and to find adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs

  14. Uncertainty as Certainty

    Science.gov (United States)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  15. Orientation and uncertainties

    International Nuclear Information System (INIS)

    Peters, H.P.; Hennen, L.

    1990-01-01

    The authors report on the results of three representative surveys that made a closer inquiry into perceptions and valuations of information and information sources concerning Chernobyl. It turns out that the information sources are generally considered little trustworthy. This was generally attributable to the interpretation of the events being tied to attitudes in the atomic energy issue. The greatest credit was given to television broadcasting. The authors summarize their discourse as follows: There is good reason to interpret the widespread uncertainty after Chernobyl as proof of the fact that large parts of the population are prepared and willing to assume a critical stance towards information and prefer to draw their information from various sources representing different positions. (orig.) [de

  16. DOD ELAP Lab Uncertainties

    Science.gov (United States)

    2012-03-01

    ISO / IEC   17025  Inspection Bodies – ISO / IEC  17020  RMPs – ISO  Guide 34 (Reference...certify to :  ISO  9001 (QMS),  ISO  14001 (EMS),   TS 16949 (US Automotive)  etc. 2 3 DoD QSM 4.2 standard   ISO / IEC   17025 :2005  Each has uncertainty...IPV6, NLLAP, NEFAP  TRAINING Programs  Certification Bodies – ISO / IEC  17021  Accreditation for  Management System 

  17. Traceability and Measurement Uncertainty

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    . The project partnership aims (composed by 7 partners in 5 countries, thus covering a real European spread in high tech production technology) to develop and implement an advanced e-learning system that integrates contributions from quite different disciplines into a user-centred approach that strictly....... Machine tool testing 9. The role of manufacturing metrology for QM 10. Inspection planning 11. Quality management of measurements incl. Documentation 12. Advanced manufacturing measurement technology The present report (which represents the section 2 - Traceability and Measurement Uncertainty – of the e-learning......This report is made as a part of the project ‘Metro-E-Learn: European e-Learning in Manufacturing Metrology’, an EU project under the program SOCRATES MINERVA (ODL and ICT in Education), Contract No: 101434-CP-1-2002-1-DE-MINERVA, coordinated by Friedrich-Alexander-University Erlangen...

  18. Decision making under uncertainty

    International Nuclear Information System (INIS)

    Cyert, R.M.

    1989-01-01

    This paper reports on ways of improving the reliability of products and systems in this country if we are to survive as a first-rate industrial power. The use of statistical techniques has, since the 1920s, been viewed as one of the methods for testing quality and estimating the level of quality in a universe of output. Statistical quality control is not relevant, generally, to improving systems in an industry like yours, but certainly the use of probability concepts is of significance. In addition, when it is recognized that part of the problem involves making decisions under uncertainty, it becomes clear that techniques such as sequential decision making and Bayesian analysis become major methodological approaches that must be utilized

  19. Sustainability and uncertainty

    DEFF Research Database (Denmark)

    Jensen, Karsten Klint

    2007-01-01

    The widely used concept of sustainability is seldom precisely defined, and its clarification involves making up one's mind about a range of difficult questions. One line of research (bottom-up) takes sustaining a system over time as its starting point and then infers prescriptions from...... this requirement. Another line (top-down) takes an economical interpretation of the Brundtland Commission's suggestion that the present generation's needsatisfaction should not compromise the need-satisfaction of future generations as its starting point. It then measures sustainability at the level of society...... a clarified ethical goal, disagreements can arise. At present we do not know what substitutions will be possible in the future. This uncertainty clearly affects the prescriptions that follow from the measure of sustainability. Consequently, decisions about how to make future agriculture sustainable...

  20. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  1. Essays on model uncertainty in financial models

    NARCIS (Netherlands)

    Li, Jing

    2018-01-01

    This dissertation studies model uncertainty, particularly in financial models. It consists of two empirical chapters and one theoretical chapter. The first empirical chapter (Chapter 2) classifies model uncertainty into parameter uncertainty and misspecification uncertainty. It investigates the

  2. A new uncertainty importance measure

    International Nuclear Information System (INIS)

    Borgonovo, E.

    2007-01-01

    Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can be defined also in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures
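A moment-independent indicator of this kind can be sketched by a crude Monte Carlo estimate: half the expected L1 distance between the unconditional output density and the density conditional on slices of one input, approximated with histograms. The toy model, sample sizes, and slicing scheme below are all invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: Y depends strongly on X1, weakly on X2.
def model(x1, x2):
    return 4.0 * x1 + 0.25 * x2

N = 200_000
x1 = rng.normal(size=N)
x2 = rng.normal(size=N)
y = model(x1, x2)

bins = np.linspace(y.min(), y.max(), 60)

def delta(x, y, n_cond=20):
    """Crude histogram estimate of the moment-independent importance of x:
    half the expected L1 distance between the unconditional density of y
    and its density conditional on quantile slices of x."""
    f_y, _ = np.histogram(y, bins=bins, density=True)
    width = np.diff(bins)
    edges = np.quantile(x, np.linspace(0, 1, n_cond + 1))
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x <= hi)
        f_cond, _ = np.histogram(y[mask], bins=bins, density=True)
        total += np.sum(np.abs(f_cond - f_y) * width)
    return 0.5 * total / n_cond

d1, d2 = delta(x1, y), delta(x2, y)
print(f"delta(X1) ~ {d1:.2f}, delta(X2) ~ {d2:.2f}")
```

Because the indicator compares whole distributions rather than a single moment, the dominant input X1 scores high even though no variance decomposition is performed; a variance-based measure would rank these two inputs the same way here, but the two approaches can disagree for skewed or heavy-tailed outputs.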

  3. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes...... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main...... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  4. Additivity of entropic uncertainty relations

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2018-03-01

    We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
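The Maassen and Uffink inequality mentioned above can be checked numerically in the simplest setting. As a minimal sketch (state and bases chosen for illustration), for a qubit measured in the computational and Hadamard bases the bound is H(Z) + H(X) >= -2 log2 c, where c is the largest overlap between basis vectors:

```python
import numpy as np

# Qubit state |psi> and two projective measurements: the computational
# basis {|0>,|1>} and the Hadamard basis {|+>,|->} (rows are basis vectors).
psi = np.array([1.0, 0.0])                       # |0>
basis_z = np.eye(2)
basis_x = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

def shannon(p):
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

def probs(basis, state):
    # Born rule: probability of each outcome is |<b_i|psi>|^2.
    return np.abs(basis.conj() @ state) ** 2

pz = probs(basis_z, psi)   # (1, 0)   -> H = 0
px = probs(basis_x, psi)   # (.5, .5) -> H = 1

# Maassen-Uffink bound: c = max_ij |<z_i|x_j>| = 1/sqrt(2), so bound = 1.
c = np.abs(basis_z.conj() @ basis_x.T).max()
bound = -2 * np.log2(c)

total = shannon(pz) + shannon(px)
print(f"H(Z) + H(X) = {total:.3f} >= {bound:.3f}")
```

For these mutually unbiased bases the state |0> saturates the bound exactly, which makes it a convenient sanity check for the equality case.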

  5. Light-Driven Alignment

    CERN Document Server

    Antonyuk, Boris P

    2009-01-01

    This book deals with influencing the properties of solids by light-driven electron transport. The theoretical basis of these effects, light-driven ordering and self-organisation, as well as optical motors are presented. With light as a tool, new ways to produce materials are opened.

  6. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.

  7. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib; Galassi, R. Malpica; Valorani, M.

    2016-01-01

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  9. The Uncertainty of Measurement Results

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)

  10. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of ¹³⁷Cs and ¹³¹I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition

  11. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty in Measurement (GUM). Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.
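The core GUM mechanics referred to here are combination of independent standard uncertainty components in quadrature and expansion with a coverage factor. A minimal sketch follows; the component names and percentage values are invented for illustration and do not come from the paper's radiometer budget.

```python
import math

# Illustrative uncertainty budget for a radiometer calibration factor
# (component names and values are invented for this sketch).
components = {
    "reference irradiance": 0.35,   # %, from the reference's certificate
    "repeatability":        0.10,   # %, s / sqrt(n) of repeated readings
    "temperature effect":   0.15,   # %, rectangular bound / sqrt(3)
    "data logger":          0.05,   # %, from its specification
}

# GUM: combine independent standard uncertainties in quadrature,
# then expand with coverage factor k = 2 (approximately 95 % coverage).
u_c = math.sqrt(sum(u**2 for u in components.values()))
U = 2.0 * u_c
print(f"combined standard uncertainty: {u_c:.3f} %")
print(f"expanded uncertainty (k=2):    {U:.3f} %")
```

Note how the quadrature sum is dominated by the largest component; shrinking the small contributors barely changes the total, which is why uncertainty budgets focus effort on the leading terms.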

  12. Uncertainty quantification in resonance absorption

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2012-01-01

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances in ²³²Th as given by . Simulation, quadrature and polynomial chaos methods are used and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are pdfs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.

  13. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
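The single-loop idea described above, drawing the epistemic quantity (here an uncertain distribution parameter) and the aleatory quantity in the same Monte Carlo loop rather than nesting two loops, can be sketched on a toy limit state. The limit state, distributions, and numbers below are invented for illustration and are not the paper's engineering examples.

```python
import numpy as np

rng = np.random.default_rng(2)

# Limit state: failure when the load S exceeds the capacity R, i.e. R - S < 0.
# Aleatory: S ~ Normal(mu_S, 1).  Epistemic: mu_S itself is uncertain due to
# sparse data, modelled here as Normal(5, 0.5).  R is taken as fixed at 9.
N = 500_000
mu_S = rng.normal(5.0, 0.5, N)       # epistemic draw (parameter uncertainty)
S = rng.normal(mu_S, 1.0)            # aleatory draw in the SAME loop
R = 9.0

pf = np.mean(R - S < 0.0)            # failure probability, both sources mixed
print(f"failure probability ~ {pf:.2e}")
```

Because each sample draws the parameter and then the variable conditional on it, one pass over N samples integrates over both uncertainty types at once; a nested (double-loop) scheme would instead re-run the full aleatory simulation for every epistemic draw.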

  14. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  15. Climate Certainties and Uncertainties

    International Nuclear Information System (INIS)

    Morel, Pierre

    2012-01-01

    In issue 380 of Futuribles in December 2011, Antonin Pottier analysed in detail the workings of what is today termed 'climate scepticism' - namely the propensity of certain individuals to contest the reality of climate change on the basis of pseudo-scientific arguments. He emphasized particularly that what fuels the debate on climate change is, largely, the degree of uncertainty inherent in the consequences to be anticipated from observation of the facts, not the description of the facts itself. In his view, the main aim of climate sceptics is to block the political measures for combating climate change. However, since they do not admit to this political posture, they choose instead to deny the scientific reality. This month, Futuribles complements this socio-psychological analysis of climate-sceptical discourse with an - in this case, wholly scientific - analysis of what we know (or do not know) about climate change on our planet. Pierre Morel gives a detailed account of the state of our knowledge in the climate field and what we are able to predict in the medium/long-term. After reminding us of the influence of atmospheric meteorological processes on the climate, he specifies the extent of global warming observed since 1850 and the main origin of that warming, as revealed by the current state of knowledge: the increase in the concentration of greenhouse gases. He then describes the changes in meteorological regimes (showing also the limits of climate simulation models), the modifications of hydrological regimes, and also the prospects for rises in sea levels. He also specifies the mechanisms that may potentially amplify all these phenomena and the climate disasters that might ensue. Lastly, he shows what are the scientific data that cannot be disregarded, the consequences of which are now inescapable (melting of the ice-caps, rises in sea level etc.), the only remaining uncertainty in this connection being the date at which these things will happen. 'In this

  16. Bureaucratic Politics and Decision Making under Uncertainty in a National Security Crisis: Assessing the Effects of International Relations Theory and the Learning Impact of Role-Playing Simulation at the U.S. Naval Academy

    Science.gov (United States)

    Biziouras, Nikolaos

    2013-01-01

    Using a pre-/posttest research design, this article measures the learning impact of active-learning techniques such as role-playing simulations in an international relations course. Using the students' different responses to the pre- and postsimulation surveys in a quasi-experimental design whereby two sections that were taught by the same…

  17. Planning ATES systems under uncertainty

    Science.gov (United States)

    Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin

    2015-04-01

    Aquifer Thermal Energy Storage (ATES) can contribute to significant reductions in energy use within the built environment, by providing seasonal energy storage in aquifers for the heating and cooling of buildings. ATES systems have experienced a rapid uptake over the last two decades; however, despite successful experiments at the individual level, the overall performance of ATES systems remains below expectations - largely due to suboptimal practices for the planning and operation of systems in urban areas. The interaction between ATES systems and underground aquifers can be interpreted as a common-pool resource problem, in which thermal imbalances or interference could eventually degrade the storage potential of the subsurface. Current planning approaches for ATES systems thus typically follow the precautionary principle. For instance, the permitting process in the Netherlands is intended to minimize thermal interference between ATES systems. However, as shown in recent studies (Sommer et al., 2015; Bakr et al., 2013), a controlled amount of interference may benefit the collective performance of ATES systems. An overly restrictive approach to permitting is instead likely to create an artificial scarcity of available space, limiting the potential of the technology in urban areas. In response, master plans - which take into account the collective arrangement of multiple systems - have emerged as an increasingly popular alternative. However, permits and master plans both take a static, ex ante view of ATES governance, making it difficult to predict the effect of evolving ATES use or climatic conditions on overall performance. In particular, the adoption of new systems by building operators is likely to be driven by the available subsurface space and by the performance of existing systems; these outcomes are themselves a function of planning parameters. From this perspective, the interactions between planning authorities, ATES operators, and subsurface conditions

  18. The Uncertainty Test for the MAAP Computer Code

    International Nuclear Information System (INIS)

    Park, S. H.; Song, Y. M.; Park, S. Y.; Ahn, K. I.; Kim, K. R.; Lee, Y. J.

    2008-01-01

    After the Three Mile Island Unit 2 (TMI-2) and Chernobyl accidents, safety issues for severe accidents have been treated from various angles. A major issue in our research area is the Level 2 PSA. The main difficulty in extending the Level 2 PSA as a risk-informed activity is uncertainty. In the past, effort was concentrated on improving the quality of internal-event PSAs, but too little has been done to reduce the phenomenological uncertainty in the Level 2 PSA. In our country, the degree of uncertainty in Level 2 PSA models is high, and a means of reducing that uncertainty needs to be secured. We have not yet developed our own uncertainty-assessment technology; the assessment systems in use depend on those of advanced countries. In advanced countries, severe-accident simulators are implemented at the hardware level, whereas in our case only basic functions at the software level can be implemented. Under these circumstances, similar systems at home and abroad, such as UQM and MELCOR, were surveyed. Drawing on these examples, the SAUNA (Severe Accident UNcertainty Analysis) system is being developed in our project to assess and reduce the uncertainty in a Level 2 PSA. The MAAP code was selected for analyzing the uncertainty in a severe accident.

  19. Assessing concentration uncertainty estimates from passive microwave sea ice products

    Science.gov (United States)

    Meier, W.; Brucker, L.; Miller, J. A.

    2017-12-01

    Sea ice concentration is an essential climate variable and passive microwave derived estimates of concentration are one of the longest satellite-derived climate records. However, until recently uncertainty estimates were not provided. Numerous validation studies provided insight into general error characteristics, but the studies have found that concentration error varied greatly depending on sea ice conditions. Thus, an uncertainty estimate from each observation is desired, particularly for initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.

  20. Sketching Uncertainty into Simulations.

    Science.gov (United States)

    Ribicic, H; Waser, J; Gurbat, R; Sadransky, B; Groller, M E

    2012-12-01

    In a variety of application areas, the use of simulation steering in decision making is limited at best. Research focusing on this problem suggests that most user interfaces are too complex for the end user. Our goal is to let users create and investigate multiple, alternative scenarios without the need for special simulation expertise. To simplify the specification of parameters, we move from a traditional manipulation of numbers to a sketch-based input approach. Users steer both numeric parameters and parameters with a spatial correspondence by sketching a change onto the rendering. Special visualizations provide immediate visual feedback on how the sketches are transformed into boundary conditions of the simulation models. Since uncertainty with respect to many intertwined parameters plays an important role in planning, we also allow the user to intuitively setup complete value ranges, which are then automatically transformed into ensemble simulations. The interface and the underlying system were developed in collaboration with experts in the field of flood management. The real-world data they have provided has allowed us to construct scenarios used to evaluate the system. These were presented to a variety of flood response personnel, and their feedback is discussed in detail in the paper. The interface was found to be intuitive and relevant, although a certain amount of training might be necessary.

  1. Uncertainty vs. Information (Invited)

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact yields essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.

  2. Pandemic influenza: certain uncertainties

    Science.gov (United States)

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying such constant (if incompletely understood) rules as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  3. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
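The claim that weak but pervasive dependence inflates the uncertainty of big-data conclusions can be illustrated with a short simulation. This is a sketch with synthetic equicorrelated data, not the paper's own analysis; the correlation level (ρ = 0.2) and sample sizes are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 1000, 2000
rho = 0.2   # weak but pervasive correlation shared by all observations

# Equicorrelated samples: x_i = sqrt(rho)*z + sqrt(1-rho)*e_i, where the
# common factor z couples every observation in a dataset.
z = rng.normal(size=(trials, 1))
e = rng.normal(size=(trials, n))
x_dep = np.sqrt(rho) * z + np.sqrt(1 - rho) * e
x_ind = rng.normal(size=(trials, n))        # fully independent comparison

# Variance of the sample mean across repeated datasets.
var_dep = x_dep.mean(axis=1).var()
var_ind = x_ind.mean(axis=1).var()
print(f"Var(mean), independent:        {var_ind:.5f}  (theory 1/n = {1/n:.5f})")
print(f"Var(mean), equicorrelated:     {var_dep:.5f}  (theory rho + (1-rho)/n = {rho + (1 - rho)/n:.5f})")
```

The independent mean's variance shrinks like 1/n, while under equicorrelation it never drops below ρ no matter how large n grows, which is the "larger degree of randomness" the abstract refers to.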

  4. Correlated quadratures of resonance fluorescence and the generalized uncertainty relation

    Science.gov (United States)

    Arnoldus, Henk F.; George, Thomas F.; Gross, Rolf W. F.

    1994-01-01

    Resonance fluorescence from a two-state atom has been predicted to exhibit quadrature squeezing below the Heisenberg uncertainty limit, provided that the optical parameters (Rabi frequency, detuning, laser linewidth, etc.) are chosen carefully. When the correlation between two quadratures of the radiation field does not vanish, however, the Heisenberg limit for quantum fluctuations might be an unrealistic lower bound. A generalized uncertainty relation, due to Schrödinger, takes into account the possible correlation between the quadrature components of the radiation, and it suggests a modified definition of squeezing. We show that the coherence between the two levels of a laser-driven atom is responsible for the correlation between the quadrature components of the emitted fluorescence, and that the Schrödinger uncertainty limit increases monotonically with the coherence. On the other hand, the fluctuations in the quadrature field diminish with an increasing coherence, and can disappear completely when the coherence reaches 1/2, provided that certain phase relations hold.
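The generalized uncertainty relation referred to here is, in its standard textbook (Robertson–Schrödinger) form for two observables A and B (this form is quoted for orientation, not taken from the article itself):

```latex
(\Delta A)^2 (\Delta B)^2 \;\ge\;
\left( \tfrac{1}{2}\langle \hat{A}\hat{B} + \hat{B}\hat{A} \rangle
       - \langle \hat{A}\rangle\langle \hat{B}\rangle \right)^{2}
+ \left( \tfrac{1}{2i}\langle [\hat{A},\hat{B}] \rangle \right)^{2}
```

The first term is the covariance (correlation) between the two observables; when it vanishes the relation reduces to the Heisenberg bound, and a nonvanishing quadrature correlation raises the lower bound, which is the effect the abstract attributes to the atomic coherence.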

  5. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad-hoc manner; however, if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality, then knowledge of this uncertainty enables optimal exploitation of the observations in further processes, or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user-contributed weather data, where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards, including SWE Common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation and usage within existing standards and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.

  6. Risk and uncertainty.

    Science.gov (United States)

    Carter, Tony

    2010-01-01

    Volatile business conditions have led to drastic corporate downsizing, meaning organizations are expected to do more with less. Managers must be more knowledgeable and possess a more eclectic myriad of business skills, many of which have not even been seen until recently. Many internal and external changes have occurred to organizations that have dictated the need to do business differently. Changes such as technological advances; globalization; catastrophic business crises; a more frantic competitive climate; and more demanding, sophisticated customers are examples of some of the shifts in the external business environment. Internal changes to organizations have been in the form of reengineering, accompanied by structural realignments and downsizing; greater emphasis on quality levels in product and service output; faster communication channels; and a more educated, skilled employee base with higher expectations from management.

  7. Model Driven Engineering

    Science.gov (United States)

    Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan

    A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.

  8. A commentary on model uncertainty

    International Nuclear Information System (INIS)

    Apostolakis, G.

    1994-01-01

    A framework is proposed for the identification of model and parameter uncertainties in risk assessment models. Two cases are distinguished; in the first case, a set of mutually exclusive and exhaustive hypotheses (models) can be formulated, while, in the second, only one reference model is available. The relevance of this formulation to decision making and the communication of uncertainties is discussed

  9. Mama Software Features: Uncertainty Testing

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  10. Designing for Uncertainty: Three Approaches

    Science.gov (United States)

    Bennett, Scott

    2007-01-01

    Higher education wishes to get long life and good returns on its investment in learning spaces. Doing this has become difficult because rapid changes in information technology have created fundamental uncertainties about the future in which capital investments must deliver value. Three approaches to designing for this uncertainty are described…

  11. Wave Energy Converter Annual Energy Production Uncertainty Using Simulations

    Directory of Open Access Journals (Sweden)

    Clayton E. Hiles

    2016-09-01

    Critical to evaluating the economic viability of a wave energy project are: (1) a robust estimate of the electricity production throughout the project lifetime and (2) an understanding of the uncertainty associated with said estimate. Standardization efforts have established mean annual energy production (MAEP) as the metric for quantification of wave energy converter (WEC) electricity production and the performance matrix approach as the appropriate method for calculation. General acceptance of a method for calculating the MAEP uncertainty has not yet been achieved. Several authors have proposed methods based on the standard engineering approach to error propagation; however, a lack of available WEC deployment data has restricted testing of these methods. In this work the magnitude and sensitivity of MAEP uncertainty is investigated. The analysis is driven by data from simulated deployments of 2 WECs of different operating principle at 4 different locations. A Monte Carlo simulation approach is proposed for calculating the variability of MAEP estimates and is used to explore the sensitivity of the calculation. The uncertainty of MAEP ranged from 2%–20% of the mean value. Of the contributing uncertainties studied, the variability in the wave climate was found responsible for most of the uncertainty in MAEP. Uncertainty in MAEP differs considerably between WEC types and between deployment locations and is sensitive to the length of the input data-sets. This implies that if a certain maximum level of uncertainty in MAEP is targeted, the minimum required lengths of the input data-sets will be different for every WEC-location combination.
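A Monte Carlo estimate of MAEP variability of the kind described can be sketched in a few lines. Everything below is an illustrative assumption rather than the authors' setup: the power matrix values, the bin edges, and the synthetic gamma/normal wave climate are invented, and years of climate are bootstrapped to mimic wave-climate variability:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical WEC performance matrix: mean power output (kW) binned by
# significant wave height Hs (rows) and energy period Te (columns).
hs_bins = np.array([0.5, 1.5, 2.5, 3.5])        # m (bin lower edges)
te_bins = np.array([5.0, 7.0, 9.0, 11.0])       # s (bin lower edges)
power_matrix = np.array([[ 20.0,  35.0,  30.0,  20.0],
                         [ 80.0, 140.0, 120.0,  90.0],
                         [180.0, 300.0, 260.0, 200.0],
                         [250.0, 420.0, 380.0, 300.0]])   # kW

def annual_energy(hs, te):
    """AEP (MWh) for one year of hourly sea states via the performance matrix."""
    i = np.clip(np.searchsorted(hs_bins, hs) - 1, 0, len(hs_bins) - 1)
    j = np.clip(np.searchsorted(te_bins, te) - 1, 0, len(te_bins) - 1)
    return power_matrix[i, j].sum() / 1000.0     # kW over 1-h steps -> MWh

# Synthetic 10-year hourly wave climate (a stand-in for a measured record).
n_years, hours = 10, 8760
hs_record = rng.gamma(shape=2.0, scale=0.8, size=(n_years, hours))
te_record = rng.normal(loc=8.0, scale=1.5, size=(n_years, hours))
aep = np.array([annual_energy(hs_record[y], te_record[y]) for y in range(n_years)])

# Monte Carlo: bootstrap whole years of climate to build the MAEP distribution.
n_trials = 2000
boot = rng.integers(0, n_years, size=(n_trials, n_years))  # resampled year indices
maep_samples = aep[boot].mean(axis=1)

mean = maep_samples.mean()
rel_unc = maep_samples.std(ddof=1) / mean
print(f"MAEP ~ {mean:.0f} MWh/yr, relative uncertainty ~ {rel_unc:.1%}")
```

Resampling whole years (rather than individual hours) preserves within-year seasonality, which is why the wave-climate contribution dominates exactly as the abstract reports.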

  12. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component of software evolution. It requires a huge amount of time and manpower as well as financial resources. The challenges are the size, seniority and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  13. Uncertainty in Measurement: A Review of Monte Carlo Simulation Using Microsoft Excel for the Calculation of Uncertainties Through Functional Relationships, Including Uncertainties in Empirically Derived Constants

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-01-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional
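The spreadsheet MCS procedure described above can be mirrored in a few lines of code: draw pseudo-random values for each input quantity (and for the empirically derived 'constant') from its assigned distribution, push them through the functional relationship, and read the output uncertainty off the resulting distribution. The relationship y = k·c/v and all values and standard deviations below are illustrative assumptions, not the article's worked example:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000   # number of Monte Carlo trials

# Measurand derived through a functional relationship y = k * c / v, where
# k is an empirically derived 'constant' with its own uncertainty.
# Normal (Gaussian) distributions are used, as is typical when IQC data
# provide the uncertainty estimates; all numbers are invented.
c = rng.normal(5.00, 0.10, N)    # input quantity c: mean 5.00, SD 0.10
v = rng.normal(2.00, 0.05, N)    # input quantity v: mean 2.00, SD 0.05
k = rng.normal(1.25, 0.02, N)    # empirical constant k: mean 1.25, SD 0.02

y = k * c / v                    # propagate the simulated variations

print(f"y = {y.mean():.3f} with standard uncertainty u(y) = {y.std(ddof=1):.3f}")
```

For this multiplicative relationship the result agrees with the GUM first-order formula u(y)/y ≈ sqrt((0.10/5.00)² + (0.05/2.00)² + (0.02/1.25)²) ≈ 3.6%, but the MCS route needs no partial derivatives and works unchanged for non-Gaussian inputs.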

  14. Uncertainty in measurement: a review of Monte Carlo simulation using Microsoft Excel for the calculation of uncertainties through functional relationships, including uncertainties in empirically derived constants.

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-02-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value. 
Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship

  15. The factualization of uncertainty:

    DEFF Research Database (Denmark)

    Meyer, G.; Folker, A.P.; Jørgensen, R.B.

    2005-01-01

    Mandatory risk assessment is intended to reassure concerned citizens and introduce reason into the heated European controversies on genetically modified crops and food. The authors, examining a case of risk assessment of genetically modified oilseed rape, claim that the new European legislation...... on risk assessment does nothing of the sort and is not likely to present an escape from the international deadlock on the use of genetic modification in agriculture and food production. The new legislation is likely to stimulate the kind of emotive reactions it was intended to prevent. In risk assessment...

  16. Discovery Driven Growth

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj

    2009-01-01

    Review of Discovery Driven Growth: A breakthrough process to reduce risk and seize opportunity, by Rita G. McGrath & Ian C. MacMillan, Boston: Harvard Business Press. Publication date: 14 August...

  17. Measurement uncertainty: Friend or foe?

    Science.gov (United States)

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratory) may know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to the quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)
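The mixture-model idea for a failure intensity can be sketched as a Poisson–gamma conjugate toy example: each candidate model contributes a prior for the failure rate, the observed evidence reweights the models via their marginal likelihoods, and the mixture posterior carries both parameter and model uncertainty. The priors, weights, and data below are invented for illustration and are not the report's case studies:

```python
import math
import numpy as np

def negbin_pmf(k, r, p):
    """Negative-binomial pmf: marginal P(k failures) for a Poisson count whose
    rate has a Gamma(shape=r, scale=s) prior, with p = 1/(1 + s*T)."""
    log_pmf = (math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
               + r * math.log(p) + k * math.log(1.0 - p))
    return math.exp(log_pmf)

# Two candidate gamma priors (shape a, scale s) for a failure rate (per hour);
# the mixture weights encode the analyst's confidence in each model.
priors = [(2.0, 1 / 2000.0),   # model 1: prior mean a*s = 1.0e-3 /h
          (4.0, 1 / 1000.0)]   # model 2: prior mean 4.0e-3 /h
prior_w = np.array([0.6, 0.4])

n, T = 3, 2000.0               # evidence: n failures over T hours of exposure

# Bayes update of the model weights via each model's marginal likelihood;
# the conjugate posterior for the rate under each model is
# Gamma(a + n, scale = s / (1 + s*T)).
ml = np.array([negbin_pmf(n, a, 1.0 / (1.0 + s * T)) for a, s in priors])
post_w = prior_w * ml / np.dot(prior_w, ml)
post_mean = np.array([(a + n) * s / (1.0 + s * T) for a, s in priors])
lambda_hat = float(np.dot(post_w, post_mean))   # mixture posterior mean rate

print("posterior model weights:", np.round(post_w, 3))
print(f"mixture posterior mean failure rate: {lambda_hat:.2e} per hour")
```

Note that the data update both the within-model rate estimates and the between-model weights; the spread of the mixture components is one way the report's "model uncertainty" shows up alongside ordinary parameter uncertainty.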

  19. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to the quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  20. The cerebellum and decision making under uncertainty.

    Science.gov (United States)

    Blackwood, Nigel; Ffytche, Dominic; Simmons, Andrew; Bentall, Richard; Murray, Robin; Howard, Robert

    2004-06-01

    This study aimed to identify the neural basis of probabilistic reasoning, a type of inductive inference that aids decision making under conditions of uncertainty. Eight normal subjects performed two separate two-alternative-choice tasks (the balls in a bottle and personality survey tasks) while undergoing functional magnetic resonance imaging (fMRI). The experimental conditions within each task were chosen so that they differed only in their requirement to make a decision under conditions of uncertainty (probabilistic reasoning and frequency determination required) or under conditions of certainty (frequency determination required). The same visual stimuli and motor responses were used in the experimental conditions. We provide evidence that the neo-cerebellum, in conjunction with the premotor cortex, inferior parietal lobule and medial occipital cortex, mediates the probabilistic inferences that guide decision making under uncertainty. We hypothesise that the neo-cerebellum constructs internal working models of uncertain events in the external world, and that such probabilistic models subserve the predictive capacity central to induction. Copyright 2004 Elsevier B.V.

  1. Enterprise strategic development under conditions of uncertainty

    Directory of Open Access Journals (Sweden)

    O.L. Truhan

    2016-09-01

    Full Text Available The author points out the need for research into enterprise strategic development under conditions of increased dynamism and uncertainty in the external environment. Under external uncertainty, it is reasonable to conduct the strategic planning of entities using organizational life-cycle models and planning on the basis of disclosure. Any organization has to respond flexibly to external challenges, applying knowledge of its own business model of development and its ability to mobilize internal reserves. The article determines that, in long-term business planning, managers use traditional approaches based on familiar facts and on the assumption that present tendencies will not change essentially in the future. When planning a new, risky business, however, one has to act while assumptions predominate over knowledge. The author argues that under such conditions a powerful tool for enterprise strategic development is the well-known approach of “planning on the basis of disclosure”. The suggested approach accounts for the numerous uncertainty factors of the external environment, making the strategic planning process maximally adaptable to the conditions of venture business development.

  2. The Approach of Blended Learning to cope with E and T Needs in the Nuclear Engineering Field in an International Environment. The experience of the Design and Implementation of a Distance Pilot Course on Accelerator Driven Systems within FP7 ENEN III Project Framework

    Energy Technology Data Exchange (ETDEWEB)

    Alonso, M.; Gonzalez, E. M.; Sanz, J.; Ogando, F.; Sanchez-Elvira, A.

    2013-07-01

    These days, Education and Training (E&T) worldwide is moving towards the design of a balanced combination of face-to-face and distance teaching, taking advantage of new Information and Communication Technologies (ICT) tools, in what we know as blended learning. Our University has been devoted to blended learning for 41 years. Thus, our participation in the FP7 ENEN III project gave us the opportunity to offer distance teaching and learning for international E&T in the nuclear field, drawing on UNED's long experience. The development of the ENEN III Training Schemes (TS) highlighted a significant lack of international courses in TS-D: Concepts and Design of GEN IV nuclear reactors. Additionally, no distance course was offered. Our long UNED-CIEMAT collaboration on Accelerator Driven Systems (ADS) and the support of our Instituto Universitario de Educacion a Distancia (IUED), experts in online teaching and learning, moved us to develop the full-distance international course Accelerator Driven Systems for advanced nuclear waste transmutation within the project framework.

  3. The Approach of Blended Learning to cope with E and T Needs in the Nuclear Engineering Field in an International Environment. The experience of the Design and Implementation of a Distance Pilot Course on Accelerator Driven Systems within FP7 ENEN III Project Framework

    International Nuclear Information System (INIS)

    Alonso, M.; Gonzalez, E. M.; Sanz, J.; Ogando, F.; Sanchez-Elvira, A.

    2013-01-01

    These days, Education and Training (E&T) worldwide is moving towards the design of a balanced combination of face-to-face and distance teaching, taking advantage of new Information and Communication Technologies (ICT) tools, in what we know as blended learning. Our University has been devoted to blended learning for 41 years. Thus, our participation in the FP7 ENEN III project gave us the opportunity to offer distance teaching and learning for international E&T in the nuclear field, drawing on UNED's long experience. The development of the ENEN III Training Schemes (TS) highlighted a significant lack of international courses in TS-D: Concepts and Design of GEN IV nuclear reactors. Additionally, no distance course was offered. Our long UNED-CIEMAT collaboration on Accelerator Driven Systems (ADS) and the support of our Instituto Universitario de Educacion a Distancia (IUED), experts in online teaching and learning, moved us to develop the full-distance international course Accelerator Driven Systems for advanced nuclear waste transmutation within the project framework.

  4. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.

  5. Decision-making under great uncertainty

    International Nuclear Information System (INIS)

    Hansson, S.O.

    1992-01-01

    Five types of decision-uncertainty are distinguished: uncertainty of consequences, of values, of demarcation, of reliance, and of co-ordination. Strategies are proposed for each type of uncertainty. The general conclusion is that it is meaningful for decision theory to treat cases with greater uncertainty than the textbook case of 'decision-making under uncertainty'. (au)

  6. Vector network analyzer (VNA) measurements and uncertainty assessment

    CERN Document Server

    Shoaib, Nosherwan

    2017-01-01

    This book describes vector network analyzer measurements and uncertainty assessments, particularly in waveguide test-set environments, in order to establish their compatibility to the International System of Units (SI) for accurate and reliable characterization of communication networks. It proposes a fully analytical approach to measurement uncertainty evaluation, while also highlighting the interaction and the linear propagation of different uncertainty sources to compute the final uncertainties associated with the measurements. The book subsequently discusses the dimensional characterization of waveguide standards and the quality of the vector network analyzer (VNA) calibration techniques. The book concludes with an in-depth description of the novel verification artefacts used to assess the performance of the VNAs. It offers a comprehensive reference guide for beginners to experts, in both academia and industry, whose work involves the field of network analysis, instrumentation and measurements.

  7. Probabilistic Accident Consequence Uncertainty - A Joint CEC/USNRC Study

    International Nuclear Information System (INIS)

    Gregory, Julie J.; Harper, Frederick T.

    1999-01-01

    The joint USNRC/CEC consequence uncertainty study was chartered after the development of two new probabilistic accident consequence codes, MACCS in the U.S. and COSYMA in Europe. Both the USNRC and CEC had a vested interest in expanding the knowledge base of the uncertainty associated with consequence modeling, and teamed up to co-sponsor a consequence uncertainty study. The information acquired from the study was expected to provide understanding of the strengths and weaknesses of current models as well as a basis for direction of future research. This paper looks at the elicitation process implemented in the joint study and discusses some of the uncertainty distributions provided by eight panels of experts from the U.S. and Europe that were convened to provide responses to the elicitation. The phenomenological areas addressed by the expert panels include atmospheric dispersion and deposition, deposited material and external doses, food chain, early health effects, late health effects and internal dosimetry

  8. Probabilistic Accident Consequence Uncertainty - A Joint CEC/USNRC Study

    Energy Technology Data Exchange (ETDEWEB)

    Gregory, Julie J.; Harper, Frederick T.

    1999-07-28

    The joint USNRC/CEC consequence uncertainty study was chartered after the development of two new probabilistic accident consequence codes, MACCS in the U.S. and COSYMA in Europe. Both the USNRC and CEC had a vested interest in expanding the knowledge base of the uncertainty associated with consequence modeling, and teamed up to co-sponsor a consequence uncertainty study. The information acquired from the study was expected to provide understanding of the strengths and weaknesses of current models as well as a basis for direction of future research. This paper looks at the elicitation process implemented in the joint study and discusses some of the uncertainty distributions provided by eight panels of experts from the U.S. and Europe that were convened to provide responses to the elicitation. The phenomenological areas addressed by the expert panels include atmospheric dispersion and deposition, deposited material and external doses, food chain, early health effects, late health effects and internal dosimetry.

  9. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  10. Relational uncertainty in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2017-01-01

    in service dyads and how they resolve it through suitable organisational responses to increase the level of service quality. Design/methodology/approach: We apply the overall logic of Organisational Information-Processing Theory (OIPT) and present empirical insights from two industrial case studies collected...... the relational uncertainty increased the functional quality while resolving the partner’s organisational uncertainty increased the technical quality of the delivered service. Originality: We make two contributions. First, we introduce relational uncertainty to the OM literature as the inability to predict...... and explain the actions of a partnering organisation due to a lack of knowledge about their abilities and intentions. Second, we present suitable organisational responses to relational uncertainty and their effect on service quality....

  11. Advanced LOCA code uncertainty assessment

    International Nuclear Information System (INIS)

    Wickett, A.J.; Neill, A.P.

    1990-11-01

    This report describes a pilot study that identified, quantified and combined uncertainties for the LOBI BL-02 3% small break test. A "dials" version of TRAC-PF1/MOD1, called TRAC-F, was used. (author)

  12. Uncertainty As a Trigger for a Paradigm Change in Science Communication

    Science.gov (United States)

    Schneider, S.

    2014-12-01

    Over the last decade, the need to communicate uncertainty has increased. Climate sciences and environmental sciences have faced massive propaganda campaigns by global industry and astroturf organizations. These organizations exploit the deep societal mistrust of uncertainty to allege unethical and intentional deception of decision makers and the public by scientists in their consultative function. Scientists who openly communicate the uncertainty of climate model calculations, earthquake occurrence frequencies, or possible side effects of genetically modified seeds have to face massive campaigns against their research, and sometimes against their persons and lives as well. Hence, new strategies to communicate uncertainty have to address the societal roots of the misunderstanding of the concept of uncertainty itself. Evolutionary biology has shown that the human mind is well suited to practical decision making through its sensory structures. Therefore, many of the irrational conceptions about uncertainty are mitigated if data are presented in formats the brain is adapted to understand. In the end, the impact of uncertainty on the decision-making process is dominantly driven by preconceptions about terms such as uncertainty, vagueness or probabilities. In parallel with the increasing role of scientific uncertainty in strategic communication, science communicators, for example at the Research and Development Program GEOTECHNOLOGIEN, have developed a number of techniques to master the challenge of putting uncertainty into focus. By raising awareness of scientific uncertainty as a driving force for scientific development and evolution, the public perspective on uncertainty is changing. While first steps to implement this process are under way, the value of uncertainty is still underestimated by the public and in politics. Therefore, science communicators need new and innovative ways to talk about scientific uncertainty.

  13. How to live with uncertainties?

    International Nuclear Information System (INIS)

    Michel, R.

    2012-01-01

    In a short introduction, the problem of uncertainty as a general consequence of incomplete information, as well as the approach to quantifying uncertainty in metrology, is addressed. A short history of the more than 30 years of the working group AK SIGMA is followed by an appraisal of its achievements to date. Then, the potential future of the AK SIGMA is discussed based on its current tasks and on open scientific questions and future topics. (orig.)

  14. Some remarks on modeling uncertainties

    International Nuclear Information System (INIS)

    Ronen, Y.

    1983-01-01

    Several topics related to the question of modeling uncertainties are considered. The first topic is related to the use of the generalized bias operator method for modeling uncertainties. The method is expanded to a more general form of operators. The generalized bias operator is also used in the inverse problem and applied to determine the anisotropic scattering law. The last topic discussed is related to the question of the limit to accuracy and how to establish its value. (orig.)

  15. Uncertainty analysis in safety assessment

    International Nuclear Information System (INIS)

    Lemos, Francisco Luiz de; Sullivan, Terry

    1997-01-01

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, like hydrogeology, meteorology, geochemistry, etc. In addition, the waste disposal facilities are designed to last for a very long period of time. Both of these conditions make safety assessment projections filled with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author)

  16. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result
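
    The "downward" propagation discussed above can be illustrated with a deliberately simple sketch: a hypothetical first-order sensor model whose time constant carries calibration uncertainty, propagated by Monte Carlo to the indicated value. The model and all numbers are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical first-order sensor: y(t) = 1 - exp(-t/tau) for a unit step.
# Calibration yields tau with a standard uncertainty; we propagate it to the
# uncertainty of the indicated value at a fixed read-out time.
tau_mean, tau_std = 0.20, 0.02   # seconds (illustrative)
t_read = 0.5                     # read-out time, seconds

taus = rng.normal(tau_mean, tau_std, size=50_000)
y = 1.0 - np.exp(-t_read / taus)  # Monte Carlo propagation through the model

print(y.mean(), y.std(ddof=1))    # indicated value and its model-driven spread
```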

  17. Optimal Taxation under Income Uncertainty

    OpenAIRE

    Xianhua Dai

    2011-01-01

    Optimal taxation under income uncertainty has been extensively developed in expected utility theory, but it remains an open problem for utility functions that are inseparable between income and effort. As an alternative for decision-making under uncertainty, prospect theory (Kahneman and Tversky (1979), Tversky and Kahneman (1992)) has obtained empirical support, for example, in Kahneman and Tversky (1979) and Camerer and Lowenstein (2003). This paper begins to explore optimal taxation in the context of prospect...

  18. New Perspectives on Policy Uncertainty

    OpenAIRE

    Hlatshwayo, Sandile

    2017-01-01

    In recent years, the ubiquitous and intensifying nature of economic policy uncertainty has made it a popular explanation for weak economic performance in developed and developing markets alike. The primary channel for this effect is decreased and delayed investment as firms adopt a "wait and see" approach to irreversible investments (Bernanke, 1983; Dixit and Pindyck, 1994). Deep empirical examination of policy uncertainty's impact is rare because of the difficulty associated with measuring i...

  19. Pharmacological Fingerprints of Contextual Uncertainty.

    Directory of Open Access Journals (Sweden)

    Louise Marshall

    2016-11-01

    Full Text Available Successful interaction with the environment requires flexible updating of our beliefs about the world. By estimating the likelihood of future events, it is possible to prepare appropriate actions in advance and execute fast, accurate motor responses. According to theoretical proposals, agents track the variability arising from changing environments by computing various forms of uncertainty. Several neuromodulators have been linked to uncertainty signalling, but comprehensive empirical characterisation of their relative contributions to perceptual belief updating, and to the selection of motor responses, is lacking. Here we assess the roles of noradrenaline, acetylcholine, and dopamine within a single, unified computational framework of uncertainty. Using pharmacological interventions in a sample of 128 healthy human volunteers and a hierarchical Bayesian learning model, we characterise the influences of noradrenergic, cholinergic, and dopaminergic receptor antagonism on individual computations of uncertainty during a probabilistic serial reaction time task. We propose that noradrenaline influences learning of uncertain events arising from unexpected changes in the environment. In contrast, acetylcholine balances attribution of uncertainty to chance fluctuations within an environmental context, defined by a stable set of probabilistic associations, or to gross environmental violations following a contextual switch. Dopamine supports the use of uncertainty representations to engender fast, adaptive responses.

  20. Measurements of fusion neutron yields by neutron activation technique: Uncertainty due to the uncertainty on activation cross-sections

    Energy Technology Data Exchange (ETDEWEB)

    Stankunas, Gediminas, E-mail: gediminas.stankunas@lei.lt [Lithuanian Energy Institute, Laboratory of Nuclear Installation Safety, Breslaujos str. 3, LT-44403 Kaunas (Lithuania); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Batistoni, Paola [ENEA, Via E. Fermi, 45, 00044 Frascati, Rome (Italy); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Sjöstrand, Henrik; Conroy, Sean [Department of Physics and Astronomy, Uppsala University, PO Box 516, SE-75120 Uppsala (Sweden); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-07-11

    The neutron activation technique is routinely used in fusion experiments to measure neutron yields. This paper investigates the uncertainty in these measurements due to the uncertainties in dosimetry and activation reactions. For this purpose, activation cross-sections were taken from the International Reactor Dosimetry and Fusion File (IRDFF-v1.05) in 640-group ENDF-6 format for several reactions of interest for both 2.5 and 14 MeV neutrons. Activation coefficients (reaction rates) have been calculated using the neutron flux spectra at the JET vacuum vessel, for both DD and DT plasmas, calculated by MCNP in the required 640-energy-group format. The related uncertainties for the JET neutron spectra are evaluated as well, using the covariance data available in the library. These uncertainties are in general small, but not negligible when high accuracy is required in the determination of fusion neutron yields.
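
    The reaction-rate calculation described above is, in essence, a group-wise folding of activation cross-sections with the flux spectrum, R = Σ_g σ_g φ_g. A toy sketch with made-up four-group data (not IRDFF values, and neglecting the cross-section covariances that the full analysis uses):

```python
import numpy as np

# Illustrative 4-group data; a real calculation uses 640 groups.
sigma = np.array([0.1, 0.5, 1.2, 2.0])        # group cross-sections (barn)
rel_unc = np.array([0.05, 0.03, 0.02, 0.04])  # relative 1-sigma uncertainties
phi = np.array([1e12, 5e12, 2e12, 1e11])      # group fluxes (n/cm^2/s)

# Reaction rate per target atom (in barn-weighted units for this sketch).
R = np.sum(sigma * phi)

# Uncorrelated propagation; covariance terms would add off-diagonal products.
var_R = np.sum((rel_unc * sigma * phi) ** 2)
print(R, np.sqrt(var_R) / R)                  # rate and relative uncertainty
```

Because the groups with the largest σφ products dominate both the rate and its variance, the relative uncertainty of R is typically much smaller than the worst single-group uncertainty.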

  1. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
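
    The equivalence noted above can be sketched as Bayesian model averaging over a finite model set: the model index becomes a discrete parameter whose posterior is proportional to prior times marginal likelihood. All numbers below are illustrative.

```python
import numpy as np

# Two candidate models with equal prior probability P(M_i).
prior = np.array([0.5, 0.5])
# Marginal likelihoods P(data | M_i); hypothetical values for the sketch.
marg_like = np.array([0.012, 0.004])

# Bayes' rule on the discrete model parameter: P(M_i | data).
posterior = prior * marg_like
posterior /= posterior.sum()

# Model-averaged prediction: mix each model's prediction by its posterior.
pred = np.array([3.1, 4.5])
print(posterior, posterior @ pred)
```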

  2. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  3. Challenges for sustainable resource use : Uncertainty, trade and climate policies

    NARCIS (Netherlands)

    Bretschger, L.; Smulders, Sjak A.

    2012-01-01

    We integrate new challenges to thinking about resource markets and sustainable resource use policies in a general framework. The challenges, emerging from six papers that JEEM publishes in a special issue, are (i) demand uncertainty and stockpiling, (ii) international trade and resource dependence,

  4. Psychological Entropy: A Framework for Understanding Uncertainty-Related Anxiety

    Science.gov (United States)

    Hirsh, Jacob B.; Mar, Raymond A.; Peterson, Jordan B.

    2012-01-01

    Entropy, a concept derived from thermodynamics and information theory, describes the amount of uncertainty and disorder within a system. Self-organizing systems engage in a continual dialogue with the environment and must adapt themselves to changing circumstances to keep internal entropy at a manageable level. We propose the entropy model of…

  5. The Relationship of Cultural Similarity, Communication Effectiveness and Uncertainty Reduction.

    Science.gov (United States)

    Koester, Jolene; Olebe, Margaret

    To investigate the relationship of cultural similarity/dissimilarity, communication effectiveness, and communication variables associated with uncertainty reduction theory, a study examined two groups of students--a multinational group living on an "international floor" in a dormitory at a state university and an unrelated group of U.S.…

  6. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    Science.gov (United States)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert

    2016-06-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if other factors are involved. With added experience, uncertainty could be expected to diminish, but perhaps more influential are things like physician confidence, belief in the veracity of what is published, and even one's religious beliefs. In addition, it is plausible that the kind of practice a physician works in can affect the experience of uncertainty. Practicing physicians may not be immediately aware of these effects on how uncertainty is experienced in their clinical decision-making. We asked: (1) Does uncertainty and overconfidence bias decrease with years of practice? (2) What sociodemographic factors are independently associated with less recognition of uncertainty, in particular belief in God or other deity or deities, and how is atheism associated with recognition of uncertainty? (3) Do confidence bias (confidence that one's skill is greater than it actually is), degree of trust in the orthopaedic evidence, and degree of statistical sophistication correlate independently with recognition of uncertainty? We created a survey to establish an overall recognition of uncertainty score (four questions), trust in the orthopaedic evidence base (four questions), confidence bias (three questions), and statistical understanding (six questions). Seven hundred six members of the Science of Variation Group, a collaboration that aims to study variation in the definition and treatment of human illness, were approached to complete our survey. This group represents mainly orthopaedic surgeons specializing in trauma or hand and wrist surgery, practicing in Europe and North America, of whom the majority is involved in teaching. Approximately half of the group has more than 10 years

  7. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of the uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and the sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.
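
    As a minimal illustration of such an uncertainty budget, independent components can be combined in quadrature to a combined standard uncertainty, and then expanded with a coverage factor k = 2 for roughly 95% coverage (GUM-style). The component names and values below are hypothetical.

```python
import math

# Hypothetical NDA uncertainty budget; all values in percent (1-sigma).
components = {
    "counting statistics":   0.8,
    "calibration":           1.5,
    "geometry/positioning":  1.0,
    "matrix correction":     2.0,
}

# Combined standard uncertainty: root-sum-of-squares of independent terms.
u_c = math.sqrt(sum(u * u for u in components.values()))

# Expanded uncertainty at coverage factor k = 2.
U = 2.0 * u_c
print(f"combined: {u_c:.2f}%  expanded (k=2): {U:.2f}%")
```

Listing each component separately, as the quadrature form requires, is exactly the kind of budget breakdown the abstract argues data consumers need.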

  8. Ruminations On NDA Measurement Uncertainty Compared TO DA Uncertainty

    International Nuclear Information System (INIS)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-01-01

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of the uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and the sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  9. Data-driven storytelling

    CERN Document Server

    Hurter, Christophe; Diakopoulos, Nicholas ed.; Carpendale, Sheelagh

    2018-01-01

    This book is an accessible introduction to data-driven storytelling, resulting from discussions between data visualization researchers and data journalists. This book will be the first to define the topic, present compelling examples and existing resources, as well as identify challenges and new opportunities for research.

  10. Pressure Driven Poiseuille Flow

    DEFF Research Database (Denmark)

    Stotz, Ingo Leonardo; Iaffaldano, Giampiero; Davies, D. Rhodri

    2018-01-01

    The Pacific plate is thought to be driven mainly by slab pull, associated with subduction along the Aleutians–Japan, Marianas–Izu–Bonin and Tonga–Kermadec trenches. This implies that viscous flow within the sub–Pacific asthenosphere is mainly generated by overlying plate motion (i.e. Couette flow...
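
As a side illustration of the two flow regimes named in the title and abstract above (not taken from the paper itself), the classical plane Couette and plane Poiseuille velocity profiles between parallel plates can be sketched as follows; all numerical values are arbitrary.

```python
# Illustrative sketch: velocity profiles for plane Couette flow (driven by
# a moving upper plate) and plane Poiseuille flow (driven by a pressure
# gradient) between parallel plates separated by a gap h.

def couette_u(y, h, U):
    """Linear profile: fluid dragged by the upper plate moving at speed U."""
    return U * y / h

def poiseuille_u(y, h, G, mu):
    """Parabolic profile for pressure gradient G = -dp/dx and viscosity mu."""
    return G * y * (h - y) / (2.0 * mu)

h, U, G, mu = 1.0, 2.0, 8.0, 1.0
profile = [poiseuille_u(i * h / 10, h, G, mu) for i in range(11)]
print(max(profile))  # maximum at mid-channel: G*h^2/(8*mu) = 1.0
```

The Poiseuille profile vanishes at both walls and peaks at mid-channel, whereas the Couette profile grows linearly from the stationary to the moving plate; distinguishing these two signatures in the asthenosphere is the point of the study above.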

  11. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    The effects of data uncertainty in applications of the critical loads concept were investigated on different spatial resolutions in Sweden and northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods with different structural complexity were used to estimate the adverse effects of SO{sub 2} concentrations in northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site-specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical load estimates on 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs involved more constrained confidence intervals for a given percentile. Differentiation as well as identification of grid cells on 150 x 150 km resolution subjected to EX was generally improved. Calculation of the probability of EX was shown to preserve the possibility to differentiate between geographical units. Re-aggregation of the 95%-ile EX on 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO{sub 2} concentrations. Modifying SO{sub 2} concentrations by accounting for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability and effects of data uncertainty. Data
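
The Monte Carlo step described above can be sketched in a few lines: propagate assumed input-data uncertainty into an exceedance EX = deposition - CL and read off percentiles of its distribution. All distributions and parameter values below are invented for illustration, not taken from the study.

```python
# Hypothetical Monte Carlo propagation of data uncertainty into a
# critical-load exceedance (illustrative distributions and values only).
import random

random.seed(1)
N = 20000
samples = []
for _ in range(N):
    cl = random.gauss(800.0, 150.0)    # critical load (assumed distribution)
    dep = random.gauss(1000.0, 100.0)  # sulphur deposition (assumed)
    samples.append(dep - cl)           # exceedance EX

samples.sort()
ex_median = samples[N // 2]            # best estimate of EX
ex_95 = samples[int(0.95 * N)]         # 95th percentile of EX
print(ex_median, ex_95)
```

Comparing the median with a high percentile such as the 95th is what drives the "larger deposition reductions" conclusion: a non-exceedance target based on the upper percentile is far stricter than one based on the best estimate.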

  12. Assessment of the uncertainties in the Radiological Protection Institute of Ireland (RPII) radon measurements service

    Energy Technology Data Exchange (ETDEWEB)

    Hanley, O. [Radiological Protection Institute of Ireland, 3 Clonskeagh Square, Clonskeagh Road, Dublin 14 (Ireland)], E-mail: ohanley@rpii.ie; Gutierrez-Villanueva, J.L. [Laboratorio LIBRA, Edificio I-D, Paseo Belen 3, 47011 Valladolid (Spain); Departamento de Fisica Teorica, Atomica y Optica, Facultad de Ciencias, Paseo Prado de la Magdalena, s/n. 47005 Valladolid (Spain)], E-mail: joselg@libra.uva.es; Currivan, L. [Radiological Protection Institute of Ireland, 3 Clonskeagh Square, Clonskeagh Road, Dublin 14 (Ireland)], E-mail: lcurrivan@rpii.ie; Pollard, D. [Radiological Protection Institute of Ireland, 3 Clonskeagh Square, Clonskeagh Road, Dublin 14 (Ireland)], E-mail: dpollard@rpii.ie

    2008-10-15

    The RPII radon (Rn) laboratory holds accreditation for the International Standard ISO/IEC 17025. A requirement of this standard is an estimate of the uncertainty of measurement. This work shows two approaches to estimate the uncertainty. The bottom-up approach involved identifying the components that were found to contribute to the uncertainty. Estimates were made for each of these components, which were combined to give a combined uncertainty of 13.5% at a Rn concentration of approximately 2500 Bq m{sup -3} at the 68% confidence level. By applying a coverage factor of k = 2, the expanded uncertainty is {+-}27% at the 95% confidence level. The top-down approach used information previously gathered from intercomparison exercises to estimate the uncertainty. This investigation found an expanded uncertainty of {+-}22% at approximately 95% confidence level. This is good agreement for such independent estimates.

  13. Assessment of the uncertainties in the Radiological Protection Institute of Ireland (RPII) radon measurements service.

    Science.gov (United States)

    Hanley, O; Gutiérrez-Villanueva, J L; Currivan, L; Pollard, D

    2008-10-01

    The RPII radon (Rn) laboratory holds accreditation for the International Standard ISO/IEC 17025. A requirement of this standard is an estimate of the uncertainty of measurement. This work shows two approaches to estimate the uncertainty. The bottom-up approach involved identifying the components that were found to contribute to the uncertainty. Estimates were made for each of these components, which were combined to give a combined uncertainty of 13.5% at a Rn concentration of approximately 2500 Bq m(-3) at the 68% confidence level. By applying a coverage factor of k=2, the expanded uncertainty is +/-27% at the 95% confidence level. The top-down approach used information previously gathered from intercomparison exercises to estimate the uncertainty. This investigation found an expanded uncertainty of +/-22% at approximately 95% confidence level. This is good agreement for such independent estimates.
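
The bottom-up combination in the two records above is a root-sum-of-squares of component uncertainties followed by a coverage factor, as in the GUM. The individual component values below are hypothetical; only the overall figures (about 13.5% combined, k = 2, about 27% expanded) come from the abstract.

```python
# Sketch of a bottom-up uncertainty budget: combine relative standard
# uncertainties in quadrature, then apply a coverage factor k = 2.
import math

# Assumed, illustrative component uncertainties (%): e.g. calibration,
# counting statistics, fitting, background. Not the RPII's actual budget.
components = [10.0, 7.0, 5.0, 3.0]

combined = math.sqrt(sum(u**2 for u in components))  # root-sum-of-squares
expanded = 2.0 * combined                            # coverage factor k = 2

print(round(combined, 1), round(expanded, 1))  # 13.5 27.1
```

The coverage factor k = 2 widens the roughly 68% interval of the combined standard uncertainty to an approximately 95% confidence interval, matching the reported expanded figure.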

  14. Minimization of energy consumption in HVAC systems with data-driven models and an interior-point method

    International Nuclear Information System (INIS)

    Kusiak, Andrew; Xu, Guanglin; Zhang, Zijun

    2014-01-01

    Highlights: • We study the energy saving of HVAC systems with a data-driven approach. • We conduct an in-depth analysis of the topology of developed Neural Network based HVAC model. • We apply interior-point method to solving a Neural Network based HVAC optimization model. • The uncertain building occupancy is incorporated in the minimization of HVAC energy consumption. • A significant potential of saving HVAC energy is discovered. - Abstract: In this paper, a data-driven approach is applied to minimize energy consumption of a heating, ventilating, and air conditioning (HVAC) system while maintaining the thermal comfort of a building with uncertain occupancy level. The uncertainty of arrival and departure rate of occupants is modeled by the Poisson and uniform distributions, respectively. The internal heating gain is calculated from the stochastic process of the building occupancy. Based on the observed and simulated data, a multilayer perceptron algorithm is employed to model and simulate the HVAC system. The data-driven models accurately predict future performance of the HVAC system based on the control settings and the observed historical information. An optimization model is formulated and solved with the interior-point method. The optimization results are compared with the results produced by the simulation models
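
The stochastic occupancy model highlighted above (Poisson arrivals, uniformly distributed departures, occupancy converted to internal heat gain) can be sketched as follows. The rates, maximum stay, and heat gain per occupant are assumptions for illustration, not values from the paper.

```python
# Illustrative occupancy-driven internal heat gain: Poisson arrivals each
# hour, each occupant staying a uniform 1..max_stay hours.
import math
import random

random.seed(42)
HOURS = 12
arrival_rate = 5.0       # mean arrivals per hour (assumed)
max_stay = 4             # uniform departure after 1..max_stay hours (assumed)
gain_per_person = 100.0  # internal heat gain, W per occupant (assumed)

def poisson(lam):
    """Poisson-distributed count via Knuth's multiplication method."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= L:
            return k - 1

departures = [0] * (HOURS + max_stay + 1)  # people scheduled to leave at hour t
occupancy = 0
gains = []
for t in range(HOURS):
    arrivals = poisson(arrival_rate)
    for _ in range(arrivals):
        departures[t + random.randint(1, max_stay)] += 1
    occupancy += arrivals - departures[t]
    gains.append(occupancy * gain_per_person)

print(gains)  # hourly internal heat gain profile, W
```

A profile like this would feed the heating-gain term of the HVAC model before the neural-network simulation and interior-point optimization steps described in the abstract.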

  15. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.
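
The simplest baseline against which the methods above are compared is plain Monte Carlo propagation. As a minimal, self-contained illustration (not the TAU-based study itself), one can push an uncertain angle of attack through the thin-airfoil lift relation Cl = 2*pi*alpha; the input distribution here is assumed.

```python
# Monte Carlo propagation of an uncertain angle of attack through the
# thin-airfoil lift model Cl = 2*pi*alpha (alpha in radians).
import math
import random

random.seed(0)
alpha_mean = math.radians(2.0)   # nominal angle of attack (assumed)
alpha_sd = math.radians(0.25)    # assumed input standard uncertainty

samples = [2.0 * math.pi * random.gauss(alpha_mean, alpha_sd)
           for _ in range(50000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
print(mean, math.sqrt(var))  # statistics of the lift coefficient
```

The expense of doing this with a RANS solver instead of a one-line surrogate is exactly why the abstract turns to quasi-Monte Carlo, sparse-quadrature polynomial chaos, and Kriging.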

  16. Uncertainty in spatial planning proceedings

    Directory of Open Access Journals (Sweden)

    Aleš Mlakar

    2009-01-01

    Full Text Available Uncertainty is distinctive of spatial planning, as it arises from the necessity to co-ordinate the various interests within an area, from the urgency of adopting spatial planning decisions, from the complexity of the environment, physical space and society, from addressing the uncertainty of the future, and from the uncertainty of actually making the right decision. The response to uncertainty is a series of measures that mitigate the effects of uncertainty itself. These measures are based on two fundamental principles – standardization and optimization. The measures relate to knowledge enhancement and spatial planning comprehension, to the legal regulation of changes, to the existence of spatial planning as a means of co-ordinating different interests, to active planning and the constructive resolution of current spatial problems, to the integration of spatial planning and the environmental protection process, to the implementation of analysis as the foundation of spatial planners' activities, to methods of thinking outside the parameters, to forming clear spatial concepts, to creating a transparent spatial management system, and to the enforcement of participatory processes.

  17. Uncertainty modeling and decision support

    International Nuclear Information System (INIS)

    Yager, Ronald R.

    2004-01-01

    We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function.
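
The Dempster-Shafer machinery the abstract refers to can be made concrete with a small example: belief and plausibility of an event computed from a basic mass assignment over subsets of a frame of discernment. The frame and the masses below are invented for illustration.

```python
# Hedged sketch of Dempster-Shafer belief and plausibility computed from
# a basic mass assignment on focal subsets of a small frame {a, b, c}.
def belief(event, masses):
    """Bel(A): total mass of focal elements entirely contained in A."""
    return sum(m for s, m in masses.items() if set(s) <= set(event))

def plausibility(event, masses):
    """Pl(A): total mass of focal elements intersecting A."""
    return sum(m for s, m in masses.items() if set(s) & set(event))

# Illustrative masses (must sum to 1 over the focal elements):
masses = {("a",): 0.4, ("b",): 0.1, ("a", "b"): 0.3, ("a", "b", "c"): 0.2}

print(round(belief(("a",), masses), 3),
      round(plausibility(("a",), masses), 3))  # 0.4 0.9
```

The gap between Bel and Pl is the interval within which the decision maker's subjective attitude, emphasized in the abstract, must pick a working value.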

  18. Insurance Applications of Active Fault Maps Showing Epistemic Uncertainty

    Science.gov (United States)

    Woo, G.

    2005-12-01

    Insurance loss modeling for earthquakes utilizes available maps of active faulting produced by geoscientists. All such maps are subject to uncertainty, arising from lack of knowledge of fault geometry and rupture history. Field work to undertake geological fault investigations drains human and monetary resources, and this inevitably limits the resolution of fault parameters. Some areas are more accessible than others; some may be of greater social or economic importance than others; some areas may be investigated more rapidly or diligently than others; or funding restrictions may have curtailed the extent of the fault mapping program. In contrast with the aleatory uncertainty associated with the inherent variability in the dynamics of earthquake fault rupture, uncertainty associated with lack of knowledge of fault geometry and rupture history is epistemic. The extent of this epistemic uncertainty may vary substantially from one regional or national fault map to another. However aware the local cartographer may be, this uncertainty is generally not conveyed in detail to the international map user. For example, an area may be left blank for a variety of reasons, ranging from lack of sufficient investigation of a fault to lack of convincing evidence of activity. Epistemic uncertainty in fault parameters is of concern in any probabilistic assessment of seismic hazard, not least in insurance earthquake risk applications. A logic-tree framework is appropriate for incorporating epistemic uncertainty. Some insurance contracts cover specific high-value properties or transport infrastructure, and therefore are extremely sensitive to the geometry of active faulting. Alternative Risk Transfer (ART) to the capital markets may also be considered. In order for such insurance or ART contracts to be properly priced, uncertainty should be taken into account. Accordingly, an estimate is needed for the likelihood of surface rupture capable of causing severe damage. Especially where a

  19. Addressing Uncertainties in Cost Estimates for Decommissioning Nuclear Facilities

    International Nuclear Information System (INIS)

    Benjamin, Serge; Descures, Sylvain; Du Pasquier, Louis; Francois, Patrice; Buonarotti, Stefano; Mariotti, Giovanni; Tarakonov, Jurij; Daniska, Vladimir; Bergh, Niklas; Carroll, Simon; Åström, Annika; Cato, Anna; De La Gardie, Fredrik; Haenggi, Hannes; Rodriguez, Jose; Laird, Alastair; Ridpath, Andy; La Guardia, Thomas; O'Sullivan, Patrick; Weber, Inge

    2017-01-01

    The cost estimation process of decommissioning nuclear facilities has continued to evolve in recent years, with a general trend towards demonstrating greater levels of detail in the estimate and more explicit consideration of uncertainties, the latter of which may have an impact on decommissioning project costs. The 2012 report on the International Structure for Decommissioning Costing (ISDC) of Nuclear Installations, a joint recommendation by the Nuclear Energy Agency (NEA), the International Atomic Energy Agency (IAEA) and the European Commission, proposes a standardised structure of cost items for decommissioning projects that can be used either directly for the production of cost estimates or for mapping of cost items for benchmarking purposes. The ISDC, however, provides only limited guidance on the treatment of uncertainty when preparing cost estimates. Addressing Uncertainties in Cost Estimates for Decommissioning Nuclear Facilities, prepared jointly by the NEA and IAEA, is intended to complement the ISDC, assisting cost estimators and reviewers in systematically addressing uncertainties in decommissioning cost estimates. Based on experiences gained in participating countries and projects, the report describes how uncertainty and risks can be analysed and incorporated in decommissioning cost estimates, while presenting the outcomes in a transparent manner

  20. Outcome and value uncertainties in global-change policy

    International Nuclear Information System (INIS)

    Hammitt, J.K.

    1995-01-01

    Choices among environmental policies can be informed by analysis of the potential physical, biological, and social outcomes of alternative choices, and analysis of social preferences among these outcomes. Frequently, however, the consequences of alternative policies cannot be accurately predicted because of substantial outcome uncertainties concerning physical, chemical, biological, and social processes linking policy choices to consequences. Similarly, assessments of social preferences among alternative outcomes are limited by value uncertainties arising from limitations of moral principles, the absence of economic markets for many environmental attributes, and other factors. Outcome and value uncertainties relevant to global-change policy are described and their magnitudes are examined for two cases: stratospheric-ozone depletion and global climate change. Analysis of information available in the mid 1980s, when international ozone regulations were adopted, suggests that contemporary uncertainties surrounding CFC emissions and the atmospheric response were so large that plausible ozone depletion, absent regulation, ranged from negligible to catastrophic, a range that exceeded the plausible effect of the regulations considered. Analysis of climate change suggests that, important as outcome uncertainties are, uncertainties about values may be even more important for policy choice. 53 refs., 3 figs., 3 tabs

  1. On the uncertainty principle. V

    International Nuclear Information System (INIS)

    Halpern, O.

    1976-01-01

    The treatment of ideal experiments connected with the uncertainty principle is continued. The author analyzes successively measurements of momentum and position, and discusses the common reason why the results in all cases differ from the conventional ones. A similar difference exists for the measurement of field strengths. The interpretation given by Weizsaecker, who tried to interpret Bohr's complementarity principle by introducing a multi-valued logic, is analyzed. The treatment of the uncertainty principle ΔE Δt is deferred to a later paper, as is the interpretation of the method of variation of constants. Every ideal experiment discussed shows various lower limits for the value of the uncertainty product, which limits depend on the experimental arrangement and are always (considerably) larger than h. (Auth.)

  2. Davis-Besse uncertainty study

    International Nuclear Information System (INIS)

    Davis, C.B.

    1987-08-01

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results

  3. Correlated uncertainties in integral data

    International Nuclear Information System (INIS)

    McCracken, A.K.

    1978-01-01

    The use of correlated uncertainties in calculational data is shown, in the cases investigated, to lead to a reduction in the uncertainty of calculated quantities of importance to reactor design. It is stressed, however, that such reductions are likely to be important in only a minority of cases of practical interest. The effect of uncertainties in detector cross-sections is considered and is seen to be, in some cases, of equal importance to that in the data used in calculations. Numerical investigations have been limited by the sparse information available on data correlations; some comparisons made of these data reveal quite large inconsistencies for both detector cross-sections and cross-sections of interest for reactor calculations.
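
The mechanism behind the reduction described above is elementary variance arithmetic: for a calculated quantity that depends on the difference of two inputs, positively correlated errors partially cancel. The numbers below are invented for illustration.

```python
# Illustrative effect of correlation on a derived quantity Q = X - Y:
# u(Q)^2 = u(X)^2 + u(Y)^2 - 2*rho*u(X)*u(Y).
import math

sx, sy = 0.05, 0.04  # assumed relative standard uncertainties of X and Y

def u_diff(rho):
    """Standard uncertainty of X - Y for correlation coefficient rho."""
    return math.sqrt(sx**2 + sy**2 - 2.0 * rho * sx * sy)

print(u_diff(0.0), u_diff(0.8))  # uncorrelated vs. strongly correlated
```

With rho = 0.8 the uncertainty of the difference drops from about 0.064 to 0.030, which is the kind of reduction the abstract reports; for anticorrelated inputs the same formula instead inflates the uncertainty.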

  4. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  5. Update on international uranium and enrichment supply

    International Nuclear Information System (INIS)

    Cleveland, J.M.

    1987-01-01

    Commercial nuclear power generation came upon us in the late 1950s and should have been relatively uneventful due to its similarities to fossil-powered electrical generation. Procurement of nuclear fuel appears to have been treated totally differently from the procurement of fossil fuel, however, and only recently have these practices started to change. The degree of utility reliance on US-mined uranium and US Dept. of Energy (DOE)-produced enrichment services has changed since the 1970s as federal government uncertainty, international fuel market opportunity, and public service commission scrutiny have increased. Accordingly, the uranium and enrichment market has recognized that it is international, just like the fossil fuel market. There is now oversupply-driven competition in the international nuclear fuel market. Competition is increasing daily, as third-world countries develop their own nuclear resources. American utilities are now diversifying their fuel supply arrangements, as they do with their oil, coal, and gas supply. The degree of foreign fuel arrangements depends on each utility's risk posture and commitment to long-term contracts. In an era of rising capital, retrofit, operating, and maintenance costs, economical nuclear fuel supply is even more important. This economic advantage, however, may be nullified by congressional and judicial actions limiting uranium importation and access to foreign enrichment. Such artificial trade barriers will only defeat US nuclear generation and the US nuclear fuel industry in the long term.

  6. Uncertainty analysis in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lemos, Francisco Luiz de [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil); Sullivan, Terry [Brookhaven National Lab., Upton, NY (United States)

    1997-12-31

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, like hydrogeology, meteorology, geochemistry, etc. In addition, the waste disposal facilities are designed to last for a very long period of time. Both of these conditions make safety assessment projections filled with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author) 13 refs.; e-mail: lemos at bnl.gov; sulliva1 at bnl.gov

  7. Economic uncertainty, parental selection, and the criminal activity of the 'children of the wall'

    NARCIS (Netherlands)

    Chevalier, A.; Marie, O.

    2013-01-01

    We explore the link between parental selection and criminality of children in a new context. After the collapse of the Berlin Wall in 1989, East Germany experienced a very large, but temporary, drop in birth rates mostly driven by economic uncertainty. We exploit this natural experiment in a

  8. Economic uncertainty, parental selection, and the criminal activity of the ‘children of the wall’

    NARCIS (Netherlands)

    Chevalier, A.; Marie, O.

    2013-01-01

    We explore the link between parental selection and criminality of children in a new context. After the collapse of the Berlin Wall in 1989, East Germany experienced a very large, but temporary, drop in birth rates mostly driven by economic uncertainty. We exploit this natural experiment in a

  9. Can agent based models effectively reduce fisheries management implementation uncertainty?

    Science.gov (United States)

    Drexler, M.

    2016-02-01

    Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single-species and ecosystem-based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro-scale behaviors of the US West Coast Groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations to properly consider implementation uncertainty in fisheries management.
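
A toy agent-based sketch in the spirit of the study above (emphatically not its model): heterogeneous vessels decide daily whether to fish by comparing expected revenue against their own cost structure, so fleet-level effort emerges from individual incentives. All numbers are invented.

```python
# Minimal agent-based fleet: each vessel has its own cost and catch rate
# and fishes only when expected revenue exceeds its daily cost.
import random

random.seed(7)

class Vessel:
    def __init__(self):
        self.cost = random.uniform(500.0, 1500.0)  # daily operating cost
        self.catch = random.uniform(0.8, 1.2)      # tonnes per day
    def fishes(self, price):
        return self.catch * price > self.cost      # simple profit rule

fleet = [Vessel() for _ in range(200)]
for price in (600.0, 1000.0, 1400.0):              # price per tonne
    active = sum(v.fishes(price) for v in fleet)
    print(price, active)  # higher prices draw more of the fleet out
```

Even this crude rule produces a heterogeneous, price-dependent effort response; the actual study layers on catch history and quota-market dynamics to forecast responses to catch restrictions, MPAs, and ITQs.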

  10. Linear Programming Problems for Generalized Uncertainty

    Science.gov (United States)

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

  11. Sources of uncertainty in future changes in local precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Rowell, David P. [Met Office Hadley Centre, Exeter (United Kingdom)

    2012-10-15

    This study considers the large uncertainty in projected changes in local precipitation. It aims to map, and begin to understand, the relative roles of uncertain modelling and natural variability, using 20-year mean data from four perturbed-physics or multi-model ensembles. The largest - 280-member - ensemble illustrates a rich pattern in the varying contribution of modelling uncertainty, with similar features found using a CMIP3 ensemble (despite its limited sample size, which restricts its value in this context). The contribution of modelling uncertainty to the total uncertainty in local precipitation change is found to be highest in the deep tropics, particularly over South America, Africa, the east and central Pacific, and the Atlantic. In the moist maritime tropics, the highly uncertain modelling of sea-surface temperature changes is transmitted to a large uncertain modelling of local rainfall changes. Over tropical land and summer mid-latitude continents (and to a lesser extent, the tropical oceans), uncertain modelling of atmospheric processes, land surface processes and the terrestrial carbon cycle all appear to play an additional substantial role in driving the uncertainty of local rainfall changes. In polar regions, inter-model variability of anomalous sea ice drives an uncertain precipitation response, particularly in winter. In all these regions, there is therefore the potential to reduce the uncertainty of local precipitation changes through targeted model improvements and observational constraints. In contrast, over much of the arid subtropical and mid-latitude oceans, over Australia, and over the Sahara in winter, internal atmospheric variability dominates the uncertainty in projected precipitation changes. Here, model improvements and observational constraints will have little impact on the uncertainty of time means shorter than at least 20 years. Last, a supplementary application of the metric developed here is that it can be interpreted as a measure

  12. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  13. An ensemble approach to assess hydrological models' contribution to uncertainties in the analysis of climate change impact on water resources

    Science.gov (United States)

    Velázquez, J. A.; Schmid, J.; Ricard, S.; Muerth, M. J.; Gauvin St-Denis, B.; Minville, M.; Chaumont, D.; Caya, D.; Ludwig, R.; Turcotte, R.

    2012-06-01

    Over recent years, several research efforts have investigated the impact of climate change on water resources for different regions of the world. The projection of future river flows is affected by different sources of uncertainty in the hydro-climatic modelling chain. One of the aims of the QBic3 project (Québec-Bavarian International Collaboration on Climate Change) is to assess the contribution to uncertainty of hydrological models by using an ensemble of hydrological models presenting a diversity of structural complexity (i.e. lumped, semi-distributed and distributed models). The study investigates two humid, mid-latitude catchments with natural flow conditions; one located in Southern Québec (Canada) and one in Southern Bavaria (Germany). Daily flow is simulated with four different hydrological models, forced by outputs from regional climate models driven by a given number of GCM members over a reference (1971-2000) and a future (2041-2070) period. The results show that the choice of the hydrological model strongly affects the climate change response of selected hydrological indicators, especially those related to low flows. Indicators related to high flows seem less sensitive to the choice of the hydrological model. Therefore, the computationally less demanding models (usually simple, lumped and conceptual) give a significant level of trust for high and overall mean flows.

  14. Impulsive control of permanent magnet synchronous motors with parameters uncertainties

    International Nuclear Information System (INIS)

    Li Dong; Zhang Xiaohong; Wang Shilong; Yan Dan; Wang Hui

    2008-01-01

    Permanent magnet synchronous motors (PMSMs) may exhibit chaotic behaviour for uncertain parameter values or under certain working conditions, which threatens the secure and stable operation of motor-driven systems. It is therefore important to study methods of controlling or suppressing chaos in PMSMs. In this paper, the robust stability of PMSMs with parameter uncertainties is investigated. After the uncertain matrices representing the variable system parameters are formulated through matrix analysis, a novel asymptotical stability criterion is established. Illustrative examples are also given to show the effectiveness of the obtained results
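    The dimensionless smooth-air-gap PMSM model commonly studied in this literature can be integrated directly to exhibit its bounded (possibly chaotic) trajectories; the sketch below uses that standard form with illustrative parameter values, not values taken from this paper:

```python
# Dimensionless smooth-air-gap PMSM model (no external inputs):
#   x1' = -x1 + x2*x3        (d-axis current)
#   x2' = -x2 - x1*x3 + g*x3 (q-axis current)
#   x3' = s*(x2 - x3)        (rotor speed)
# g (gamma) and s (sigma) lump the physical motor parameters; the values
# below are illustrative choices from the chaotic regime reported in the
# literature, not this paper's data.

def pmsm_rhs(x, g=20.0, s=5.46):
    x1, x2, x3 = x
    return (-x1 + x2 * x3,
            -x2 - x1 * x3 + g * x3,
            s * (x2 - x3))

def rk4_step(f, x, dt):
    # classical 4th-order Runge-Kutta step
    k1 = f(x)
    k2 = f(tuple(xi + 0.5 * dt * ki for xi, ki in zip(x, k1)))
    k3 = f(tuple(xi + 0.5 * dt * ki for xi, ki in zip(x, k2)))
    k4 = f(tuple(xi + dt * ki for xi, ki in zip(x, k3)))
    return tuple(xi + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for xi, a, b, c, d in zip(x, k1, k2, k3, k4))

x = (0.1, 0.1, 0.1)
for _ in range(20000):          # 20 units of dimensionless time, dt = 1e-3
    x = rk4_step(pmsm_rhs, x, 1e-3)
print(x)  # trajectory stays bounded but need not settle to an equilibrium
```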

  15. Uncertainty, probability and information-gaps

    International Nuclear Information System (INIS)

    Ben-Haim, Yakov

    2004-01-01

    This paper discusses two main ideas. First, we focus on info-gap uncertainty, as distinct from probability. Info-gap theory is especially suited for modelling and managing uncertainty in system models: we invest all our knowledge in formulating the best possible model; this leaves the modeller with very faulty and fragmentary information about the variation of reality around that optimal model. Second, we examine the interdependence between uncertainty modelling and decision-making. Good uncertainty modelling requires contact with the end-use, namely, with the decision-making application of the uncertainty model. The most important avenue of uncertainty-propagation is from initial data- and model-uncertainties into uncertainty in the decision-domain. Two questions arise. Is the decision robust to the initial uncertainties? Is the decision prone to opportune windfall success? We apply info-gap robustness and opportunity functions to the analysis of representation and propagation of uncertainty in several of the Sandia Challenge Problems
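    A toy info-gap robustness calculation in the spirit of the paper, assuming a linear profit model and a fractional-error uncertainty set (decision variable, profit model and all numbers are hypothetical, not from the paper):

```python
# Info-gap robustness sketch: decision q, uncertain unit revenue u with best
# estimate u_hat. Performance: profit(q, u) = u*q - c*q**2; requirement:
# profit >= p_crit. Fractional-error info-gap model:
#   U(h) = { u : |u - u_hat| <= h * u_hat }.
# Robustness h_hat(q) is the largest h for which the worst case in U(h)
# still meets the requirement; since profit is increasing in u, the worst
# case is u = u_hat*(1 - h), giving the closed form below.

def robustness(q, u_hat=10.0, c=1.0, p_crit=8.0):
    # h_hat = 1 - (p_crit + c*q^2) / (u_hat*q), floored at 0
    h = 1.0 - (p_crit + c * q * q) / (u_hat * q)
    return max(0.0, h)

for q in (1.0, 2.0, 5.0):
    print(q, round(robustness(q), 3))
```

Note that the robustness-maximizing decision (q = 2 here) differs from the nominal profit-maximizing one (q = 5), which is the kind of trade-off info-gap analysis is meant to expose.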

  16. Uncertainty in prostate cancer. Ethnic and family patterns.

    Science.gov (United States)

    Germino, B B; Mishel, M H; Belyea, M; Harris, L; Ware, A; Mohler, J

    1998-01-01

    Prostate cancer occurs 37% more often in African-American men than in white men. Patients and their family care providers (FCPs) may have different experiences of cancer and its treatment. This report addresses two questions: 1) What is the relationship of uncertainty to family coping, psychological adjustment to illness, and spiritual factors? and 2) Are these patterns of relationship similar for patients and their family caregivers and for whites and African-Americans? A sample of white and African-American men and their family caregivers (N = 403) was drawn from an ongoing study, testing the efficacy of an uncertainty management intervention with men with stage B prostate cancer. Data were collected at study entry, either 1 week after post-surgical catheter removal or at the beginning of primary radiation treatment. Measures of uncertainty, adult role behavior, problem solving, social support, importance of God in one's life, family coping, psychological adjustment to illness, and perceptions of health and illness met standard criteria for internal consistency. Analyses of baseline data using Pearson's product moment correlations were conducted to examine the relationships of person, disease, and contextual factors to uncertainty. For family coping, uncertainty was significantly and positively related to two domains in white family care providers only. In African-American and white family care providers, the more uncertainty experienced, the less positive they felt about treatment. Uncertainty for all caregivers was related inversely to positive feelings about the patient recovering from the illness. For all patients and for white family members, uncertainty was related inversely to the quality of the domestic environment. For everyone, uncertainty was related inversely to psychological distress. Higher levels of uncertainty were related to a poorer social environment for African-American patients and for white family members. For white patients and their

  17. Privacy driven internet ecosystem

    OpenAIRE

    Trinh, Tuan Anh; Gyarmati, Laszlo

    2012-01-01

    The dominant business model of today's Internet is built upon advertisements; users can access Internet services while the providers show ads to them. Although significant efforts have been made to model and analyze the economic aspects of this ecosystem, the heart of the current status quo, namely privacy, has not received the attention of the research community yet. Accordingly, we propose an economic model of the privacy driven Internet ecosystem where privacy is handled as an asset that c...

  18. MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Van Dyk, J; Palta, J; Bortfeld, T; Mijnheer, B [Western University, London, ON (Canada)]

    2014-06-15

    Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “how do we implement this in clinical practice”? AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display.

  19. MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy

    International Nuclear Information System (INIS)

    Van Dyk, J; Palta, J; Bortfeld, T; Mijnheer, B

    2014-01-01

    Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “how do we implement this in clinical practice”? AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display

  20. Uncertainty and sensitivity studies supporting the interpretation of the results of TVO I/II PRA

    International Nuclear Information System (INIS)

    Holmberg, J.

    1992-01-01

    A comprehensive Level 1 probabilistic risk assessment (PRA) has been performed for the TVO I/II nuclear power units. As part of the PRA project, uncertainties in the risk models and methods were systematically studied in order to describe them and to demonstrate their impact on the results. The uncertainty study was divided into two phases: a qualitative and a quantitative study. The qualitative study comprised identification of uncertainties and qualitative assessments of their importance. The PRA was introduced, and identified assumptions and uncertainties behind the models were documented. The most significant uncertainties were selected by importance measures or other judgements for further quantitative study. The quantitative study included sensitivity studies and propagation of uncertainty ranges. In the sensitivity studies, uncertain assumptions or parameters were varied in order to illustrate the sensitivity of the models. The propagation of the uncertainty ranges demonstrated the impact of the statistical uncertainties of the parameter values. The Monte Carlo method was used as the propagation method. The most significant uncertainties were those involved in modelling human interactions, dependences and common cause failures (CCFs), loss of coolant accident (LOCA) frequencies and pressure suppression. The qualitative mapping of the uncertainty factors turned out to be useful in planning the quantitative studies. It also served as an internal review of the assumptions made in the PRA. The sensitivity studies were perhaps the most advantageous part of the quantitative study because they allowed individual analyses of the significance of the uncertainty sources identified. The uncertainty study proved to be a reasonable way of systematically and critically assessing uncertainties in a risk analysis. 
The usefulness of this study depends on the decision maker (power company) since uncertainty studies are primarily carried out to support decision making when uncertainties are
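    The Monte Carlo propagation step described above can be sketched on a toy fault tree (the tree structure, medians and error factors below are invented for illustration; they are not the TVO model):

```python
import math
import random

random.seed(42)

# Toy fault tree: TOP = A AND (B OR C). Each basic-event probability is
# uncertain and modelled as lognormal, parameterized by an assumed median
# and error factor (EF = 95th percentile / median), as is common in PRA.

def sample_lognormal(median, ef):
    sigma = math.log(ef) / 1.645   # EF corresponds to 1.645 sigma in log space
    return median * math.exp(random.gauss(0.0, sigma))

def top_event(pa, pb, pc):
    # P(A and (B or C)) assuming independence
    return pa * (1.0 - (1.0 - pb) * (1.0 - pc))

samples = sorted(
    top_event(sample_lognormal(1e-3, 3.0),
              sample_lognormal(1e-2, 3.0),
              sample_lognormal(5e-3, 10.0))
    for _ in range(20001)
)
p05, p50, p95 = samples[1000], samples[10000], samples[19000]
print(f"TOP: 5th {p05:.2e}, median {p50:.2e}, 95th {p95:.2e}")
```

The spread between the 5th and 95th percentiles of the top-event probability is the kind of "uncertainty range" propagation the record refers to.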

  1. Water-driven micromotors.

    Science.gov (United States)

    Gao, Wei; Pei, Allen; Wang, Joseph

    2012-09-25

    We demonstrate the first example of a water-driven bubble-propelled micromotor that eliminates the requirement for the common hydrogen peroxide fuel. The new water-driven Janus micromotor is composed of a partially coated Al-Ga binary alloy microsphere prepared via microcontact mixing of aluminum microparticles and liquid gallium. The ejection of hydrogen bubbles from the exposed Al-Ga alloy hemisphere side, upon its contact with water, provides a powerful directional propulsion thrust. Such spontaneous generation of hydrogen bubbles reflects the rapid reaction between the aluminum alloy and water. The resulting water-driven spherical motors can move at remarkable speeds of 3 mm s(-1) (i.e., 150 body lengths s(-1)), while exerting large forces exceeding 500 pN. Factors influencing the efficiency of the aluminum-water reaction and the resulting propulsion behavior and motor lifetime, including the ionic strength and environmental pH, are investigated. The resulting water-propelled Al-Ga/Ti motors move efficiently in different biological media (e.g., human serum) and hold considerable promise for diverse biomedical or industrial applications.

  2. Competitive Capacity Investment under Uncertainty

    NARCIS (Netherlands)

    X. Li (Xishu); R.A. Zuidwijk (Rob); M.B.M. de Koster (René); R. Dekker (Rommert)

    2016-01-01

    textabstractWe consider a long-term capacity investment problem in a competitive market under demand uncertainty. Two firms move sequentially in the competition and a firm’s capacity decision interacts with the other firm’s current and future capacity. Throughout the investment race, a firm can

  3. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  4. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis numerical results are presented, comparisons...... are made between alternative modeling methods, and characteristics of the methods are discussed....

  5. Uncertainty covariances in robotics applications

    International Nuclear Information System (INIS)

    Smith, D.L.

    1984-01-01

    The application of uncertainty covariance matrices in the analysis of robot trajectory errors is explored. First, relevant statistical concepts are reviewed briefly. Then, a simple, hypothetical robot model is considered to illustrate methods for error propagation and performance test data evaluation. The importance of including error correlations is emphasized
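    The covariance-propagation method this record refers to can be illustrated with first-order propagation through the Jacobian of a hypothetical two-link planar arm (link lengths, nominal angles and joint-angle uncertainties are assumed, not taken from the report):

```python
import numpy as np

# First-order covariance propagation for a planar 2-link arm:
# end-effector position p(theta) has covariance  Sigma_p = J Sigma_theta J^T,
# where J is the kinematic Jacobian at the nominal joint angles.

L1 = L2 = 1.0                      # link lengths (m), assumed
th1, th2 = 0.0, np.pi / 2          # nominal joint angles (rad), assumed

s1, c1 = np.sin(th1), np.cos(th1)
s12, c12 = np.sin(th1 + th2), np.cos(th1 + th2)

# Jacobian of (x, y) = (L1*c1 + L2*c12, L1*s1 + L2*s12) w.r.t. (th1, th2)
J = np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
              [ L1 * c1 + L2 * c12,  L2 * c12]])

sigma_theta = 0.01                 # 10 mrad standard uncertainty per joint
cov_theta = np.diag([sigma_theta**2, sigma_theta**2])

cov_x = J @ cov_theta @ J.T        # end-effector position covariance (m^2)
print(cov_x)
```

The off-diagonal terms of `cov_x` show how correlated position errors arise even from independent joint errors, the point the record emphasizes.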

  6. Regulating renewable resources under uncertainty

    DEFF Research Database (Denmark)

    Hansen, Lars Gårn

    ) that a pro-quota result under uncertainty about prices and marginal costs is unlikely, requiring that the resource growth function is highly concave locally around the optimum and, 3) that quotas are always preferred if uncertainty about underlying structural economic parameters dominates. These results...... showing that quotas are preferred in a number of situations qualify the pro fee message dominating prior studies....

  7. Uncertainty in the Real World

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 4, Issue 2. Uncertainty in the Real World - Fuzzy Sets. Satish Kumar. General Article, February 1999, pp 37-47. Permanent link: https://www.ias.ac.in/article/fulltext/reso/004/02/0037-0047

  8. Uncertainty of dustfall monitoring results

    Directory of Open Access Journals (Sweden)

    Martin A. van Nierop

    2017-06-01

    Fugitive dust has the ability to cause a nuisance and pollute the ambient environment, particularly from human activities including construction and industrial sites and mining operations. As such, dustfall monitoring has occurred for many decades in South Africa; however, little has been published on the repeatability, uncertainty, accuracy and precision of dustfall monitoring. Repeatability assesses the consistency associated with the results of a particular measurement under the same conditions; the consistency of the laboratory is assessed to determine the uncertainty associated with dustfall monitoring conducted by the laboratory. The aim of this study was to improve the understanding of the uncertainty in dustfall monitoring, thereby improving the confidence in dustfall monitoring. Uncertainty of dustfall monitoring was assessed through a 12-month study of 12 sites located on the boundary of the study area. Each site contained a directional dustfall sampler, which was modified by removing the rotating lid, with four buckets (A, B, C and D) installed. Having four buckets on one stand allows each bucket to be exposed to the same conditions for the same period of time; therefore, equal amounts of dust should be deposited in these buckets. The difference in the weight (mg) of the dust recorded from each bucket at each respective site was determined using the American Society for Testing and Materials method D1739 (ASTM D1739). The variability of the dust would provide the confidence level of dustfall monitoring when reporting to clients.
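    The four-bucket repeatability idea can be quantified with elementary statistics; the bucket weights below are hypothetical, not data from the study:

```python
import statistics

# Hypothetical monthly dust weights (mg) from the four co-located buckets
# of one directional dustfall sampler.
buckets = {"A": 412.0, "B": 398.0, "C": 430.0, "D": 405.0}

w = list(buckets.values())
mean = statistics.mean(w)
sd = statistics.stdev(w)       # sample standard deviation (n - 1 denominator)
cv = 100.0 * sd / mean         # repeatability as a coefficient of variation

print(f"mean {mean:.1f} mg, sd {sd:.1f} mg, CV {cv:.1f} %")
```

The coefficient of variation across buckets exposed to identical conditions is a direct, simple measure of the monitoring method's repeatability.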

  9. Knowledge Uncertainty and Composed Classifier

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    2007-01-01

    Roč. 1, č. 2 (2007), s. 101-105 ISSN 1998-0140 Institutional research plan: CEZ:AV0Z10750506 Keywords : Boosting architecture * contextual modelling * composed classifier * knowledge management, * knowledge * uncertainty Subject RIV: IN - Informatics, Computer Science

  10. Uncertainty propagation in nuclear forensics

    International Nuclear Information System (INIS)

    Pommé, S.; Jerome, S.M.; Venchiarutti, C.

    2014-01-01

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent–daughter pairs and the need for more precise half-life data is examined. - Highlights: • Uncertainty propagation formulae for age dating with nuclear chronometers. • Applied to parent–daughter pairs used in nuclear forensics. • Investigated need for better half-life data
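    A back-of-the-envelope version of the age-dating propagation the record describes, assuming a stable daughter (here Cs-137 → Ba-137 rather than an actinide pair) and invented measurement values:

```python
import math

# Age from the atom ratio R = N_daughter / N_parent when the daughter is
# stable: R = exp(lam*t) - 1, so t = ln(1 + R) / lam.
# First-order (GUM-style) uncertainty propagation:
#   u_t^2 = (dt/dR)^2 u_R^2 + (dt/dlam)^2 u_lam^2
# with dt/dR = 1/(lam*(1+R)) and dt/dlam = -t/lam.
# The numbers below are illustrative, not a real measurement.

T_HALF, U_T_HALF = 30.05, 0.08         # Cs-137 half-life and uncertainty (y)
lam = math.log(2.0) / T_HALF
u_lam = lam * (U_T_HALF / T_HALF)      # relative uncertainty carries over

R, u_R = 0.25, 0.005                   # assumed measured atom ratio

t = math.log1p(R) / lam
dt_dR = 1.0 / (lam * (1.0 + R))
dt_dlam = -t / lam
u_t = math.hypot(dt_dR * u_R, dt_dlam * u_lam)
print(f"age = {t:.2f} +/- {u_t:.2f} y")
```

With these numbers the ratio measurement dominates the budget; the record's point about needing better half-life data applies when the reverse holds.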

  11. WASH-1400: quantifying the uncertainties

    International Nuclear Information System (INIS)

    Erdmann, R.C.; Leverenz, F.L. Jr.; Lellouche, G.S.

    1981-01-01

    The purpose of this paper is to focus on the limitations of the WASH-1400 analysis in estimating the risk from light water reactors (LWRs). This assessment attempts to modify the quantification of the uncertainty in and estimate of risk as presented by the RSS (reactor safety study). 8 refs

  12. Model uncertainty in growth empirics

    NARCIS (Netherlands)

    Prüfer, P.

    2008-01-01

    This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high
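    A common BIC-based approximation to BMA posterior model weights can be sketched as follows (the models and BIC values are invented, and the thesis itself may use a different weighting scheme):

```python
import math

# BMA weight sketch: under equal prior model probabilities, the posterior
# model probability is often approximated as p(M_i | data) ∝ exp(-BIC_i / 2).
bics = {"M1": 102.3, "M2": 100.0, "M3": 108.9}  # hypothetical BIC values

best = min(bics.values())
raw = {m: math.exp(-0.5 * (b - best)) for m, b in bics.items()}  # shift for stability
z = sum(raw.values())
weights = {m: r / z for m, r in raw.items()}
print(weights)
```

Averaging any quantity of interest over the models with these weights, rather than conditioning on a single "best" model, is what shields the inference from model uncertainty.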

  13. Testing methodologies for quantifying physical models uncertainties. A comparative exercise using CIRCE and IPREM (FFTBM)

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, Jordi, E-mail: jordi.freixa-terradas@upc.edu; Alfonso, Elsa de, E-mail: elsa.de.alfonso@upc.edu; Reventós, Francesc, E-mail: francesc.reventos@upc.edu

    2016-08-15

    Highlights: • Uncertainty of physical models is a key issue in best estimate plus uncertainty analysis. • Estimation of uncertainties of physical models of thermal hydraulics system codes. • Comparison of CIRCÉ and FFTBM methodologies. • Simulation of reflood experiments in order to evaluate uncertainty of physical models related to the reflood scenario. - Abstract: The increasing importance of Best-Estimate Plus Uncertainty (BEPU) analyses in nuclear safety and licensing processes has led to several international activities. The latest findings highlighted the uncertainties of physical models as one of the most controversial aspects of BEPU. This type of uncertainty is an important contributor to the total uncertainty of NPP BE calculations. Due to the complexity of estimating this uncertainty, it is often assessed solely by engineering judgment. The present study comprises a comparison of two different state-of-the-art methodologies, CIRCÉ and IPREM (FFTBM), capable of quantifying the uncertainty of physical models. Similarities and differences of their results are discussed through the observation of probability distribution functions and envelope calculations. In particular, the analyzed scenario is core reflood. Experimental data from the FEBA and PERICLES test facilities is employed, while the thermal hydraulic simulations are carried out with RELAP5/mod3.3. This work is undertaken under the framework of the PREMIUM (Post-BEMUSE Reflood Model Input Uncertainty Methods) benchmark.

  14. Uncertainty of slip measurements in a cutting system of converting machinery for diapers production

    Directory of Open Access Journals (Sweden)

    D’Aponte F.

    2015-01-01

    In this paper, slip measurements between the peripheral surfaces of the knife cylinder and a non-driven anvil cylinder in a high-velocity, high-quality cutting unit of a diaper production line are described. Laboratory tests have been carried out on a test bench with real-scale components for possible on-line application of the method. With reference to both starting and steady-state conditions, correlations with the process parameters have been found, achieving a very satisfactory reduction of the slip between the knife cylinder and the non-driven anvil cylinder. Accuracy evaluation of the measurements allowed us to validate the obtained information and to evaluate the detection threshold of the measurement method in the present configuration. The analysis of specific uncertainty contributions to the whole uncertainty budget could also be used to further reduce the uncertainty of the measurement method.

  15. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

    Treatment of uncertainty, or in other words, reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency-based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have looked at particularly ingenious ways of reasoning in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. Those methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information demanded by quantitative methods. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management, that is, recognition and evaluation of uncertainties associated with PA followed by planning and implementation of measures to reduce them, is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem
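    Of the formalisms named above, the Dempster-Shafer rule of combination is the easiest to show concretely; the frame of discernment and mass assignments below are purely illustrative:

```python
from itertools import product

# Dempster's rule of combination for two evidence sources over the frame
# {"ok", "fail"}; mass may also sit on the whole frame (ignorance).
FRAME = frozenset({"ok", "fail"})

def combine(m1, m2):
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb          # mass assigned to the empty set
    # renormalize by the non-conflicting mass
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

m1 = {frozenset({"ok"}): 0.6, FRAME: 0.4}
m2 = {frozenset({"ok"}): 0.5, frozenset({"fail"}): 0.2, FRAME: 0.3}
out = combine(m1, m2)
print(out)
```

Unlike a Bayesian prior, the mass left on the whole frame makes the sources' ignorance explicit, which is the feature the record highlights.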

  16. Policy Uncertainty, Investment and Commitment Periods

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-07-01

    Today's investment decisions in key sectors such as energy, forestry or transport have significant impacts on the levels of greenhouse gas (GHG) emissions over the coming decades. Given the economic and environmental long-term implications of capital investment and retirement, a climate mitigation regime should aim to encourage capital investment in climate-friendly technologies. Many factors affect technology choice and the timing of investment, including investor expectations about future prices and policies. Recent international discussions have focused on the importance of providing more certainty about future climate policy stringency. The design of commitment periods can play a role in creating this environment. This paper assesses how the length of commitment periods influences policy uncertainty and investment decisions. In particular, the paper analyses the relationship between commitment period length and near term investment decisions in climate friendly technology.

  17. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    International Nuclear Information System (INIS)

    Elert, M.

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions in many cases are very similar, e.g. in the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come into focus in this study: There are large differences in the predicted soil hydrology and as a consequence also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration. 
The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root
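    The simple end of the complexity range discussed here can be sketched as a two-box compartment model with radioactive decay (the leaching rate constants below are invented; the decay constant is Cs-137-like):

```python
import math

# Two-box sketch of downward radionuclide migration in soil:
# root zone (box 1) leaches to deeper soil (box 2); both decay.
#   dA1/dt = -(k1 + lam) * A1
#   dA2/dt =  k1 * A1 - (k2 + lam) * A2
# Closed-form solution for a unit initial surface inventory A1(0) = 1.

def inventories(t, k1=0.05, k2=0.02, lam=math.log(2) / 30.0):
    a = k1 + lam
    b = k2 + lam
    A1 = math.exp(-a * t)
    A2 = (k1 / (b - a) * (math.exp(-a * t) - math.exp(-b * t))
          if a != b else k1 * t * math.exp(-a * t))
    return A1, A2

for year in (1, 10, 30):
    A1, A2 = inventories(float(year))
    print(year, round(A1, 3), round(A2, 3))
```

Whether such a lumped description suffices, or a coupled hydrology-transport model is needed, is exactly the question the intercomparison addresses.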

  18. Risk in International Business

    OpenAIRE

    Canavan, Deirdre; Sharkey Scott, Pamela

    2012-01-01

    Risk in international business can prompt risk-averse behaviour to counteract foreign market uncertainty, or individual entrepreneurial risk-taking behaviour, depending on the characteristics of both the business sector and the individual. International business theory would suggest that the perception of risk may differ in situations including where new market entry is incremental, is taken in larger or earlier stages, or indeed whether it may be experienced in a continually fluctuating manne...

  19. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from 2014 PIV challenge. Thorough sensitivity analysis was performed to assess the relative impact of the various parameters to the overall uncertainty. The results suggest that in absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. 
This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
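    The propagation of planar and calibration-angle uncertainties into the out-of-plane velocity component can be sketched for an idealized symmetric stereo arrangement (the reconstruction formula and all numbers are simplifying assumptions, not the paper's full framework):

```python
import math

# Idealized symmetric stereo setup: out-of-plane velocity w from the two
# cameras' projected displacements u1, u2 and half-angle theta:
#   w = (u1 - u2) / (2 * tan(theta))
# First-order propagation of the planar uncertainties (su1, su2) and the
# calibration/registration angle uncertainty (stheta). Illustrative numbers.

def w_and_uncertainty(u1, u2, theta, su1, su2, stheta):
    t = math.tan(theta)
    w = (u1 - u2) / (2.0 * t)
    dw_du = 1.0 / (2.0 * t)                       # |dw/du1| = |dw/du2|
    dw_dth = -w / (math.sin(theta) * math.cos(theta))
    uw = math.sqrt((dw_du * su1) ** 2 +
                   (dw_du * su2) ** 2 +
                   (dw_dth * stheta) ** 2)
    return w, uw

w, uw = w_and_uncertainty(3.2, 1.0, math.radians(35.0),
                          0.1, 0.1, math.radians(0.5))
print(f"w = {w:.3f} +/- {uw:.3f} px/frame")
```

Even in this cartoon, the angle term only matters when u1 != u2 (non-zero disparity), mirroring the sensitivity result reported in the record.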

  20. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
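The basic idea of descriptive statistics over interval data can be sketched briefly: when each observation is an interval rather than a point, a statistic becomes a range of possible values. The helpers below compute exact bounds for the mean and median (these are the easy cases; the report notes that other statistics, such as variance, are much harder to bound). This is a didactic sketch, not the report's software.

```python
import statistics

def interval_mean(data):
    """Bounds on the sample mean when each observation is an interval (lo, hi):
    the mean of lower endpoints and the mean of upper endpoints."""
    los = [lo for lo, hi in data]
    his = [hi for lo, hi in data]
    return (sum(los) / len(data), sum(his) / len(data))

def interval_median(data):
    """Bounds on the sample median: the median of lower endpoints and the
    median of upper endpoints bracket every possible point-sample median."""
    los = sorted(lo for lo, hi in data)
    his = sorted(hi for lo, hi in data)
    return (statistics.median(los), statistics.median(his))
```

For example, the data set [(1, 2), (2, 4), (3, 3)] has a mean somewhere in [2, 3] and a median somewhere in [2, 3], whichever true values the intervals contain.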

  1. Uncertainty in predictions of oil spill trajectories in a coastal zone

    Science.gov (United States)

    Sebastião, P.; Guedes Soares, C.

    2006-12-01

    A method is introduced to determine the uncertainties in the predictions of oil spill trajectories using a classic oil spill model. The method considers the output of the oil spill model as a function of random variables, which are the input parameters, and calculates the standard deviation of the output results which provides a measure of the uncertainty of the model as a result of the uncertainties of the input parameters. In addition to a single trajectory that is calculated by the oil spill model using the mean values of the parameters, a band of trajectories can be defined when various simulations are done taking into account the uncertainties of the input parameters. This band of trajectories defines envelopes of the trajectories that are likely to be followed by the spill given the uncertainties of the input. The method was applied to an oil spill that occurred in 1989 near Sines in the southwestern coast of Portugal. This model represented well the distinction between a wind driven part that remained offshore, and a tide driven part that went ashore. For both parts, the method defined two trajectory envelopes, one calculated exclusively with the wind fields, and the other using wind and tidal currents. In both cases reasonable approximation to the observed results was obtained. The envelope of likely trajectories that is obtained with the uncertainty modelling proved to give a better interpretation of the trajectories that were simulated by the oil spill model.
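The trajectory-band idea can be illustrated with a deliberately simple drift model: sample the uncertain inputs, run the model for each sample, and report the spread of the outputs. The 3% wind-drift factor and its spread below are hypothetical placeholders, and the one-line advection model stands in for the classic oil spill model used in the paper.

```python
import random
import statistics

def drift_trajectory(wind_u, wind_v, wind_factor, hours, dt=1.0):
    """Toy advection model: the slick drifts at a fraction of the wind speed.
    A stand-in for the full oil spill model referenced in the abstract."""
    x = y = 0.0
    for _ in range(int(hours / dt)):
        x += wind_factor * wind_u * dt * 3600.0
        y += wind_factor * wind_v * dt * 3600.0
    return x, y

def trajectory_band(n=1000, seed=0):
    """Sample an uncertain wind-drift factor (hypothetical mean 3%, sd 0.5%)
    and return the mean and standard deviation of the final x position;
    the standard deviation defines the width of the trajectory envelope."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n):
        wf = rng.gauss(0.03, 0.005)
        x, _ = drift_trajectory(5.0, 0.0, wf, hours=24)
        finals.append(x)
    return statistics.mean(finals), statistics.stdev(finals)
```

The mean trajectory corresponds to the single run with mean parameter values; the standard deviation across samples is the uncertainty measure the method attaches to it.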

  2. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Elert, M. [Kemakta Konsult AB, Stockholm (Sweden)] [ed.

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario show that the predictions in many cases are very similar, e g in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come to focus in this study: There are large differences in the predicted soil hydrology and as a consequence also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration. 
The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root

  3. Communicating the Uncertainty in Greenhouse Gas Emissions from Agriculture

    Science.gov (United States)

    Milne, Alice; Glendining, Margaret; Perryman, Sarah; Whitmore, Andy

    2014-05-01

    inventory. Box plots were favoured by a majority of our participants, but this result was driven by those with a better understanding of maths. We concluded that the methods chosen to communicate uncertainty in greenhouse gas emissions should be influenced by the professional and mathematical background of the end-user. We propose that boxplots annotated with summary statistics such as the mean, median, and 2.5th and 97.5th percentiles provide a sound method for communicating uncertainty to research scientists, as these individuals tend to be familiar with these methods. End-users from other groups may not be so familiar with these methods, and so a combination of intuitive methods, such as calibrated phrases and shaded arrays, with numerate methods would be better suited. Ideally these individuals should be presented with the intuitive qualitative methods with the option to consider a more quantitative description, perhaps presented in an appendix.
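The annotation statistics recommended above are straightforward to compute; a minimal sketch (function name is illustrative):

```python
import numpy as np

def annotation_stats(samples):
    """Summary statistics suggested for annotating an uncertainty boxplot:
    mean, median, and the 2.5th/97.5th percentiles (a central 95% interval)."""
    samples = np.asarray(samples, dtype=float)
    return {
        "mean": float(np.mean(samples)),
        "median": float(np.median(samples)),
        "p2.5": float(np.percentile(samples, 2.5)),
        "p97.5": float(np.percentile(samples, 97.5)),
    }
```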

  4. Laser-driven polarized sources of hydrogen and deuterium

    International Nuclear Information System (INIS)

    Young, L.; Holt, R.J.; Green, M.C.; Kowalczyk, R.S.

    1988-01-01

    A novel laser-driven polarized source of hydrogen and deuterium which operates on the principle of spin-exchange optical pumping is described. The advantages of this method over conventional polarized sources for internal target experiments are presented. Technological difficulties which prevent ideal source operation are outlined along with proposed solutions. At present, the laser-driven polarized hydrogen source delivers 8 × 10^16 atoms/s with a polarization (P_z) of 24%. 9 refs., 2 figs

  5. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  6. Functional Domain Driven Design

    OpenAIRE

    Herrera Guzmán, Sergio

    2016-01-01

    Technologies are constantly expanding and evolving, devising new techniques to fulfil their purpose. In software development, the tools and guidelines for building software products are a constantly evolving element, necessary for making decisions about the projects to be undertaken. One of the archetypes for software development is the so-called Domain Driven Design, where it is important to have a thorough knowledge of the business one wishes to model in the form...

  7. Constellations-driven innovation

    DEFF Research Database (Denmark)

    Hansbøl, Mikala

    2011-01-01

    The paper presents a science and technology studies and actor-network-theory inspired approach to understanding the development and ongoing re-didactization and re-design of a Danish developed presentation tool called the Theme Board (Tematavlen.dk). It is argued that this approach provides a particularly useful point of departure for engaging in researching innovation and didactic design of digital teaching and learning instruments such as the Theme Board that are programmed and serviced 'in the sky'. I call this approach: constellation-driven innovation....

  8. Electrostatically Driven Nanoballoon Actuator.

    Science.gov (United States)

    Barzegar, Hamid Reza; Yan, Aiming; Coh, Sinisa; Gracia-Espino, Eduardo; Dunn, Gabriel; Wågberg, Thomas; Louie, Steven G; Cohen, Marvin L; Zettl, Alex

    2016-11-09

    We demonstrate an inflatable nanoballoon actuator based on geometrical transitions between the inflated (cylindrical) and collapsed (flattened) forms of a carbon nanotube. In situ transmission electron microscopy experiments employing a nanoelectromechanical manipulator show that a collapsed carbon nanotube can be reinflated by electrically charging the nanotube, thus realizing an electrostatically driven nanoballoon actuator. We find that the tube actuator can be reliably cycled with only modest control voltages (few volts) with no apparent wear or fatigue. A complementary theoretical analysis identifies critical parameters for nanotube nanoballoon actuation.

  9. Uncertainty of the calibration factor

    International Nuclear Information System (INIS)

    1995-01-01

    According to present definitions, an error is the difference between a measured value and the ''true'' value. Thus an error has both a numerical value and a sign. In contrast, the uncertainty associated with a measurement is a parameter that characterizes the dispersion of the values ''that could reasonably be attributed to the measurand''. This parameter is normally an estimated standard deviation. An uncertainty, therefore, has no known sign and is usually assumed to be symmetrical. It is a measure of our lack of exact knowledge, after all recognized ''systematic'' effects have been eliminated by applying appropriate corrections. If errors were known exactly, the true value could be determined and there would be no problem left. In reality, errors are estimated in the best possible way and corrections made for them. Therefore, after application of all known corrections, errors need no further consideration (their expectation value being zero) and the only quantities of interest are uncertainties. 3 refs, 2 figs

  10. Quantifying the uncertainty in heritability.

    Science.gov (United States)

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-05-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large.
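The Bayesian side of the comparison can be sketched with a toy grid approximation: for the simplified model y ~ N(0, h²K + (1 − h²)I) with kinship matrix K, evaluate the likelihood on a grid of h² values and normalize under a flat prior. This is a didactic sketch of posterior computation, not the paper's efficient implementation.

```python
import numpy as np

def h2_posterior(y, K, grid=np.linspace(0.01, 0.99, 99)):
    """Grid approximation to the posterior of heritability h^2 under a flat
    prior, for the toy model y ~ N(0, h^2 K + (1 - h^2) I)."""
    n = len(y)
    logp = []
    for h2 in grid:
        V = h2 * K + (1.0 - h2) * np.eye(n)
        _, logdet = np.linalg.slogdet(V)
        quad = y @ np.linalg.solve(V, y)
        logp.append(-0.5 * (logdet + quad))  # Gaussian log-likelihood up to a constant
    logp = np.array(logp)
    post = np.exp(logp - logp.max())         # stabilize before normalizing
    return grid, post / post.sum()
```

The width of this posterior is the Bayesian uncertainty measure the paper compares against the frequentist asymptotic variance; when K carries no information (K = I), the posterior is flat, i.e. the data say nothing about h².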

  11. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta ... applied at the grid scale. Flux and state hydrological outputs which integrate responses over time and space showed more sensitivity to precipitation mean spatial biases and less so on extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current ...

  12. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.

  13. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.; Kniss, J.; Riesenfeld, R.; Johnson, C.R.

    2010-01-01

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.

  14. Statistical uncertainties and unrecognized relationships

    International Nuclear Information System (INIS)

    Rankin, J.P.

    1985-01-01

    Hidden relationships in specific designs directly contribute to inaccuracies in reliability assessments. Uncertainty factors at the system level may sometimes be applied in attempts to compensate for the impact of such unrecognized relationships. Often uncertainty bands are used to relegate unknowns to a miscellaneous category of low-probability occurrences. However, experience and modern analytical methods indicate that perhaps the dominant, most probable and significant events are sometimes overlooked in statistical reliability assurances. The author discusses the utility of two unique methods of identifying the otherwise often unforeseeable system interdependencies for statistical evaluations. These methods are sneak circuit analysis and a checklist form of common cause failure analysis. Unless these techniques (or a suitable equivalent) are also employed along with the more widely-known assurance tools, high reliability of complex systems may not be adequately assured. This concern is indicated by specific illustrations. 8 references, 5 figures

  15. The uncertainty budget in pharmaceutical industry

    DEFF Research Database (Denmark)

    Heydorn, Kaj

    of their uncertainty, exactly as described in GUM [2]. Pharmaceutical industry has therefore over the last 5 years shown increasing interest in accreditation according to ISO 17025 [3], and today uncertainty budgets are being developed for all so-called critical measurements. The uncertainty of results obtained...... that the uncertainty of a particular result is independent of the method used for its estimation. Several examples of uncertainty budgets for critical parameters based on the bottom-up procedure will be discussed, and it will be shown how the top-down method is used as a means of verifying uncertainty budgets, based...

  16. Improvement of uncertainty relations for mixed states

    International Nuclear Information System (INIS)

    Park, Yong Moon

    2005-01-01

    We study a possible improvement of uncertainty relations. The Heisenberg uncertainty relation employs commutator of a pair of conjugate observables to set the limit of quantum measurement of the observables. The Schroedinger uncertainty relation improves the Heisenberg uncertainty relation by adding the correlation in terms of anti-commutator. However both relations are insensitive whether the state used is pure or mixed. We improve the uncertainty relations by introducing additional terms which measure the mixtureness of the state. For the momentum and position operators as conjugate observables and for the thermal state of quantum harmonic oscillator, it turns out that the equalities in the improved uncertainty relations hold
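For reference, the two standard relations the abstract builds on can be stated explicitly. For observables A and B with variances (ΔA)² and (ΔB)²:

```latex
% Heisenberg uncertainty relation:
\[
  (\Delta A)^2 (\Delta B)^2 \;\ge\; \tfrac{1}{4}\,\bigl|\langle [A,B] \rangle\bigr|^2
\]
% Schroedinger's refinement adds the symmetrized (anti-commutator) correlation term:
\[
  (\Delta A)^2 (\Delta B)^2 \;\ge\; \tfrac{1}{4}\,\bigl|\langle [A,B] \rangle\bigr|^2
  \;+\; \tfrac{1}{4}\,\bigl|\langle \{A - \langle A\rangle,\; B - \langle B\rangle\} \rangle\bigr|^2
\]
```

Both bounds hold for pure and mixed states alike, which is exactly the insensitivity the paper improves upon by adding terms that measure the mixedness of the state.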

  17. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
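The propagation step at the heart of adjoint-based uncertainty quantification is the first-order "sandwich" rule: the variance of a response R is S C Sᵀ, where S is the vector of sensitivities of R to the nuclear data and C is the data covariance matrix. A generic sketch (not MCNP6 itself):

```python
import numpy as np

def sandwich_uncertainty(S, C):
    """First-order ('sandwich') propagation of a data covariance matrix C
    through a sensitivity vector S: var(R) = S C S^T.
    Returns the standard deviation of the response R."""
    S = np.asarray(S, dtype=float)
    C = np.asarray(C, dtype=float)
    return float(np.sqrt(S @ C @ S))
```

The adjoint machinery described in the report is what makes the sensitivity vector S cheap to obtain; once S and C are in hand, the propagation itself is this one matrix product.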

  18. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    Science.gov (United States)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  19. Effect of activation cross section uncertainties in transmutation analysis of realistic low-activation steels for IFMIF

    Energy Technology Data Exchange (ETDEWEB)

    Cabellos, O.; García-Herranz, N.; Sanz, J. [Institute of Nuclear Fusion, UPM, Madrid (Spain); Cabellos, O.; García-Herranz, N.; Fernández, P.; Fernández, B. [Dept. of Nuclear Engineering, UPM, Madrid (Spain); Sanz, J. [Dept. of Power Engineering, UNED, Madrid (Spain); Reyes, S. [Safety, Environment and Health Group, ITER Joint Work Site, Cadarache Center (France)

    2008-07-01

    We address uncertainty analysis to draw conclusions on the reliability of the activation calculation in the International Fusion Materials Irradiation Facility (IFMIF) under the potential impact of activation cross section uncertainties. The Monte Carlo methodology implemented in the ACAB code gives the uncertainty estimates due to the synergetic/global effect of the complete set of cross section uncertainties. An element-by-element analysis has been demonstrated as a helpful tool to easily analyse the transmutation performance of irradiated materials. The uncertainty analysis results showed that for times over about 24 h the relative error in the contact dose rate can be as large as 23%. We have calculated the effect of cross section uncertainties in the IFMIF activation of all different elements. For EUROFER, uncertainties in H and He elements are 7.3% and 5.6%, respectively. We have found significant uncertainties in the transmutation response for C, P and Nb.

  20. Conditional Betas and Investor Uncertainty

    OpenAIRE

    Fernando D. Chague

    2013-01-01

    We derive theoretical expressions for market betas from a rational expectation equilibrium model where the representative investor does not observe if the economy is in a recession or an expansion. Market betas in this economy are time-varying and related to investor uncertainty about the state of the economy. The dynamics of betas will also vary across assets according to the assets' cash-flow structure. In a calibration exercise, we show that value and growth firms have cash-flow structures...

  1. Aggregate Uncertainty, Money and Banking

    OpenAIRE

    Hongfei Sun

    2006-01-01

    This paper studies the problem of monitoring the monitor in a model of money and banking with aggregate uncertainty. It shows that when inside money is required as a means of bank loan repayment, a market of inside money is entailed at the repayment stage and generates information-revealing prices that perfectly discipline the bank. The incentive problem of a bank is costlessly overcome simply by involving inside money in repayment. Inside money distinguishes itself from outside money by its ...

  2. Decision Under Uncertainty in Diagnosis

    OpenAIRE

    Kalme, Charles I.

    2013-01-01

    This paper describes the incorporation of uncertainty in diagnostic reasoning based on the set covering model of Reggia et al., extended to what, in the Artificial Intelligence dichotomy between deep and compiled (shallow, surface) knowledge based diagnosis, may be viewed as the generic form at the compiled end of the spectrum. A major undercurrent in this is advocating the need for a strong underlying model and an integrated set of support tools for carrying such a model in order to deal with ...

  3. Uncertainty analysis for hot channel

    International Nuclear Information System (INIS)

    Panka, I.; Kereszturi, A.

    2006-01-01

    The fulfillment of the safety analysis acceptance criteria is usually evaluated by separate hot channel calculations using the results of neutronic and/or thermohydraulic system calculations. In case of an ATWS event (inadvertent withdrawal of a control assembly), according to the analysis, a number of fuel rods experience DNB for a longer time and must be regarded as failed. Their number must be determined for a further evaluation of the radiological consequences. In the deterministic approach, the global power history must be multiplied by different hot channel factors (kx) taking into account the radial power peaking factors for each fuel pin. If DNB occurs, it is necessary to perform a small number of hot channel calculations to determine the limiting kx leading just to DNB and fuel failure (the conservative DNBR limit is 1.33). Knowing the pin power distribution from the core design calculation, the number of failed fuel pins can be calculated. The above procedure can be performed with conservative assumptions (e.g. conservative input parameters in the hot channel calculations) as well. In case of hot channel uncertainty analysis, the relevant input parameters of the hot channel calculations (kx, mass flow, inlet temperature of the coolant, pin average burnup, initial gap size, selection of the power history influencing the gap conductance value) and the DNBR limit are varied considering the respective uncertainties. An uncertainty analysis methodology was elaborated combining the response surface method with the one-sided tolerance limit method of Wilks. The results of deterministic and uncertainty hot channel calculations are compared with regard to the number of failed fuel rods, the maximum temperature of the clad surface and the maximum temperature of the fuel (Authors)
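The Wilks method mentioned above determines how many random model runs are needed so that the largest observed output bounds a given quantile with given confidence. For the first-order, one-sided case the criterion is 1 − βᴺ ≥ γ, where β is the coverage and γ the confidence:

```python
def wilks_sample_size(beta=0.95, gamma=0.95):
    """Smallest N of random runs such that the maximum observed output
    bounds the beta quantile with confidence gamma
    (first-order, one-sided Wilks criterion: 1 - beta**N >= gamma)."""
    n = 1
    while 1.0 - beta**n < gamma:
        n += 1
    return n
```

For the common 95%/95% requirement this gives the well-known N = 59 runs.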

  4. Forecast Accuracy Uncertainty and Momentum

    OpenAIRE

    Bing Han; Dong Hong; Mitch Warachka

    2009-01-01

    We demonstrate that stock price momentum and earnings momentum can result from uncertainty surrounding the accuracy of cash flow forecasts. Our model has multiple information sources issuing cash flow forecasts for a stock. The investor combines these forecasts into an aggregate cash flow estimate that has minimal mean-squared forecast error. This aggregate estimate weights each cash flow forecast by the estimated accuracy of its issuer, which is obtained from their past forecast errors. Mome...

  5. Microeconomic Uncertainty and Macroeconomic Indeterminacy

    OpenAIRE

    Fagnart, Jean-François; Pierrard, Olivier; Sneessens, Henri

    2005-01-01

    The paper proposes a stylized intertemporal macroeconomic model wherein the combination of decentralized trading and microeconomic uncertainty (taking the form of privately observed and uninsured idiosyncratic shocks) creates an information problem between agents and generates indeterminacy of the macroeconomic equilibrium. For a given value of the economic fundamentals, the economy admits a continuum of equilibria that can be indexed by the sales expectations of firms at the time of investme...

  6. LOFT differential pressure uncertainty analysis

    International Nuclear Information System (INIS)

    Evans, R.P.; Biladeau, G.L.; Quinn, P.A.

    1977-03-01

    A performance analysis of the LOFT differential pressure (ΔP) measurement is presented. Along with completed descriptions of test programs and theoretical studies that have been conducted on the ΔP, specific sources of measurement uncertainty are identified, quantified, and combined to provide an assessment of the ability of this measurement to satisfy the SDD 1.4.1C (June 1975) requirement of measurement of differential pressure
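A common way to combine independent elemental uncertainty sources into an overall measurement uncertainty, as described above, is the root-sum-square rule; whether the LOFT report used exactly this combination is not stated in the abstract, so the sketch below is a generic illustration.

```python
import math

def combine_rss(elemental_uncertainties):
    """Root-sum-square combination of independent elemental uncertainty
    sources into an overall measurement uncertainty."""
    return math.sqrt(sum(u * u for u in elemental_uncertainties))
```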

  7. Knowledge, decision making, and uncertainty

    International Nuclear Information System (INIS)

    Fox, J.

    1986-01-01

    Artificial intelligence (AI) systems depend heavily upon the ability to make decisions. Decisions require knowledge, yet there is no knowledge-based theory of decision making. To the extent that AI uses a theory of decision-making it adopts components of the traditional statistical view in which choices are made by maximizing some function of the probabilities of decision options. A knowledge-based scheme for reasoning about uncertainty is proposed, which extends the traditional framework but is compatible with it

  8. Accommodating Uncertainty in Prior Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-19

    A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially those involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.
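The effect the report warns about is easy to demonstrate with a conjugate example: two defensible priors for the same binomial proportion yield visibly different posterior means, so a single "precise" prior hides real uncertainty. The example is illustrative and not taken from the report.

```python
def posterior_mean_beta_binomial(a, b, successes, trials):
    """Posterior mean of a binomial proportion under a Beta(a, b) prior:
    (a + successes) / (a + b + trials).  Comparing the result across
    different plausible priors exposes the prior-uncertainty effect."""
    return (a + successes) / (a + b + trials)
```

With 7 successes in 10 trials, a flat Beta(1, 1) prior gives a posterior mean of about 0.667, while an informative Beta(20, 20) prior centered at 0.5 pulls it to 0.54; the spread between such answers is itself part of the uncertainty.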

  9. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.

  10. Uncertainties in environmental radiological assessment models and their implications

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible

  11. Evaluating variability and uncertainty in radiological impact assessment using SYMBIOSE

    International Nuclear Information System (INIS)

    Simon-Cornu, M.; Beaugelin-Seiller, K.; Boyer, P.; Calmon, P.; Garcia-Sanchez, L.; Mourlon, C.; Nicoulaud, V.; Sy, M.; Gonze, M.A.

    2015-01-01

    SYMBIOSE is a modelling platform that accounts for variability and uncertainty in radiological impact assessments, when simulating the environmental fate of radionuclides and assessing doses to human populations. The default database of SYMBIOSE is partly based on parameter values that are summarized within International Atomic Energy Agency (IAEA) documents. To characterize uncertainty on the transfer parameters, 331 Probability Distribution Functions (PDFs) were defined from the summary statistics provided within the IAEA documents (i.e. sample size, minimum and maximum values, arithmetic and geometric means, standard and geometric standard deviations) and are made available as spreadsheet files. The methods used to derive the PDFs without complete data sets, but merely the summary statistics, are presented. Then, a simple case study illustrates the use of the database in a second-order Monte Carlo calculation, separating parametric uncertainty and inter-individual variability. - Highlights: • Parametric uncertainty in radioecology was derived from IAEA documents. • 331 Probability Distribution Functions were defined for transfer parameters. • Parametric uncertainty and inter-individual variability were propagated
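The kind of PDF construction described, from geometric mean and geometric standard deviation alone, can be sketched as follows; the transfer-factor values are hypothetical, not taken from the SYMBIOSE database.

```python
import math
import random

random.seed(1)

def lognormal_from_gm_gsd(gm, gsd):
    """Return a sampler for a lognormal defined by its geometric mean and
    geometric standard deviation -- a common form for IAEA summary
    statistics on transfer parameters."""
    mu, sigma = math.log(gm), math.log(gsd)
    return lambda: random.lognormvariate(mu, sigma)

# Hypothetical transfer factor: geometric mean 0.2, geometric SD 2.5.
sample_tf = lognormal_from_gm_gsd(gm=0.2, gsd=2.5)
draws = [sample_tf() for _ in range(200_000)]

# Recover the geometric mean from the samples as a sanity check.
gm_est = math.exp(sum(math.log(x) for x in draws) / len(draws))
print(f"estimated geometric mean ≈ {gm_est:.3f}")
```

A second-order (two-loop) Monte Carlo would then draw the PDF parameters themselves in an outer loop while sampling individual variability in an inner loop.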

  12. Uncertainty quantification of CO2 emission reduction for maritime shipping

    International Nuclear Information System (INIS)

    Yuan, Jun; Ng, Szu Hui; Sou, Weng Sut

    2016-01-01

    The International Maritime Organization (IMO) has recently proposed several operational and technical measures to improve shipping efficiency and reduce greenhouse gas (GHG) emissions. The abatement potentials estimated for these measures have been further used by many organizations to project future GHG emission reductions and plot Marginal Abatement Cost Curves (MACC). However, the abatement potentials estimated for many of these measures can be highly uncertain, as many of the measures are new, with limited sea trial information. Furthermore, the abatements obtained are highly dependent on ocean conditions, trading routes and sailing patterns. When the estimated abatement potentials are used for projections, these ‘input’ uncertainties are often not clearly displayed or accounted for, which can lead to overly optimistic or pessimistic outlooks. In this paper, we propose a methodology to systematically quantify and account for these input uncertainties in the overall abatement potential forecasts. We further propose improvements to MACCs to better reflect the uncertainties in marginal abatement costs and total emissions. This approach provides a fuller and more accurate picture of abatement forecasts and potential reductions achievable, and will be useful to policy makers and decision makers in the shipping industry to better assess cost-effective measures for CO2 emission reduction. - Highlights: • We propose a systematic method to quantify uncertainty in emission reduction. • Marginal abatement cost curves are improved to better reflect the uncertainties. • Percentage reduction probability is given to determine emission reduction target. • The methodology is applied to a case study on maritime shipping.
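A minimal sketch of the kind of input-uncertainty propagation described, with hypothetical measures, abatement ranges, and baseline emissions (none of these numbers are from the paper):

```python
import random

random.seed(42)

# Each measure's abatement potential is a (low, high) fraction of baseline
# emissions, modelled here as uniform for simplicity. All values invented.
measures = {
    "speed_reduction": (0.05, 0.15),
    "hull_coating":    (0.01, 0.05),
    "weather_routing": (0.02, 0.06),
}

baseline_mt = 900.0  # hypothetical baseline CO2 emissions, million tonnes

def total_reduction():
    """One Monte Carlo draw of the total abatement, in million tonnes."""
    return baseline_mt * sum(random.uniform(lo, hi) for lo, hi in measures.values())

draws = sorted(total_reduction() for _ in range(50_000))
p5, p50, p95 = (draws[int(q * len(draws))] for q in (0.05, 0.50, 0.95))
print(f"reduction: 5th={p5:.0f}, median={p50:.0f}, 95th={p95:.0f} Mt")
```

Reporting the spread of the distribution, rather than a single abatement figure, is what keeps the resulting MACC from looking more certain than the inputs allow.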

  13. Chemical model reduction under uncertainty

    KAUST Repository

    Malpica Galassi, Riccardo

    2017-03-06

    A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reaction detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.

  14. Different top-down approaches to estimate measurement uncertainty of whole blood tacrolimus mass concentration values.

    Science.gov (United States)

    Rigo-Bonnin, Raül; Blanco-Font, Aurora; Canalias, Francesca

    2018-05-08

    Values of mass concentration of tacrolimus in whole blood are commonly used by clinicians for monitoring the status of a transplant patient and for checking whether the administered dose of tacrolimus is effective. Clinical laboratories must therefore provide results that are as accurate as possible, and measurement uncertainty can help ensure the reliability of these results. The aim of this study was to estimate the measurement uncertainty of whole blood mass concentration tacrolimus values obtained by UHPLC-MS/MS using two top-down approaches: the single laboratory validation approach and the proficiency testing approach. For the single laboratory validation approach, we estimated the uncertainties associated with the intermediate imprecision (using long-term internal quality control data) and the bias (using a certified reference material). We then combined them, together with the uncertainties related to the calibrator-assigned values, to obtain a combined uncertainty and, finally, calculated the expanded uncertainty. For the proficiency testing approach, the uncertainty was estimated in a similar way to the single laboratory validation approach, but considering data from internal and external quality control schemes to estimate the uncertainty related to the bias. The estimated expanded uncertainties for single laboratory validation and for proficiency testing using internal and external quality control schemes were 11.8%, 13.2%, and 13.0%, respectively. After performing the two top-down approaches, we observed that their uncertainty results were quite similar. This would confirm that either approach could be used to estimate the measurement uncertainty of whole blood mass concentration tacrolimus values in clinical laboratories. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
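The single-laboratory-validation combination described can be sketched with the standard GUM quadrature rule; the component values below are hypothetical placeholders, not the paper's data.

```python
import math

# Combine relative standard uncertainties in quadrature and expand with a
# coverage factor k = 2 (~95 % coverage). All component values invented.
u_imprecision = 0.048   # long-term intermediate imprecision (relative)
u_bias        = 0.025   # bias from a certified reference material (relative)
u_calibrator  = 0.015   # calibrator assigned-value uncertainty (relative)

u_combined = math.sqrt(u_imprecision**2 + u_bias**2 + u_calibrator**2)
U_expanded = 2 * u_combined   # coverage factor k = 2

print(f"expanded uncertainty ≈ {U_expanded * 100:.1f} %")
```

The proficiency-testing variant differs only in how the bias component is estimated, which is why the two approaches land on similar expanded uncertainties.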

  15. Development of Probabilistic Internal Dosimetry Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Siwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kwon, Tae-Eun [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of); Lee, Jai-Ki [Korean Association for Radiation Protection, Seoul (Korea, Republic of)

    2017-02-15

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was constructed. Based on the developed system, we developed a probabilistic internal-dose-assessment code in MATLAB to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g. the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations.
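The propagation chain from measured data to committed dose can be sketched as a simple Monte Carlo; the retention fraction, dose coefficient, and their uncertainty factors below are hypothetical placeholders, not values from the code's database.

```python
import math
import random

random.seed(7)

N = 100_000
M = 50.0  # hypothetical measured urine activity, Bq

def ln(gm, gsd):
    """Draw from a lognormal given geometric mean and geometric SD."""
    return random.lognormvariate(math.log(gm), math.log(gsd))

# dose = (measured activity / uncertain retention fraction) * uncertain
# dose coefficient; every factor in the chain carries its own uncertainty.
doses = sorted(
    (M / ln(0.02, 1.5))     # intake: retention fraction ~ 2 %
    * ln(1.0e-5, 1.8)       # dose coefficient, mSv/Bq
    for _ in range(N)
)

median = doses[N // 2]
p975 = doses[int(0.975 * N)]
print(f"median ≈ {median:.3f} mSv, 97.5th ≈ {p975:.3f} mSv")
```

Repeating the calculation with one component held fixed at its central value shows how much that component contributes to the spread, which is the essence of the sensitivity analysis the abstract describes.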

  16. Development of Probabilistic Internal Dosimetry Computer Code

    International Nuclear Information System (INIS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-01-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was constructed. Based on the developed system, we developed a probabilistic internal-dose-assessment code in MATLAB to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g. the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations.

  17. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Science.gov (United States)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design using well understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty are rendered obsolete, since changes in sensitivity to uncertainty are not reflected in the propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  18. Heat driven pulse pump

    Science.gov (United States)

    Benner, Steve M (Inventor); Martins, Mario S. (Inventor)

    2000-01-01

    A heat driven pulse pump includes a chamber having an inlet port, an outlet port, two check valves, a wick, and a heater. The chamber may include a plurality of grooves along its inside wall. When heated within the chamber, a liquid to be pumped vaporizes and creates a pressure head that expels the liquid through the outlet port. The wick, disposed within the chamber, serves as a liquid-separating means: when saturated with the liquid, it allows the passage of only the liquid forced by the pressure head in the chamber, preventing the vapor from exiting through the outlet port. The plurality of grooves along the inside surface wall of the chamber can hold liquid in an amount sufficient to produce vapor for the pressure head in the chamber. With only two simple moving parts, the two check valves, the heat driven pulse pump can function effectively over a long lifetime without maintenance or replacement. For continuous flow of the liquid to be pumped, a plurality of pumps may be connected in parallel.

  19. Plasma-driven liners

    International Nuclear Information System (INIS)

    Kilic, H.; Linhart, J.G.; Bortolotti, A.; Nardi, V.

    1992-01-01

    The deposition of thermal energy by laser or ion beams in an ablator is capable of producing a very large acceleration of the adjacent pusher - for power densities of 100 terawatts/cm2, ablator pressures in the range of 10 Mbar are attainable. In the case of a plasma drive, such driving pressures and accelerations are not directly possible. When a snowplough (SP) is used to accelerate a thin liner, the driving pressure is that of the magnetic piston pushing the SP, i.e. at most 0.1 Mbar. However, the initial radius r0 of the liner can be a few centimeters, instead of about 1 mm as in the case of direct pellet implosions. In order to compete with the performance of the beam-driven liners, the plasma drive must demonstrate that a) the thin liner retains a high density during the implosion (lasting a fraction of a μsec); and b) a radial compression ratio r0/rmin of the order of 100 can be attained. It is also attractive to consider the staging of two or more liners in order to obtain sharpening and amplification of the pressure and/or radiation pulse. If a) and b) are verified then the final pressures produced will be comparable with those of the beam-driven implosions. (author) 5 refs., 3 figs

  20. Documentation Driven Software Development

    Science.gov (United States)

    2010-06-01

    governmental organizations, and private companies and volunteer organizations. There is usually very little overlap between each of the aforementioned...Process Variables in Complex B2B Systems Integration Assessment,” Proceedings of the IEEE International Conference on E-Commerce Technology (CEC 2004...Innovation Engine for Online Analytics and Information Assurance in Enterprise B2B Infrastructure,” Paper presented at the Fifth International

  1. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop

  2. A Business Ecosystem Driven Market Analysis

    DEFF Research Database (Denmark)

    Ma, Zheng; Billanes, Joy Dalmacio; Jørgensen, Bo Nørregaard

    2017-01-01

    Due to the huge globally emerging market for bright green buildings, this paper aims to develop a business-ecosystem-driven market analysis approach for the investigation of the bright green building market. The paper develops a five-step business-ecosystem-driven market analysis (definition of the business domain, stakeholder listing, integration of the value chain, relationship mapping, and ego innovation ecosystem mapping). It finds that global-local matters influence the market structure: technologies for building energy are developed and employed globally, while market demand is comparatively localized. The market players can be both local and international stakeholders who engage and collaborate in building projects. The paper also finds that building extensibility should be considered in the building design due to the gap between current market...

  3. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. 
A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
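Among the screening methods listed, one-at-a-time (OAT) perturbation is the simplest; the following is a minimal sketch on a toy model (the model, base point, and perturbation size are illustrative, not from the text):

```python
# One-at-a-time (OAT) screening on a toy model: perturb each parameter
# by +10 % from a base point and rank the resulting output changes.

def model(p):
    """Toy response: nonlinear in p[0], interactive in p[1] * p[2]."""
    return p[0] ** 2 + 0.5 * p[1] * p[2]

base = [1.0, 2.0, 3.0]
effects = []
for i in range(len(base)):
    perturbed = list(base)
    perturbed[i] *= 1.10                       # +10 % perturbation
    effects.append((i, model(perturbed) - model(base)))

# Rank parameters by the magnitude of their effect on the output.
for i, d in sorted(effects, key=lambda t: -abs(t[1])):
    print(f"parameter {i}: effect {d:+.3f}")
```

OAT is cheap but misses interactions; the Morris method extends it with randomized trajectories, and variance-based methods like Sobol' capture interactions at much higher computational cost.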

  4. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user oriented, automated uncertainty analysis capability has been built into the FRAP code (Fuel Rod Analysis Program) and applied to a PWR fuel rod undergoing a LOCA. The method of uncertainty analysis is the Response Surface Method (RSM). (author)

  5. Two multi-dimensional uncertainty relations

    International Nuclear Information System (INIS)

    Skala, L; Kapsa, V

    2008-01-01

    Two multi-dimensional uncertainty relations, one related to the probability density and the other one related to the probability density current, are derived and discussed. Both relations are stronger than the usual uncertainty relations for the coordinates and momentum

  6. Change and uncertainty in quantum systems

    International Nuclear Information System (INIS)

    Franson, J.D.

    1996-01-01

    A simple inequality shows that any change in the expectation value of an observable quantity must be associated with some degree of uncertainty. This inequality is often more restrictive than the Heisenberg uncertainty principle. copyright 1996 The American Physical Society
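The kind of bound described can be illustrated by the textbook Mandelstam-Tamm argument (a standard relation, not necessarily the paper's exact inequality): combining Robertson's uncertainty relation with Ehrenfest's theorem ties any change in an expectation value to the uncertainties in the observable and the energy.

```latex
% Robertson's relation for an observable A and the Hamiltonian H:
\Delta A \,\Delta H \;\ge\; \tfrac{1}{2}\bigl|\langle [A, H] \rangle\bigr|
% Ehrenfest's theorem for a time-independent observable A:
\frac{d\langle A\rangle}{dt} \;=\; \frac{i}{\hbar}\,\langle [H, A] \rangle
% Combining the two bounds the rate of change of the expectation value:
\left|\frac{d\langle A\rangle}{dt}\right| \;\le\; \frac{2}{\hbar}\,\Delta A\,\Delta H .
```

In words: an expectation value can only change quickly if the observable's uncertainty, the energy uncertainty, or both are large.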

  7. Measure of uncertainty in regional grade variability

    NARCIS (Netherlands)

    Tutmez, B.; Kaymak, U.; Melin, P.; Castillo, O.; Gomez Ramirez, E.; Kacprzyk, J.; Pedrycz, W.

    2007-01-01

    Because geological events are neither homogeneous nor isotropic, geological investigations are characterized by particularly high uncertainties. This paper presents a hybrid methodology for measuring uncertainty in regional grade variability. In order to evaluate the fuzziness in grade

  8. Uncertainty and its propagation in dynamics models

    International Nuclear Information System (INIS)

    Devooght, J.

    1994-01-01

    The purpose of this paper is to bring together some characteristics due to uncertainty when we deal with dynamic models and therefore with propagation of uncertainty. The respective roles of uncertainty and inaccuracy are examined. A mathematical formalism based on the Chapman-Kolmogorov equation allows us to define a "subdynamics" where the evolution equation takes the uncertainty into account. The problem of choosing or combining models is examined through a loss function associated with a decision

  9. Some illustrative examples of model uncertainty

    International Nuclear Information System (INIS)

    Bier, V.M.

    1994-01-01

    In this paper, we first discuss the view of model uncertainty proposed by Apostolakis. We then present several illustrative examples related to model uncertainty, some of which are not well handled by this formalism. Thus, Apostolakis' approach seems to be well suited to describing some types of model uncertainty, but not all. Since a comprehensive approach for characterizing and quantifying model uncertainty is not yet available, it is hoped that the examples presented here will serve as a springboard for further discussion

  10. The Uncertainty Multiplier and Business Cycles

    OpenAIRE

    Saijo, Hikaru

    2013-01-01

    I study a business cycle model where agents learn about the state of the economy by accumulating capital. During recessions, agents invest less, and this generates noisier estimates of macroeconomic conditions and an increase in uncertainty. The endogenous increase in aggregate uncertainty further reduces economic activity, which in turn leads to more uncertainty, and so on. Thus, through changes in uncertainty, learning gives rise to a multiplier effect that amplifies business cycles. I use ...

  11. Development and application of methods to characterize code uncertainty

    International Nuclear Information System (INIS)

    Wilson, G.E.; Burtt, J.D.; Case, G.S.; Einerson, J.J.; Hanson, R.G.

    1985-01-01

    The United States Nuclear Regulatory Commission sponsors both international and domestic studies to assess its safety analysis codes. The Commission staff intends to use the results of these studies to quantify the uncertainty of the codes with a statistically based analysis method. Development of the methodology is underway. The Idaho National Engineering Laboratory contributions to the early development effort, and testing of two candidate methods are the subjects of this paper

  12. Uncertainty Model for Total Solar Irradiance Estimation on Australian Rooftops

    Science.gov (United States)

    Al-Saadi, Hassan; Zivanovic, Rastko; Al-Sarawi, Said

    2017-11-01

    The installation of solar panels on Australian rooftops has been on the rise in the last few years, especially in urban areas. This motivates academic researchers, distribution network operators and engineers to accurately address the level of uncertainty resulting from grid-connected solar panels. The main source of uncertainty is the intermittent nature of radiation; therefore, this paper presents a new model to estimate the total radiation incident on a tilted solar panel. The model is driven by the clearness index, for which a probability distribution is factorized, with special attention paid to Australia through the utilization of a best-fit correlation for the diffuse fraction. The validity of the model is assessed with the adoption of four goodness-of-fit techniques. In addition, the Quasi Monte Carlo and sparse grid methods are used as sampling and uncertainty computation tools, respectively. High-resolution solar irradiation data for the city of Adelaide were used for this assessment, with the outcome indicating a satisfactory agreement between the actual data variation and the model.

  13. Uncertainty Characterization of Reactor Vessel Fracture Toughness

    International Nuclear Information System (INIS)

    Li, Fei; Modarres, Mohammad

    2002-01-01

    To perform fracture mechanics analysis of a reactor vessel, fracture toughness (KIc) at various temperatures is necessary. In a best estimate approach, KIc uncertainties resulting from both lack of sufficient knowledge and randomness in some of the variables of KIc must be characterized. Although it may be argued that there is only one type of uncertainty, namely lack of perfect knowledge about the subject under study, as a matter of practice KIc uncertainties can be divided into two types: aleatory and epistemic. Aleatory uncertainty is related to uncertainty that is very difficult, if not impossible, to reduce; epistemic uncertainty, on the other hand, can be practically reduced. The distinction between aleatory and epistemic uncertainties facilitates decision-making under uncertainty and allows for proper propagation of uncertainties in the computation process. Typically, epistemic uncertainties representing, for example, parameters of a model are sampled (to generate a 'snapshot', a single value of the parameters), but the totality of aleatory uncertainties is carried through the calculation as available. In this paper a description of an approach to account for these two types of uncertainties associated with KIc is provided. (authors)
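The sampling scheme described, epistemic parameters drawn as "snapshots" while aleatory scatter is carried through in full, is a two-loop (double) Monte Carlo; it can be sketched as follows, with all numbers illustrative rather than actual fracture toughness data:

```python
import random
import statistics

random.seed(0)

def fracture_toughness(mean, sd):
    """One aleatory draw of a toughness-like quantity (illustrative)."""
    return random.gauss(mean, sd)

outer_medians = []
for _ in range(200):                      # epistemic (outer) loop
    mean = random.uniform(90.0, 110.0)    # uncertain model parameter
    sd = random.uniform(8.0, 12.0)        # uncertain scatter parameter
    inner = [fracture_toughness(mean, sd) for _ in range(2_000)]  # aleatory loop
    outer_medians.append(statistics.median(inner))

lo, hi = min(outer_medians), max(outer_medians)
print(f"epistemic spread of the aleatory median: [{lo:.1f}, {hi:.1f}]")
```

The result is a family of aleatory distributions rather than a single one; the spread across the family expresses the epistemic uncertainty, which shrinks as knowledge improves, while the width of each member expresses the irreducible aleatory scatter.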

  14. Uncertainty in prediction and in inference

    NARCIS (Netherlands)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in

  15. Entropic uncertainty relations-a survey

    International Nuclear Information System (INIS)

    Wehner, Stephanie; Winter, Andreas

    2010-01-01

    Uncertainty relations play a central role in quantum mechanics. Entropic uncertainty relations in particular have gained significant importance within quantum information, providing the foundation for the security of many quantum cryptographic protocols. Yet, little is known about entropic uncertainty relations with more than two measurement settings. In the present survey, we review known results and open questions.

  16. Flood modelling : Parameterisation and inflow uncertainty

    NARCIS (Netherlands)

    Mukolwe, M.M.; Di Baldassarre, G.; Werner, M.; Solomatine, D.P.

    2014-01-01

    This paper presents an analysis of uncertainty in hydraulic modelling of floods, focusing on the inaccuracy caused by inflow errors and parameter uncertainty. In particular, the study develops a method to propagate the uncertainty induced by, firstly, application of a stage–discharge rating curve
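
    A common way to propagate rating-curve uncertainty into inflows is Monte Carlo sampling of the fitted curve parameters. The sketch below assumes a power-law rating Q = a(h - h0)^b with invented parameter values and uncertainties (not those of the study):

    ```python
    import random
    import statistics

    random.seed(0)

    # Hypothetical stage-discharge rating curve Q = a * (h - h0)^b with
    # uncertain fitted parameters a and b; all numbers are illustrative.
    H0 = 0.2            # datum offset (m), assumed known
    STAGE = 3.0         # observed stage (m)

    samples = [
        random.gauss(25.0, 2.0) * (STAGE - H0) ** random.gauss(1.6, 0.05)
        for _ in range(20000)
    ]
    q = statistics.mean(samples)
    cv = statistics.stdev(samples) / q
    print(f"inflow ~ {q:.0f} m^3/s, coefficient of variation {cv:.1%}")
    ```

    The resulting inflow ensemble can then be fed to the hydraulic model alongside sampled roughness parameters, so that inflow error and parameter uncertainty are propagated jointly.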

  17. Adult head CT scans: the uncertainties of effective dose estimates

    International Nuclear Information System (INIS)

    Gregory, Kent J.; Bibbo, Giovanni; Pattison, John E.

    2008-01-01

    Full Text: CT scanning is a high-dose imaging modality. Effective dose estimates from CT scans can provide important information to patients and medical professionals. For example, medical practitioners can use the dose to estimate the risk to the patient, and judge whether this risk is outweighed by the benefits of the CT examination, while radiographers can gauge the effect of different scanning protocols on the patient effective dose, and take this into consideration when establishing routine scan settings. Dose estimates also form an important part of epidemiological studies examining the health effects of medical radiation exposures on the wider population. Medical physicists have been devoting significant effort towards estimating patient radiation doses from diagnostic CT scans for some years. The question arises: How accurate are these effective dose estimates? The need for a greater understanding and improvement of the uncertainties in CT dose estimates is now gaining recognition as an important issue (BEIR VII 2006). This study is an attempt to analyse and quantify the uncertainty components relating to effective dose estimates from adult head CT examinations that are calculated with four commonly used methods. The dose estimation methods analysed are the Nagel method, the ImpaCT method, the Wellhoefer method and the Dose-Length Product (DLP) method. The analysis of the uncertainties was performed in accordance with the International Organization for Standardization's Guide to the Expression of Uncertainty in Measurement as discussed in Gregory et al (Australas. Phys. Eng. Sci. Med., 28: 131-139, 2005). The uncertainty components vary, depending on the method used to derive the effective dose estimate. Uncertainty components in this study include the statistical and other errors from Monte Carlo simulations, uncertainties in the CT settings and positions of patients in the CT gantry, calibration errors from pencil ionization chambers, the variations in the organ
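
    For the DLP method, the effective dose is a simple product E = k x DLP, so the GUM combination of uncorrelated relative standard uncertainties reduces to addition in quadrature. The component magnitudes below are illustrative assumptions, not the study's results:

    ```python
    import math

    # Effective dose via the DLP method: E = k * DLP. For a product of
    # independent factors, relative standard uncertainties add in quadrature.
    DLP = 940.0          # mGy*cm, as reported by the scanner console (example value)
    k_head = 0.0021      # mSv / (mGy*cm), adult-head conversion coefficient (assumed)

    u_rel = {
        "conversion coefficient k (anthropomorphic-model spread)": 0.15,
        "console DLP (chamber calibration, patient positioning)": 0.07,
    }
    E = k_head * DLP
    u_E = E * math.sqrt(sum(u * u for u in u_rel.values()))
    print(f"E = {E:.2f} mSv +/- {u_E:.2f} mSv (1 sigma)")
    ```

    Because the conversion-coefficient term dominates the quadrature sum, refining the smaller components barely changes the combined uncertainty, which is why the study's budgets differ mainly through their dominant terms.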

  18. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  19. Employee-driven innovation

    DEFF Research Database (Denmark)

    Kesting, Peter; Ulhøi, John Parm

    2015-01-01

    Purpose – The purpose of this paper is to outline the “grand structure” of the phenomenon in order to identify both the underlying processes and core drivers of employee-driven innovation (EDI). Design/methodology/approach – This is a conceptual paper. It particularly applies the insights...... of contemporary research on routine and organizational decision making to the specific case of EDI. Findings – The main result of the paper is that, from a theoretical point of view, it makes perfect sense to involve ordinary employees in innovation decisions. However, it is also outlined that naïve or ungoverned...... participation is counterproductive, and that it is quite difficult to realize the hidden potential in a supportive way. Research limitations/implications – The main implication is that basic mechanisms for employee participation also apply to innovation decisions, although often in a different way. However...

  20. Temperature-Driven Convection

    Science.gov (United States)

    Bohan, Richard J.; Vandegrift, Guy

    2003-02-01

    Warm air aloft is stable. This explains the lack of strong winds in a warm front and how nighttime radiative cooling can lead to motionless air that can trap smog. The stability of stratospheric air can be attributed to the fact that it is heated from above as ultraviolet radiation strikes the ozone layer. On the other hand, fluid heated from below is unstable and can lead to Bénard convection cells. This explains the generally turbulent nature of the troposphere, which receives a significant fraction of its heat directly from the Earth's warmer surface. The instability of cold fluid aloft explains the violent nature of a cold front, as well as the motion of Earth's magma, which is driven by radioactive heating deep within the Earth's mantle. This paper describes how both effects can be demonstrated using four standard beakers, ice, and a bit of food coloring.

  1. Emotion-driven level generation

    OpenAIRE

    Togelius, Julian; Yannakakis, Georgios N.

    2016-01-01

    This chapter examines the relationship between emotions and level generation. Grounded in the experience-driven procedural content generation framework we focus on levels and introduce a taxonomy of approaches for emotion-driven level generation. We then review four characteristic level generators of our earlier work that exemplify each one of the approaches introduced. We conclude the chapter with our vision on the future of emotion-driven level generation.

  2. The impact of incomplete information on the use of marketing research intelligence in international service settings: An experimental study

    NARCIS (Netherlands)

    Birgelen, van M.; Ruyter, de J.C.; Wetzels, M.G.M.

    2000-01-01

    Unfamiliarity with foreign business environments and cultures will result in higher levels of uncertainty, especially for international service organizations. To effectively deal with international uncertainty, it seems crucial to have access to information that is as complete as possible. In

  3. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
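
    The core effect - parameter error inflating the realized failure frequency above the nominal level - can be illustrated with a small simulation for the log-normal case. Sample size and the nominal target below are arbitrary choices, not the article's settings:

    ```python
    import random
    import statistics

    random.seed(7)

    MU, SIGMA = 0.0, 1.0     # true (but unknown) parameters of the log-loss
    N_DATA = 30              # small data set -> material parameter error
    TARGET = 0.01            # nominal failure probability
    Z_99 = 2.326347874       # standard normal 99th percentile

    hits = 0
    trials = 4000
    for _ in range(trials):
        # Estimate the threshold from a finite sample, as a decision-maker would.
        data = [random.gauss(MU, SIGMA) for _ in range(N_DATA)]  # observed log-losses
        threshold = statistics.mean(data) + Z_99 * statistics.stdev(data)
        # Then check whether the next (true-distribution) loss exceeds it.
        if random.gauss(MU, SIGMA) > threshold:
            hits += 1

    print(f"realized failure frequency: {hits / trials:.3f} (nominal target {TARGET})")
    ```

    Plugging sample estimates into the normal quantile ignores estimation error; the predictive distribution is a wider Student-t, so the realized exceedance frequency comes out above the nominal 1% for small samples, which is the phenomenon the article quantifies exactly.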

  4. Effects-Driven IT Development

    DEFF Research Database (Denmark)

    Hertzum, Morten; Simonsen, Jesper

    2010-01-01

    We present effects-driven IT development as an instrument for pursuing and reinforcing Participatory Design (PD) when it is applied in commercial information technology (IT) projects. Effects-driven IT development supports the management of a sustained PD process throughout design and organizatio......

  5. Quantum Uncertainty and Fundamental Interactions

    Directory of Open Access Journals (Sweden)

    Tosto S.

    2013-04-01

    Full Text Available The paper proposes a simplified theoretical approach to infer some essential concepts on the fundamental interactions between charged particles and their relative strengths at comparable energies by exploiting the quantum uncertainty only. The worth of the present approach relies on the way of obtaining the results, rather than on the results themselves: concepts today acknowledged as fingerprints of the electroweak and strong interactions appear indeed rooted in the same theoretical frame including also the basic principles of special and general relativity along with the gravity force.

  6. Uncertainty analysis in seismic tomography

    Science.gov (United States)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    Velocity fields from seismic travel-time tomography depend on several factors, such as regularization, inversion path, and model parameterization. The result also strongly depends on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, in our analysis the uncertainty distribution for manual travel-time picking is asymmetric. This effect shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomographic code. We use data from geo-engineering and industrial-scale investigations, which were collected by our team from IG PAS.
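
    The influence of picking errors on recovered velocities can be illustrated with a toy simulation: because velocity v = d/t is a nonlinear (convex) function of travel time, even zero-mean picking errors bias the mean recovered velocity, and skewness in the error distribution modifies that bias. The offset, velocity, and error scales below are invented for illustration only:

    ```python
    import random
    import statistics

    random.seed(3)

    DIST = 1000.0        # source-receiver offset (m), hypothetical
    V_TRUE = 2500.0      # true velocity (m/s)
    T_TRUE = DIST / V_TRUE

    def pick_error():
        # 30% chance of a zero-mean but positively skewed "late pick" tail,
        # otherwise small symmetric noise; scales are illustrative.
        if random.random() < 0.3:
            return random.expovariate(1 / 0.03) - 0.03
        return random.gauss(0.0, 0.01)

    v_est = [DIST / (T_TRUE + pick_error()) for _ in range(50000)]
    bias = statistics.mean(v_est) - V_TRUE
    print(f"mean recovered velocity: {statistics.mean(v_est):.0f} m/s (bias {bias:+.1f} m/s)")
    ```

    Comparing runs with symmetric versus skewed error distributions is one way to reproduce, in miniature, the asymmetry effect the abstract reports.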

  7. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....

  8. Medical Need, Equality, and Uncertainty.

    Science.gov (United States)

    Horne, L Chad

    2016-10-01

    Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality. © 2016 John Wiley & Sons Ltd.

  9. Analysis of uncertainties in the IAEA/WHO TLD postal dose audit system

    Energy Technology Data Exchange (ETDEWEB)

    Izewska, J. [Department of Nuclear Sciences and Applications, International Atomic Energy Agency, Wagramer Strasse 5, Vienna (Austria)], E-mail: j.izewska@iaea.org; Hultqvist, M. [Department of Medical Radiation Physics, Karolinska Institute, Stockholm University, Stockholm (Sweden); Bera, P. [Department of Nuclear Sciences and Applications, International Atomic Energy Agency, Wagramer Strasse 5, Vienna (Austria)

    2008-02-15

    The International Atomic Energy Agency (IAEA) and the World Health Organisation (WHO) operate the IAEA/WHO TLD postal dose audit programme. Thermoluminescence dosimeters (TLDs) are used as transfer devices in this programme. In the present work the uncertainties in the dose determination from TLD measurements have been evaluated. The analysis of uncertainties comprises uncertainties in the calibration coefficient of the TLD system and uncertainties in factors correcting for dose response non-linearity, fading of TL signal, energy response and influence of TLD holder. The individual uncertainties have been combined to estimate the total uncertainty in the dose evaluated from TLD measurements. The combined relative standard uncertainty in the dose determined from TLD measurements has been estimated to be 1.2% for irradiations with Co-60 γ-rays and 1.6% for irradiations with high-energy X-rays. Results from irradiations by the Bureau international des poids et mesures (BIPM), Primary Standard Dosimetry Laboratories (PSDLs) and Secondary Standards Dosimetry Laboratories (SSDLs) compare favourably with the estimated uncertainties, whereas TLD results of radiotherapy centres show higher standard deviations than those derived theoretically.
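
    Independent calibration and correction-factor uncertainties of this kind combine in quadrature. The sketch below uses invented component values of roughly the right order of magnitude, not the budget published in the paper:

    ```python
    import math

    # Illustrative relative standard uncertainties (NOT the paper's actual
    # component values) for a TLD-based dose determination:
    budget = {
        "system calibration coefficient": 0.008,
        "dose-response non-linearity correction": 0.004,
        "fading correction": 0.005,
        "energy response correction": 0.005,
        "holder correction": 0.002,
    }
    # GUM combination for a product of independent correction factors:
    u_c = math.sqrt(sum(u * u for u in budget.values()))
    print(f"combined relative standard uncertainty: {u_c:.2%}")
    ```

    A budget laid out this way also makes the paper's closing comparison natural: station results scattering more widely than the combined theoretical value signal uncertainty sources missing from the budget.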

  10. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t began more than 20 years ago. Since that time significant advancements have taken place in the requirements for safety assessment as well as in computer hardware. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/, whose development began in the early nineteen-nineties. However, UG has recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++.
A direct estimation of uncertainties for complex groundwater flow models with the

  11. The economic implications of carbon cycle uncertainty

    International Nuclear Information System (INIS)

    Smith, Steven J.; Edmonds, James A.

    2006-01-01

    This paper examines the implications of uncertainty in the carbon cycle for the cost of stabilizing carbon dioxide concentrations. Using a state-of-the-art integrated assessment model, we find that uncertainty in our understanding of the carbon cycle has significant implications for the costs of a climate stabilization policy, with cost differences denominated in trillions of dollars. Uncertainty in the carbon cycle is equivalent to a change in concentration target of up to 100 ppmv. The impact of carbon cycle uncertainties is smaller than that of climate sensitivity, and broadly comparable to the effect of uncertainty in technology availability

  12. Uncertainty budget for k0-NAA

    International Nuclear Information System (INIS)

    Robouch, P.; Arana, G.; Eguskiza, M.; Etxebarria, N.

    2000-01-01

    The concepts of the Guide to the Expression of Uncertainty in Measurement (GUM) for chemical measurements and the recommendations of the Eurachem document 'Quantifying Uncertainty in Analytical Methods' are applied to set up the uncertainty budget for k₀-NAA. The 'universally applicable spreadsheet technique' described by Kragten is applied to the k₀-NAA basic equations for the computation of uncertainties. The variance components - individual standard uncertainties - highlight the contribution and the importance of the different parameters to be taken into account. (author)
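
    Kragten's spreadsheet technique approximates each variance component numerically by perturbing one input at a time by its standard uncertainty and recording the change in the result. A minimal sketch with a toy model y = a*b/c standing in for the full k₀-NAA equation (all values invented):

    ```python
    def result(inputs):
        # toy multiplicative model in place of the k0-NAA basic equation
        return inputs["a"] * inputs["b"] / inputs["c"]

    values = {"a": 1.50, "b": 0.80, "c": 2.00}     # illustrative input values
    uncerts = {"a": 0.02, "b": 0.01, "c": 0.05}    # their standard uncertainties

    y0 = result(values)
    components = {}
    for name in values:
        bumped = dict(values)
        bumped[name] += uncerts[name]              # perturb ONE input at a time
        components[name] = result(bumped) - y0     # this input's contribution

    # Combined standard uncertainty: quadrature sum of the contributions.
    u_y = sum(c * c for c in components.values()) ** 0.5
    for name, c in sorted(components.items(), key=lambda kv: -abs(kv[1])):
        print(f"{name}: {c:+.4f}  ({(c * c) / (u_y * u_y):.0%} of variance)")
    print(f"combined standard uncertainty u(y) = {u_y:.4f}")
    ```

    The appeal of the technique is exactly what the abstract highlights: the individual contributions fall out of the same table that produces the combined uncertainty, so the dominant parameters are immediately visible.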

  13. Risk uncertainty analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives

  14. Uncertainty Communication. Issues and good practice

    International Nuclear Information System (INIS)

    Kloprogge, P.; Van der Sluijs, J.; Wardekker, A.

    2007-12-01

    In 2003 the Netherlands Environmental Assessment Agency (MNP) published the RIVM/MNP Guidance for Uncertainty Assessment and Communication. The Guidance assists in dealing with uncertainty in environmental assessments. Dealing with uncertainty is essential because assessment results regarding complex environmental issues are of limited value if the uncertainties have not been taken into account adequately. A careful analysis of uncertainties in an environmental assessment is required, but even more important is the effective communication of these uncertainties in the presentation of assessment results. The Guidance yields rich and differentiated insights in uncertainty, but the relevance of this uncertainty information may vary across audiences and uses of assessment results. Therefore, the reporting of uncertainties is one of the six key issues that is addressed in the Guidance. In practice, users of the Guidance felt a need for more practical assistance in the reporting of uncertainty information. This report explores the issue of uncertainty communication in more detail, and contains more detailed guidance on the communication of uncertainty. In order to make this a 'stand alone' document several questions that are mentioned in the detailed Guidance have been repeated here. This document thus has some overlap with the detailed Guidance. Part 1 gives a general introduction to the issue of communicating uncertainty information. It offers guidelines for (fine)tuning the communication to the intended audiences and context of a report, discusses how readers of a report tend to handle uncertainty information, and ends with a list of criteria that uncertainty communication needs to meet to increase its effectiveness. Part 2 helps writers to analyze the context in which communication takes place, and helps to map the audiences, and their information needs. It further helps to reflect upon anticipated uses and possible impacts of the uncertainty information on the

  15. Uncertainty estimation in nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Guarro, S.B.; Cummings, G.E.

    1989-01-01

    Probabilistic Risk Assessment (PRA) was introduced in the nuclear industry and the nuclear regulatory process in 1975 with the publication of the Reactor Safety Study by the U.S. Nuclear Regulatory Commission. Almost fifteen years later, the state-of-the-art in this field has been expanded and sharpened in many areas, and about thirty-five plant-specific PRAs (Probabilistic Risk Assessments) have been performed by the nuclear utility companies or by the U.S. Nuclear Regulatory Commission. Among the areas where the most evident progress has been made in PRA and PSA (Probabilistic Safety Assessment, as these studies are more commonly referred to in the international community outside the U.S.) is the development of a consistent framework for the identification of sources of uncertainty and the estimation of their magnitude as it impacts various risk measures. Techniques to propagate uncertainty in reliability data through the risk models and display its effect on the top-level risk estimates were developed in the early PRAs. The Seismic Safety Margin Research Program (SSMRP) study was the first major risk study to develop an approach to deal explicitly with uncertainty in risk estimates introduced not only by uncertainty in component reliability data, but by the incomplete state of knowledge of the assessor(s) with regard to basic phenomena that may trigger and drive a severe accident. More recently NUREG-1150, another major study of reactor risk sponsored by the NRC, has expanded risk uncertainty estimation and analysis into the realm of model uncertainty related to the relatively poorly known post-core-melt phenomena which determine the behavior of the molten core and of the reactor containment structures

  16. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Science.gov (United States)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contribution from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Generally, impact assessment studies are carried out with hydrologic model parameters held unchanged in the future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression-based methodology is presented to obtain the hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and GCMs, along with their interactions with emission scenarios, act as dominant sources of uncertainty. 
This paper provides a generalized framework for hydrologists to examine stationarity assumption of models before considering them
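
    The ANOVA segregation of variance by factor can be sketched for a small hypothetical GCM x scenario ensemble (numbers invented; the real study additionally includes land-use, stationarity, and internal-variability terms):

    ```python
    import statistics

    # rows = GCMs, columns = emission scenarios; entries = projected flow change (%)
    proj = {
        ("gcm1", "rcp4.5"): -8.0,  ("gcm1", "rcp8.5"): -14.0,
        ("gcm2", "rcp4.5"): -2.0,  ("gcm2", "rcp8.5"): -6.0,
        ("gcm3", "rcp4.5"): -11.0, ("gcm3", "rcp8.5"): -19.0,
    }
    gcms = sorted({g for g, _ in proj})
    rcps = sorted({r for _, r in proj})
    grand = statistics.mean(proj.values())

    row_mean = {g: statistics.mean(proj[g, r] for r in rcps) for g in gcms}
    col_mean = {r: statistics.mean(proj[g, r] for g in gcms) for r in rcps}

    # Two-way ANOVA sums of squares: main effects plus interaction/residual.
    ss_gcm = len(rcps) * sum((row_mean[g] - grand) ** 2 for g in gcms)
    ss_rcp = len(gcms) * sum((col_mean[r] - grand) ** 2 for r in rcps)
    ss_tot = sum((v - grand) ** 2 for v in proj.values())
    ss_int = ss_tot - ss_gcm - ss_rcp

    for name, ss in [("GCM", ss_gcm), ("scenario", ss_rcp), ("interaction", ss_int)]:
        print(f"{name}: {ss / ss_tot:.0%} of total variance")
    ```

    Extending the factor set (land use, parameter stationarity, internal variability) is a matter of adding dimensions to the keys and further main-effect terms, which is essentially what the ANOVA framework in the paper does.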

  17. Causal uncertainty, claimed and behavioural self-handicapping.

    Science.gov (United States)

    Thompson, Ted; Hepburn, Jonathan

    2003-06-01

    Causal uncertainty beliefs involve doubts about the causes of events, and arise as a consequence of non-contingent evaluative feedback: feedback that leaves the individual uncertain about the causes of his or her achievement outcomes. Individuals high in causal uncertainty are frequently unable to confidently attribute their achievement outcomes, experience anxiety in achievement situations and as a consequence are likely to engage in self-handicapping behaviour. Accordingly, we sought to establish links between trait causal uncertainty, claimed and behavioural self-handicapping. Participants were N=72 undergraduate students divided equally between high and low causally uncertain groups. We used a 2 (causal uncertainty status: high, low) x 3 (performance feedback condition: success, non-contingent success, non-contingent failure) between-subjects factorial design to examine the effects of causal uncertainty on achievement behaviour. Following performance feedback, participants completed 20 single-solution anagrams and 12 remote associate tasks serving as performance measures, and 16 unicursal tasks to assess practice effort. Participants also completed measures of claimed handicaps, state anxiety and attributions. Relative to low causally uncertain participants, high causally uncertain participants claimed more handicaps prior to performance on the anagrams and remote associates, reported higher anxiety, attributed their failure to internal, stable factors, and reduced practice effort on the unicursal tasks, evident in fewer unicursal tasks solved. These findings confirm links between trait causal uncertainty and claimed and behavioural self-handicapping, highlighting the need for educators to facilitate means by which students can achieve surety in the manner in which they attribute the causes of their achievement outcomes.

  18. Uncertainty Relations and Possible Experience

    Directory of Open Access Journals (Sweden)

    Gregg Jaeger

    2016-06-01

    Full Text Available The uncertainty principle can be understood as a condition of joint indeterminacy of classes of properties in quantum theory. The mathematical expressions most closely associated with this principle have been the uncertainty relations, various inequalities exemplified by the well-known expression regarding position and momentum introduced by Heisenberg. Here, recent work involving a new sort of “logical” indeterminacy principle and associated relations introduced by Pitowsky, expressible directly in terms of probabilities of outcomes of measurements of sharp quantum observables, is reviewed and its quantum nature is discussed. These novel relations are derivable from Boolean “conditions of possible experience” of the quantum realm and have been considered both as fundamentally logical and as fundamentally geometrical. This work focuses on the relationship of indeterminacy to the propositions regarding the values of discrete, sharp observables of quantum systems. Here, reasons for favoring each of these two positions are considered. Finally, with an eye toward future research related to indeterminacy relations, further novel approaches grounded in category theory and intended to capture and reconceptualize the complementarity characteristics of quantum propositions are discussed in relation to the former.

  19. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.
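
    The linear Bayesian update compared in the abstract has the Kalman-like form x_post = x + K(y_obs - y(x)) with gain K = cov(x, y)/var(y). A sampling-based sketch with a toy quadratic forward model follows (this illustrates only the linear-update formula, not the authors' sampling-free spectral implementation):

    ```python
    import random
    import statistics

    random.seed(5)

    N = 20000
    prior_x = [random.gauss(1.0, 0.5) for _ in range(N)]       # prior ensemble (assumed)
    noise = [random.gauss(0.0, 0.2) for _ in range(N)]
    pred_y = [x ** 2 + e for x, e in zip(prior_x, noise)]      # toy forward model + noise

    # Linear-update gain K = cov(x, y) / var(y), estimated from the ensemble.
    mx = statistics.mean(prior_x)
    my = statistics.mean(pred_y)
    cov_xy = sum((x - mx) * (y - my) for x, y in zip(prior_x, pred_y)) / (N - 1)
    gain = cov_xy / statistics.variance(pred_y)

    Y_OBS = 2.0                                                # hypothetical observation
    post_x = [x + gain * (Y_OBS - y) for x, y in zip(prior_x, pred_y)]
    print(f"prior mean {mx:.3f} -> posterior mean {statistics.mean(post_x):.3f}")
    ```

    The quadratic update the paper discusses augments this map with second-order terms in the innovation, which matters precisely for nonlinear forward models like the Lorenz 84 example.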

  20. Uncertainties in the proton lifetime

    International Nuclear Information System (INIS)

    Ellis, J.; Nanopoulos, D.V.; Rudaz, S.; Gaillard, M.K.

    1980-04-01

    We discuss the masses of the leptoquark bosons m(X) and the proton lifetime in Grand Unified Theories based principally on SU(5). It is emphasized that estimates of m(X) based on the QCD coupling and the fine structure constant are probably more reliable than those using the experimental value of sin²θ_W. Uncertainties in the QCD Λ parameter and the correct value of α are discussed. We estimate higher-order effects on the evolution of coupling constants in a momentum-space renormalization scheme. It is shown that increasing the number of generations of fermions beyond the minimal three increases m(X) by almost a factor of 2 per generation. Additional uncertainties exist for each generation of technifermions that may exist. We discuss and discount the possibility that proton decay could be 'Cabibbo-rotated' away, and a speculation that Lorentz invariance may be violated in proton decay at a detectable level. We estimate that in the absence of any substantial new physics beyond that in the minimal SU(5) model the proton lifetime is 8 × 10^(30±2) years

  1. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.

    2014-01-01

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  2. Inverse problems and uncertainty quantification

    KAUST Repository

    Litvinenko, Alexander

    2013-12-18

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. Finally, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  3. International Trade. International Business

    OpenAIRE

    Мохнюк, А. М.; Mokhniuk, A. M.

    2015-01-01

    The work programme of the study course "International Trade. International Business" was prepared in accordance with the educational and vocational training programme for bachelors in the field of study 6.030601 "Management".

  4. Needs of the CSAU uncertainty method

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2000-01-01

    The use of best estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods have been proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to address them. The specific procedural steps were combined from other methods for uncertainty evaluation, and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach to the small-break loss-of-coolant accident (SB LOCA) show that the adapted CSAU can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, it was indicated that there are still some limitations in the CSAU approach that need to be resolved. (author)

  5. Decision-Making under Criteria Uncertainty

    Science.gov (United States)

    Kureychik, V. M.; Safronenkova, I. B.

    2018-05-01

    Uncertainty is an essential part of any decision-making procedure. The paper deals with the problem of decision-making under criteria uncertainty. In this context, decision-making under uncertainty and the types and conditions of uncertainty were examined. The decision-making problem under uncertainty was formalized. A modification of a mathematical decision-support method under uncertainty based on ontologies was proposed. A distinguishing feature of the developed method is its use of ontologies as base elements. The goal of this work is the development of a decision-making method under criteria uncertainty that uses ontologies in the domain of multilayer board design. The method is oriented toward improving the technical and economic characteristics of the examined domain.

  6. Uncertainty in geological and hydrogeological data

    Directory of Open Access Journals (Sweden)

    B. Nilsson

    2007-09-01

    Full Text Available Uncertainty in conceptual model structure and in environmental data is of essential interest when dealing with uncertainty in water resources management. To make quantification of uncertainty possible, it is necessary to identify and characterise the uncertainty in geological and hydrogeological data. This paper discusses a range of available techniques to describe the uncertainty related to geological model structure and scale of support. Literature examples on uncertainty in hydrogeological variables such as saturated hydraulic conductivity, specific yield, specific storage, effective porosity and dispersivity are given. Field data usually have a spatial and temporal scale of support that is different from the one on which numerical models for water resources management operate. Uncertainty in hydrogeological data variables is characterised and assessed within the methodological framework of the HarmoniRiB classification.
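
To illustrate characterising uncertainty in one of the hydrogeological variables mentioned, the hypothetical sketch below samples a saturated hydraulic conductivity modelled as lognormal (a common, though not universal, assumption for this variable) and reports percentile bounds:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical field estimate: saturated hydraulic conductivity K_s with a
# geometric mean of 1e-5 m/s and roughly an order-of-magnitude spread at
# the 2-sigma level, modelled as lognormal.
ln_mean = np.log(1e-5)          # mean of ln(K_s)
ln_sd = np.log(10.0) / 2        # ~1 order of magnitude at 2 sigma

samples = rng.lognormal(mean=ln_mean, sigma=ln_sd, size=100_000)

p5, p50, p95 = np.percentile(samples, [5, 50, 95])
print(f"K_s 5th/50th/95th percentiles: {p5:.2e} / {p50:.2e} / {p95:.2e} m/s")
```

The wide 5th-95th percentile band (a factor of roughly 40-50 here) is typical of why scale of support matters when transferring point measurements into model grids.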

  7. INTERNAL AUDIT AND RISK MANAGEMENT

    OpenAIRE

    Elena RUSE; Georgiana SUSMANSCHI (BADEA); Daniel DĂNECI-PĂTRĂU

    2014-01-01

    The existence of risk in economic activity cannot be denied. In fact, risk is a concept present in every activity, the term risk being identified with uncertainty, that is, the chance that an undesirable event will occur. Internal audit and risk management aim at the same goal, namely the control of risks. Internal audit performs several roles in the risk management plan. The objectives of the internal audit function vary from company to company, but in all economic entities int...

  8. Uncertainties in Steric Sea Level Change Estimation During the Satellite Altimeter Era: Concepts and Practices

    Science.gov (United States)

    MacIntosh, C. R.; Merchant, C. J.; von Schuckmann, K.

    2017-01-01

    This article presents a review of current practice in estimating steric sea level change, focussed on the treatment of uncertainty. Steric sea level change is the contribution to the change in sea level arising from the dependence of density on temperature and salinity. It is a significant component of sea level rise and a reflection of changing ocean heat content. However, tracking these steric changes still remains a significant challenge for the scientific community. We review the importance of understanding the uncertainty in estimates of steric sea level change. Relevant concepts of uncertainty are discussed and illustrated with the example of observational uncertainty propagation from a single profile of temperature and salinity measurements to steric height. We summarise and discuss the recent literature on methodologies and techniques used to estimate steric sea level in the context of the treatment of uncertainty. Our conclusions are that progress in quantifying steric sea level uncertainty will benefit from: greater clarity and transparency in published discussions of uncertainty, including exploitation of international standards for quantifying and expressing uncertainty in measurement; and the development of community "recipes" for quantifying the error covariances in observations and from sparse sampling and for estimating and propagating uncertainty across spatio-temporal scales.
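
A hedged sketch of the observational uncertainty propagation described above, from a single temperature/salinity profile to steric height, assuming constant expansion/contraction coefficients and independent layer errors (all values hypothetical):

```python
import numpy as np

# Toy profile: 10 layers of 50 m, with hypothetical per-layer uncertainties
dz = np.full(10, 50.0)              # layer thickness (m)
u_T = np.full(10, 0.01)             # temperature standard uncertainty (K)
u_S = np.full(10, 0.005)            # salinity standard uncertainty (g/kg)

alpha = 2.0e-4                      # thermal expansion coeff (1/K), assumed constant
beta = 7.5e-4                       # haline contraction coeff (kg/g), assumed constant

# Steric height h = sum over layers of (alpha*dT - beta*dS)*dz.
# With independent layer errors, the variance propagates as a sum of squares:
var_h = np.sum((alpha * u_T * dz)**2 + (beta * u_S * dz)**2)
u_h = np.sqrt(var_h)
print(f"steric height uncertainty: {u_h * 1000:.2f} mm")
```

Note that the independence assumption is exactly what the article's call for error-covariance "recipes" targets: correlated errors between layers or profiles would require the full covariance, not this root-sum-square.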

  9. Uncertainties in scaling factors for ab initio vibrational zero-point energies

    Science.gov (United States)

    Irikura, Karl K.; Johnson, Russell D.; Kacker, Raghu N.; Kessel, Rüdiger

    2009-03-01

    Vibrational zero-point energies (ZPEs) determined from ab initio calculations are often scaled by empirical factors. An empirical scaling factor partially compensates for the effects arising from vibrational anharmonicity and incomplete treatment of electron correlation. These effects are not random but are systematic. We report scaling factors for 32 combinations of theory and basis set, intended for predicting ZPEs from computed harmonic frequencies. An empirical scaling factor carries uncertainty. We quantify and report, for the first time, the uncertainties associated with scaling factors for ZPE. The uncertainties are larger than generally acknowledged; the scaling factors have only two significant digits. For example, the scaling factor for B3LYP/6-31G(d) is 0.9757±0.0224 (standard uncertainty). The uncertainties in the scaling factors lead to corresponding uncertainties in predicted ZPEs. The proposed method for quantifying the uncertainties associated with scaling factors is based upon the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. We also present a new reference set of 60 diatomic and 15 polyatomic "experimental" ZPEs that includes estimated uncertainties.
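
Using the B3LYP/6-31G(d) scaling factor and standard uncertainty quoted in the abstract, propagation to a predicted ZPE is direct if the computed harmonic value is treated as exact (the harmonic ZPE below is hypothetical):

```python
# Scaling factor from the abstract: c = 0.9757 +/- 0.0224 for B3LYP/6-31G(d).
c, u_c = 0.9757, 0.0224        # scaling factor and its standard uncertainty

zpe_harm = 63.2                 # hypothetical computed harmonic ZPE, kcal/mol
zpe_scaled = c * zpe_harm
u_zpe = zpe_harm * u_c          # u(c*x) = x*u(c) when x carries no uncertainty

print(f"scaled ZPE = {zpe_scaled:.1f} +/- {u_zpe:.1f} kcal/mol")
```

The roughly 2% relative uncertainty is the abstract's point: the scaled value carries only two significant digits of reliability, however many digits the electronic-structure code prints.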

  10. Current driven wiggler

    Science.gov (United States)

    Tournes, C.; Aucouturier, J.; Arnaud, B.; Brasile, J. P.; Convert, G.; Simon, M.

    1992-07-01

    A current-driven wiggler is the cornerstone of an innovative, compact, high-efficiency, transportable tunable free-electron laser (FEL), the feasibility of which is currently being evaluated by Thomson-CSF. The salient advantages are: compactness of the FEL, along with the possibility to accelerate the beam through several successive passes through the accelerating section (the number of passes being defined by the final wavelength of the radiation, i.e. visible, MWIR, LWIR); the wiggler can be turned off and be transparent to the beam until the last pass. Wiggler periodicities as small as 5 mm can be achieved, hence contributing to FEL compactness. To achieve overall efficiencies in the range of 10% at visible wavelengths, not only must the wiggler periodicity be variable, but the strength of the magnetic field of each period must be separately adjustable and fine-tuned versus time during the macropulse, so as to take into account the growing contribution of the wave energy in the cavity to the total ponderomotive force. The salient theoretical point of this design is the optimization of the parameters defining each period of the wiggler for each micropacket of the macropulse. The salient technology point is the mechanical and thermal design of the wiggler, which allows the high currents required to achieve magnetic fields up to 2 T.

  11. Customer-driven competition

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, R. [Ontario Hydro, Toronto, ON (Canada)]

    1996-12-31

    Ontario Hydro's customer-driven strategy, recently approved by Hydro's Executive Board, was described. The strategy is founded on the following components: (1) the dissolution of the Ontario power pool, i.e., the loss of Hydro's franchise monopoly on generation, leaving only power transmission in the hands of the Corporation, (2) divestment of Ontario Hydro's system operations and market operations functions to a new, independent Crown corporation called the Central Market Operator, (3) functional and organizational unbundling of Ontario Hydro into three signature businesses, Genco, Transco, and Retailco, and in the latter two, the functional unbundling of wires from sales and services, (4) a fully commercial Ontario Hydro with normal corporate powers, and (5) a corporate strategy for Ontario Hydro to grow in businesses in an open, symmetrical North American energy market. According to Ontario Hydro management this will allow competition and choice to all customers, have a disciplining effect on prices, and give rise to a retail market of new products and services, while at the same time preserve and enhance the value of public investment in the Corporation.

  12. Digitally-Driven Architecture

    Directory of Open Access Journals (Sweden)

    Henriette Bier

    2014-07-01

    Full Text Available The shift from mechanical to digital forces architects to reposition themselves: Architects generate digital information, which can be used not only in designing and fabricating building components but also in embedding behaviours into buildings. This implies that, similar to the way that industrial design and fabrication with its concepts of standardisation and serial production influenced modernist architecture, digital design and fabrication influences contemporary architecture. While standardisation focused on processes of rationalisation of form, mass-customisation as a new paradigm that replaces mass-production, addresses non-standard, complex, and flexible designs. Furthermore, knowledge about the designed object can be encoded in digital data pertaining not just to the geometry of a design but also to its physical or other behaviours within an environment. Digitally-driven architecture implies, therefore, not only digitally-designed and fabricated architecture, it also implies architecture – built form – that can be controlled, actuated, and animated by digital means. In this context, this sixth Footprint issue examines the influence of digital means as pragmatic and conceptual instruments for actuating architecture. The focus is not so much on computer-based systems for the development of architectural designs, but on architecture incorporating digital control, sensing, actuating, or other mechanisms that enable buildings to interact with their users and surroundings in real time in the real world through physical or sensory change and variation.

  13. Digitally-Driven Architecture

    Directory of Open Access Journals (Sweden)

    Henriette Bier

    2010-06-01

    Full Text Available The shift from mechanical to digital forces architects to reposition themselves: Architects generate digital information, which can be used not only in designing and fabricating building components but also in embedding behaviours into buildings. This implies that, similar to the way that industrial design and fabrication with its concepts of standardisation and serial production influenced modernist architecture, digital design and fabrication influences contemporary architecture. While standardisation focused on processes of rationalisation of form, mass-customisation as a new paradigm that replaces mass-production, addresses non-standard, complex, and flexible designs. Furthermore, knowledge about the designed object can be encoded in digital data pertaining not just to the geometry of a design but also to its physical or other behaviours within an environment. Digitally-driven architecture implies, therefore, not only digitally-designed and fabricated architecture, it also implies architecture – built form – that can be controlled, actuated, and animated by digital means. In this context, this sixth Footprint issue examines the influence of digital means as pragmatic and conceptual instruments for actuating architecture. The focus is not so much on computer-based systems for the development of architectural designs, but on architecture incorporating digital control, sensing, actuating, or other mechanisms that enable buildings to interact with their users and surroundings in real time in the real world through physical or sensory change and variation.

  14. Customer-driven competition

    International Nuclear Information System (INIS)

    Taylor, R.

    1996-01-01

    Ontario Hydro's customer-driven strategy, recently approved by Hydro's Executive Board, was described. The strategy is founded on the following components: (1) the dissolution of the Ontario power pool, i.e., the loss of Hydro's franchise monopoly on generation, leaving only power transmission in the hands of the Corporation, (2) divestment of Ontario Hydro's system operations and market operations functions to a new, independent Crown corporation called the Central Market Operator, (3) functional and organizational unbundling of Ontario Hydro into three signature businesses, Genco, Transco, and Retailco, and in the latter two, the functional unbundling of wires from sales and services, (4) a fully commercial Ontario Hydro with normal corporate powers, and (5) a corporate strategy for Ontario Hydro to grow in businesses in an open, symmetrical North American energy market. According to Ontario Hydro management this will allow competition and choice to all customers, have a disciplining effect on prices, and give rise to a retail market of new products and services, while at the same time preserve and enhance the value of public investment in the Corporation

  15. Adaptively Addressing Uncertainty in Estuarine and Near Coastal Restoration Projects

    Energy Technology Data Exchange (ETDEWEB)

    Thom, Ronald M.; Williams, Greg D.; Borde, Amy B.; Southard, John A.; Sargeant, Susan L.; Woodruff, Dana L.; Laufle, Jeffrey C.; Glasoe, Stuart

    2005-03-01

    Restoration projects have an uncertain outcome because of a lack of information about current site conditions, historical disturbance levels, effects of landscape alterations on site development, unpredictable trajectories or patterns of ecosystem structural development, and many other factors. A poor understanding of the factors that control the development and dynamics of a system, such as hydrology, salinity, or wave energy, can also lead to an unintended outcome. Finally, lack of experience in restoring certain types of systems (e.g., rare or very fragile habitats) or systems in highly modified situations (e.g., highly urbanized estuaries) makes project outcomes uncertain. Because of these uncertainties, project costs can rise dramatically in an attempt to come closer to project goals. All of the potential sources of error can be addressed to a certain degree through adaptive management. The first step is admitting that these uncertainties can exist, and addressing as many of the uncertainties as possible with planning and directed research prior to implementing the project. The second step is to evaluate uncertainties through hypothesis-driven experiments during project implementation. The third step is to use the monitoring program to evaluate and adjust the project as needed to improve the probability that the project will reach its goal. The fourth and final step is to use the information gained in the project to improve future projects. A framework that includes a clear goal statement, a conceptual model, and an evaluation framework can help in this adaptive restoration process. Projects and programs vary in their application of adaptive management in restoration, and it is very difficult to be highly prescriptive in applying adaptive management to projects that necessarily vary widely in scope, goal, ecosystem characteristics, and uncertainties.
Very large ecosystem restoration programs in the Mississippi River delta (Coastal Wetlands Planning, Protection, and Restoration

  16. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.
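
The root-sum-square combination of independent uncertainty components used in ANSI/ASME PTC 19.1 and GUM-style analyses can be sketched as follows; the component names and values here are hypothetical:

```python
import math

# Independent standard-uncertainty components combine in root-sum-square;
# an expanded uncertainty applies a coverage factor k (k = 2 approximates
# a 95 % interval for near-normal results).
components = {
    "calibration":   0.15,   # hypothetical component values, all in the
    "repeatability": 0.08,   # same units as the measured quantity
    "resolution":    0.03,
}

u_c = math.sqrt(sum(u**2 for u in components.values()))  # combined std. uncertainty
U = 2 * u_c                                              # expanded, k = 2

print(f"combined u_c = {u_c:.3f}, expanded U (k=2) = {U:.3f}")
```

The resulting interval, measured value ± U, is the quantitative estimate of where the true value is expected to lie, which is exactly the "informational content" Ku's remark refers to.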

  17. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  18. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
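
A sketch of how the purity, mass, and volume contributions described above might combine into the uncertainty of a certified solution concentration, assuming independent components combined in quadrature (all values hypothetical):

```python
import math

# For a solution standard prepared as C = m * P / V (mass of neat material,
# purity fraction, solution volume), independent relative uncertainties
# combine in quadrature. All values below are hypothetical.
m, u_m = 10.00e-3, 0.02e-3     # mass (g) and its standard uncertainty
P, u_P = 0.998, 0.002          # purity fraction (reflecting residual water,
                               # residual solvent, and inorganic content)
V, u_V = 10.00e-3, 0.01e-3     # solution volume (L) and its uncertainty

C = m * P / V                                            # concentration, g/L
rel_u = math.sqrt((u_m / m)**2 + (u_P / P)**2 + (u_V / V)**2)
u_C = C * rel_u

print(f"C = {C:.4f} g/L, u(C) = {u_C:.4f} g/L ({100 * rel_u:.2f} %)")
```

Whether a vendor's certificate includes all three contributions, or only the purity term, is precisely the kind of difference the article urges laboratories to check.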

  19. Diagnostic uncertainty and recall bias in chronic low back pain.

    Science.gov (United States)

    Serbic, Danijela; Pincus, Tamar

    2014-08-01

    Patients' beliefs about the origin of their pain and their cognitive processing of pain-related information have both been shown to be associated with poorer prognosis in low back pain (LBP), but the relationship between specific beliefs and specific cognitive processes is not known. The aim of this study was to examine the relationship between diagnostic uncertainty and recall bias in 2 groups of chronic LBP patients, those who were certain about their diagnosis and those who believed that their pain was due to an undiagnosed problem. Patients (N=68) endorsed and subsequently recalled pain, illness, depression, and neutral stimuli. They also provided measures of pain, diagnostic status, mood, and disability. Both groups exhibited a recall bias for pain stimuli, but only the group with diagnostic uncertainty also displayed a recall bias for illness-related stimuli. This bias remained after controlling for depression and disability. Sensitivity analyses using grouping by diagnosis/explanation received supported these findings. Higher levels of depression and disability were found in the group with diagnostic uncertainty, but levels of pain intensity did not differ between the groups. Although the methodology does not provide information on causality, the results provide evidence for a relationship between diagnostic uncertainty and recall bias for negative health-related stimuli in chronic LBP patients. Copyright © 2014 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  20. Impact of Climate Change. Policy Uncertainty in Power Investment

    International Nuclear Information System (INIS)

    Blyth, W.; Yang, M.

    2006-10-01

    Climate change policies are being introduced or actively considered in all IEA member countries, changing the investment conditions and technology choices in the energy sector. Many of these policies are at a formative stage, and policy uncertainty is currently high. The objective of this paper is to quantify the impacts of climate change policy on power investment. We use a Real Options Analysis approach and model uncertain carbon and fuel prices as stochastic variables. The analysis compares the effects of climate policy uncertainty with fuel price uncertainty, showing the relative importance of these sources of risk for different technologies. This paper considers views on the importance of climate policy risk, how it is managed, and how it might affect investment behaviour. The implications for policymakers are analyzed, allowing the key messages to be transferred into policy design decisions. We found that in many cases the dominant risks facing base-load generation investment decisions will be market risks associated with electricity and fuel prices. However, under certain conditions and for some technologies, climate policy uncertainty can be an important risk factor, creating an incentive to delay investment and raising investment thresholds. This paper concludes that government climate change policies to promote investment in low-carbon technologies should aim to overcome this incentive to delay by sending long-term investment signals, backed up by strengthened international policy action to enhance domestic policy credibility.
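
The incentive to delay investment under price uncertainty can be illustrated with a toy real-options comparison; the price process, payoff function, and all figures below are hypothetical stand-ins, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Invest now vs. wait one year while the carbon price follows
# geometric Brownian motion. All figures are hypothetical.
P0, mu, sigma = 20.0, 0.03, 0.35     # carbon price ($/t), drift, volatility
I = 100.0                            # investment cost
r = 0.05                             # discount rate

def payoff(p):
    # Project value as a (toy) increasing function of the carbon price,
    # e.g. a low-carbon plant whose margin improves as carbon gets dearer.
    return 8.0 * p - I

invest_now = payoff(P0)

# Wait one year, then invest only if the project is in the money
P1 = P0 * np.exp((mu - 0.5 * sigma**2) + sigma * rng.normal(size=200_000))
wait = np.exp(-r) * np.maximum(payoff(P1), 0.0).mean()

print(f"invest now: {invest_now:.1f}, wait: {wait:.1f}")
```

When the value of waiting exceeds the value of investing immediately, the effective investment threshold rises, which is the mechanism behind the paper's finding that policy uncertainty can delay low-carbon investment.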