WorldWideScience

Sample records for conditions key uncertainties

  1. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about the outcomes of Alice's pair of incompatible measurements, conditioned on an arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
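
    The partial order invoked here, majorization, can be illustrated in its plain (unconditional) classical form: a distribution q majorizes p when every partial sum of q's entries, sorted in decreasing order, dominates the corresponding partial sum for p, so q is the "less uncertain" of the two. A minimal sketch of that check, assuming NumPy and leaving aside the paper's full conditional construction:

    ```python
    import numpy as np

    def majorizes(q, p, tol=1e-12):
        """True if distribution q majorizes p, i.e. q is at least as 'ordered' as p."""
        q = np.sort(np.asarray(q, dtype=float))[::-1]   # decreasing order
        p = np.sort(np.asarray(p, dtype=float))[::-1]
        return bool(np.all(np.cumsum(q) >= np.cumsum(p) - tol))

    uniform = [0.25, 0.25, 0.25, 0.25]   # maximally uncertain distribution
    peaked  = [0.70, 0.20, 0.05, 0.05]   # more sharply peaked, less uncertain
    print(majorizes(peaked, uniform))    # True: any distribution majorizes the uniform one
    print(majorizes(uniform, peaked))    # False
    ```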

  2. Mechanistic prediction of fission product release under normal and accident conditions: key uncertainties that need better resolution

    International Nuclear Information System (INIS)

    Rest, J.

    1983-09-01

A theoretical model has been used for predicting the behavior of fission gas and volatile fission products (VFPs) in UO2-based fuels during steady-state and transient conditions. This model represents an attempt to develop an efficient predictive capability for the full range of possible reactor operating conditions. Fission products released from the fuel are assumed to reach the fuel surface by successively diffusing (via atomic and gas-bubble mobility) from the grains to grain faces and then to the grain edges, where the fission products are released through a network of interconnected tunnels of fission-gas-induced and fabricated porosity. The model provides for a multi-region calculation and uses only one size class to characterize a distribution of fission gas bubbles.

  3. Conditional Betas and Investor Uncertainty

    OpenAIRE

    Fernando D. Chague

    2013-01-01

    We derive theoretical expressions for market betas from a rational expectation equilibrium model where the representative investor does not observe if the economy is in a recession or an expansion. Market betas in this economy are time-varying and related to investor uncertainty about the state of the economy. The dynamics of betas will also vary across assets according to the assets' cash-flow structure. In a calibration exercise, we show that value and growth firms have cash-flow structures...

  4. Unconditional security of quantum key distribution and the uncertainty principle

    International Nuclear Information System (INIS)

    Koashi, Masato

    2006-01-01

An approach to the unconditional security of quantum key distribution protocols is presented, which is based on the uncertainty principle. The approach applies to every case that has been treated via the argument by Shor and Preskill, but it does not require finding quantum error-correcting codes. It can also treat cases with uncharacterized apparatuses. The proof can be applied to cases where the secret key rate is larger than the distillable entanglement.

  5. Variability and Uncertainties of Key Hydrochemical Parameters for SKB Sites

    Energy Technology Data Exchange (ETDEWEB)

Bath, Adrian [Intellisci Ltd, Willoughby on the Wolds, Loughborough (United Kingdom)]; Hermansson, Hans-Peter [Studsvik Nuclear AB, Nykoeping (Sweden)]

    2006-12-15

The work described in this report is a development of SKI's capability for the review and evaluation of data that will constitute part of SKB's case for selection of a suitable site and application to construct a geological repository for spent nuclear fuel. The aim has been to integrate a number of different approaches to interpreting and evaluating hydrochemical data, especially with respect to the parameters that matter most in assessing the suitability of a site and in understanding the geochemistry and groundwater conditions at a site. The focus has been on taking an independent view of overall uncertainties in reported data, taking account of analytical, sampling and other random and systematic sources of error. This evaluation was carried out initially with a compilation and general inspection of data from the Simpevarp, Forsmark and Laxemar sites plus data from older 'historical' boreholes in the Aespoe area. That was followed by a more specific interpretation by means of geochemical calculations which test the robustness of certain parameters, namely pH and redox/Eh. Geochemical model calculations have been carried out with widely available computer software. Data sources and their handling were also considered, especially access to SKB's SICADA database. In preparation for the use of geochemical modelling programs and to establish comparability of model results with those reported by SKB, the underlying thermodynamic databases were compared with each other and with other generally accepted databases. Comparisons of log K data for selected solid phases and solution complexes from the different thermodynamic databases were made. In general, there is a large degree of comparability between the databases, but there are some significant, and in a few cases large, differences. The present situation is, however, adequate for present purposes. The interpretation of redox equilibria is dependent on identifying the relevant solid phases and

  6. Enterprise strategic development under conditions of uncertainty

    Directory of Open Access Journals (Sweden)

    O.L. Truhan

    2016-09-01

Full Text Available The author points out the need for research on enterprise strategic development under conditions of increased dynamism and uncertainty in the external environment. It is determined that under conditions of external uncertainty it is reasonable to carry out the strategic planning of entities using organizational life cycle models and planning on the basis of disclosure. Any organization has to react flexibly to external challenges, applying knowledge of its own business model of development and the ability to mobilize internal reserves. The article notes that in long-term business planning managers use traditional approaches based on familiar facts and on the assumption that present tendencies will not change substantially in the future. When planning a new, risky business, one has to act while prerequisites and assumptions outweigh knowledge. The author argues that under such conditions a powerful tool for enterprise strategic development may be the well-known approach of “planning on the basis of disclosure”. The suggested approach helps account for the numerous uncertainty factors of the external environment and makes the strategic planning process highly adaptable to the conditions of venture business development.

  7. VICKEY: Mining Conditional Keys on Knowledge Bases

    DEFF Research Database (Denmark)

    Symeonidou, Danai; Prado, Luis Antonio Galarraga Del; Pernelle, Nathalie

    2017-01-01

    A conditional key is a key constraint that is valid in only a part of the data. In this paper, we show how such keys can be mined automatically on large knowledge bases (KBs). For this, we combine techniques from key mining with techniques from rule mining. We show that our method can scale to KBs...

  8. VICKEY: Mining Conditional Keys on Knowledge Bases

    OpenAIRE

Symeonidou, Danai; Galárraga, Luis; Pernelle, Nathalie; Saïs, Fatiha; Suchanek, Fabian

    2017-01-01

    International audience; A conditional key is a key constraint that is valid in only a part of the data. In this paper, we show how such keys can be mined automatically on large knowledge bases (KBs). For this, we combine techniques from key mining with techniques from rule mining. We show that our method can scale to KBs of millions of facts. We also show that the conditional keys we mine can improve the quality of entity linking by up to 47 percentage points.
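
    To make the definition concrete (this is only the key-checking step, not the VICKEY mining algorithm), a conditional key such as "among entities with country = FR, the pair (zip, street) identifies the entity" can be tested on a toy set of facts. A minimal sketch with hypothetical attribute names and data:

    ```python
    from collections import defaultdict

    # Toy knowledge-base facts: entity -> attribute/value map (hypothetical data)
    facts = {
        "e1": {"country": "FR", "zip": "75001", "street": "Rue A"},
        "e2": {"country": "FR", "zip": "75002", "street": "Rue B"},
        "e3": {"country": "DE", "zip": "10115", "street": "Str. C"},
        "e4": {"country": "DE", "zip": "10115", "street": "Str. C"},  # shares key values with e3
    }

    def is_conditional_key(facts, condition, key_attrs):
        """Check that key_attrs uniquely identify each entity satisfying the condition."""
        groups = defaultdict(list)
        for entity, attrs in facts.items():
            if all(attrs.get(a) == v for a, v in condition.items()):
                groups[tuple(attrs.get(a) for a in key_attrs)].append(entity)
        return all(len(entities) == 1 for entities in groups.values())

    print(is_conditional_key(facts, {"country": "FR"}, ("zip", "street")))  # True
    print(is_conditional_key(facts, {"country": "DE"}, ("zip", "street")))  # False
    ```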

  9. Forecasting Investment Risks in Conditions of Uncertainty

    Directory of Open Access Journals (Sweden)

    Andrenko Elena A.

    2017-04-01

Full Text Available The article is aimed at studying the topical problem of evaluating and forecasting the risks of investment activity of enterprises under conditions of uncertainty. A synthesis of research on qualitative and quantitative methods for evaluating investment risks has helped to reveal certain shortcomings of the proposed approaches, to note that most publications report no results of practical application, and to identify promising directions. On the basis of the theory of fuzzy sets, a model for forecasting the expected risk has been proposed, making use of the Gauss membership function, which has certain advantages over multi-angular (piecewise-linear) membership functions. Dependences of the investment risk on the parameters characterizing the investment project have been obtained. Using the formulas obtained, the total risk of investing in an innovation project has been determined as a function of the boundary conditions. The profitability index was selected as the target indicator. The model provides potential investors and developers with forecasts of possible scenarios of the investment process, supporting informed managerial decisions about the appropriateness of introducing and implementing a project.
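
    The Gaussian membership function referred to above is smooth and determined by just a centre and a spread, which is the usual argument for preferring it to piecewise-linear ("multi-angular") shapes. A minimal sketch with illustrative centre and spread values, not the article's calibrated ones:

    ```python
    import numpy as np

    def gauss_mf(x, c, sigma):
        """Gaussian membership degree of x in a fuzzy set centred at c with spread sigma."""
        return np.exp(-0.5 * ((x - c) / sigma) ** 2)

    # Hypothetical fuzzy sets over the profitability index (PI) of a project
    low_risk  = lambda pi: gauss_mf(pi, c=1.4, sigma=0.2)   # PI comfortably above 1
    high_risk = lambda pi: gauss_mf(pi, c=0.9, sigma=0.2)   # PI near or below break-even

    pi = 1.1
    print(f"membership in 'low risk':  {low_risk(pi):.2f}")
    print(f"membership in 'high risk': {high_risk(pi):.2f}")
    ```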

  10. Key uncertainties in climate change policy: Results from ICAM-2

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, H.; Kandlikar, M.

    1995-12-31

A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analysis of these uncertainties serves to: inform decision makers about the likely outcome of policy initiatives; and help set priorities for research so that outcome ambiguities faced by the decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.0. This model includes demographics, economic activities, emissions, atmospheric chemistry, climate change, sea level rise and other impact modules and the numerous associated feedbacks. The model has over 700 objects, of which over one third are uncertain. These have been grouped into seven different classes of uncertain items. The impact of uncertainties in each of these items can be considered individually or in combination with the others. In this paper we demonstrate the relative contribution of various sources of uncertainty to different outcomes in the model. The analysis shows that climatic uncertainties are most important, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. Extreme uncertainties in indirect aerosol forcing and behavioral response to climate change (adaptation) were characterized by using bounding analyses; the results suggest that these extreme uncertainties can dominate the choice of policy outcomes.

  11. Sensitivity of direct global warming potentials to key uncertainties

    International Nuclear Information System (INIS)

    Wuebbles, D.J.; Patten, K.O.; Grant, K.E.; Jain, A.K.

    1992-07-01

A series of sensitivity studies examines the effect of several uncertainties in Global Warming Potentials (GWPs). For example, the original evaluation of GWPs for the Intergovernmental Panel on Climate Change (IPCC, 1990) did not attempt to account for the possible sinks of carbon dioxide (CO2) that could balance the carbon cycle and produce atmospheric concentrations of CO2 that match observations. In this study, a balanced carbon cycle model is applied in calculation of the radiative forcing from CO2. Use of the balanced model produces up to 20 percent enhancement of the GWPs for most trace gases compared with the IPCC (1990) values for time horizons up to 100 years, but a decreasing enhancement with longer time horizons. Uncertainty limits of the fertilization feedback parameter contribute a 10 percent range in GWP values. Another systematic uncertainty in GWPs is the assumption of an equilibrium atmosphere (one in which the concentration of trace gases remains constant) versus a disequilibrium atmosphere. The latter gives GWPs that are 15 to 30 percent greater than the former, depending upon the carbon dioxide emission scenario chosen. Seven scenarios are employed: constant emission past 1990 and the six IPCC (1992) emission scenarios. For the analysis of uncertainties in atmospheric lifetime (τ), the GWP changes in direct proportion to τ for short-lived gases, but to a lesser extent for gases with τ greater than the time horizon for the GWP calculation.
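
    For orientation, the direct GWP discussed in these studies is conventionally the time-integrated radiative forcing of a pulse emission relative to that of CO2; for a gas decaying with a single lifetime τ the numerator integrates in closed form, which is why GWPs of short-lived gases scale almost linearly with τ. A sketch of the standard definition (generic form, not this paper's specific parameterization):

    ```latex
    \mathrm{GWP}_i(H) \;=\;
    \frac{\int_0^{H} a_i \, C_i(t)\, \mathrm{d}t}{\int_0^{H} a_{\mathrm{CO_2}} \, C_{\mathrm{CO_2}}(t)\, \mathrm{d}t},
    \qquad
    \int_0^{H} e^{-t/\tau_i}\, \mathrm{d}t \;=\; \tau_i \bigl(1 - e^{-H/\tau_i}\bigr)
    \;\approx\; \tau_i \quad \text{for } \tau_i \ll H .
    ```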

  12. Coping with Economic Uncertainty: Focus on Key Risks Essential

    Science.gov (United States)

    Sander, Laura

    2009-01-01

    During this period of continued economic uncertainty, higher-education institutions are facing a variety of challenges that by now are very familiar to governing boards and institutional leaders, including poor investment returns, reduced liquidity, limited choices in how they structure debt issues, and threats to flexibility in tuition pricing.…

  13. Determination of a PWR key neutron parameters uncertainties and conformity studies applications

    International Nuclear Information System (INIS)

    Bernard, D.

    2002-01-01

The aim of this thesis was to evaluate the uncertainties of key neutron parameters of slab reactors. The uncertainty sources have many origins: a technological origin for fabrication parameters and a physical origin for nuclear data. First, each contribution to the uncertainty is calculated and, finally, an uncertainty factor is associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation, and lifetime. These uncertainty factors were computed by Generalized Perturbation Theory at step 0 and by direct calculations for irradiation problems. One application to neutronic conformity concerned the adjustment of precision targets for fabrication and nuclear data. Statistical (uncertainties) and deterministic (deviations) approaches were studied. The uncertainties of the key slab neutron parameters were then reduced, and the nuclear performance was thereby optimised. (author)

  14. Sensitivity of direct global warming potentials to key uncertainties

    International Nuclear Information System (INIS)

Wuebbles, D.J.; Jain, A.K.; Patten, K.O.; Grant, K.E.

    1995-01-01

The concept of global warming potential was developed as a relative measure of the potential effects on climate of a greenhouse gas. In this paper a series of sensitivity studies examines several uncertainties in the determination of Global Warming Potentials (GWPs). The original evaluation of GWPs did not attempt to account for the possible sinks of carbon dioxide (CO2) that could balance the carbon cycle and produce atmospheric concentrations of CO2 that match observations. In this study, a balanced carbon cycle model is applied in calculation of the radiative forcing from CO2. Use of the balanced model produces up to 21% enhancement of the GWPs for most trace gases compared with the IPCC (1990) values for time horizons up to 100 years, but a decreasing enhancement with longer time horizons. Uncertainty limits of the fertilization feedback parameter contribute a 20% range in GWP values. Another systematic uncertainty in GWPs is the assumption of an equilibrium atmosphere (one in which the concentration of trace gases remains constant) versus a disequilibrium atmosphere (one in which the concentration of trace gases varies with time). The latter gives GWPs that are 19 to 32% greater than the former for a 100-year time horizon, depending upon the carbon dioxide emission scenario chosen. Five scenarios are employed: constant-concentration, constant-emission past 1990, and the three IPCC (1992) emission scenarios. For the analysis of uncertainties in atmospheric lifetime (τ), the GWP changes in direct proportion to τ for short-lived gases, but to a lesser extent for gases with τ greater than the time horizon for the GWP calculation. 40 refs., 7 figs., 13 tabs

  15. Uncertainty Evaluation of Residential Central Air-conditioning Test System

    Science.gov (United States)

    Li, Haoxue

    2018-04-01

According to national standards, property tests of air-conditioners are required. However, the test results can be influenced by the precision of the apparatus and by measurement errors. Therefore, an uncertainty evaluation of the property tests should be conducted. In this paper, the uncertainties are calculated for the property tests of a Xinfei 13.6 kW residential central air-conditioner. The evaluation result shows that the property tests are credible.

  16. Groundwater detection monitoring system design under conditions of uncertainty

    NARCIS (Netherlands)

    Yenigül, N.B.

    2006-01-01

Landfills represent a wide-spread and significant threat to groundwater quality. In this thesis a methodology was developed for optimal groundwater monitoring system design at landfill sites under conditions of uncertainty. First a decision analysis approach was presented for optimal

  17. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Science.gov (United States)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contribution from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to overall uncertainty in streamflow projections using analysis of variance (ANOVA) approach. Generally, most of the impact assessment studies are carried out with unchanging hydrologic model parameters in future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression based methodology is presented to obtain the hydrologic model parameters with changing land use and climate scenarios in future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set-up over the basin, under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that model stationarity assumption and GCMs along with their interactions with emission scenarios, act as dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine stationarity assumption of models before considering them
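
    A toy version of the ANOVA segregation described above: with streamflow changes arranged over GCM x emission-scenario combinations, the sum of squares attributable to each factor and to their interaction can be separated. A minimal two-factor sketch with made-up numbers (the study additionally includes land use scenarios, model stationarity and internal variability):

    ```python
    import numpy as np

    # Hypothetical changes in mean streamflow (%) for 3 GCMs x 2 emission scenarios
    y = np.array([[ -8.0, -12.0],
                  [ -3.0,  -9.0],
                  [-15.0, -20.0]])

    grand = y.mean()
    ss_gcm      = y.shape[1] * ((y.mean(axis=1) - grand) ** 2).sum()
    ss_scenario = y.shape[0] * ((y.mean(axis=0) - grand) ** 2).sum()
    ss_total    = ((y - grand) ** 2).sum()
    ss_interact = ss_total - ss_gcm - ss_scenario   # residual / interaction term

    for name, ss in [("GCM", ss_gcm), ("scenario", ss_scenario), ("interaction", ss_interact)]:
        print(f"{name:12s} {100 * ss / ss_total:5.1f} % of total variance")
    ```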

  18. Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies (Final Report)

    Science.gov (United States)

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physi...

  19. Adaptive neural network motion control for aircraft under uncertainty conditions

    Science.gov (United States)

    Efremov, A. V.; Tiaglik, M. S.; Tiumentsev, Yu V.

    2018-02-01

    We need to provide motion control of modern and advanced aircraft under diverse uncertainty conditions. This problem can be solved by using adaptive control laws. We carry out an analysis of the capabilities of these laws for such adaptive systems as MRAC (Model Reference Adaptive Control) and MPC (Model Predictive Control). In the case of a nonlinear control object, the most efficient solution to the adaptive control problem is the use of neural network technologies. These technologies are suitable for the development of both a control object model and a control law for the object. The approximate nature of the ANN model was taken into account by introducing additional compensating feedback into the control system. The capabilities of adaptive control laws under uncertainty in the source data are considered. We also conduct simulations to assess the contribution of adaptivity to the behavior of the system.

  20. Optimal Time to Invest Energy Storage System under Uncertainty Conditions

    Directory of Open Access Journals (Sweden)

    Yongma Moon

    2014-04-01

    Full Text Available This paper proposes a model to determine the optimal investment time for energy storage systems (ESSs in a price arbitrage trade application under conditions of uncertainty over future profits. The adoption of ESSs can generate profits from price arbitrage trade, which are uncertain because the future marginal prices of electricity will change depending on supply and demand. In addition, since the investment is optional, an investor can delay adopting an ESS until it becomes profitable, and can decide the optimal time. Thus, when we evaluate this investment, we need to incorporate the investor’s option which is not captured by traditional evaluation methods. In order to incorporate these aspects, we applied real option theory to our proposed model, which provides an optimal investment threshold. Our results concerning the optimal time to invest show that if future profits that are expected to be obtained from arbitrage trade become more uncertain, an investor needs to wait longer to invest. Also, improvement in efficiency of ESSs can reduce the uncertainty of arbitrage profit and, consequently, the reduced uncertainty enables earlier ESS investment, even for the same power capacity. Besides, when a higher rate of profits is expected and ESS costs are higher, an investor needs to wait longer. Also, by comparing a widely used net present value model to our real option model, we show that the net present value method underestimates the value for ESS investment and misleads the investor to make an investment earlier.
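
    The qualitative conclusion that greater profit uncertainty postpones investment matches the textbook real-options threshold for a profit stream following geometric Brownian motion: the value that triggers investment exceeds the break-even cost by an option multiple that grows with volatility. A minimal sketch of that generic Dixit-Pindyck-style threshold (illustrative numbers, not the paper's calibrated ESS model):

    ```python
    import math

    def investment_threshold(cost, r, delta, sigma):
        """Project value V* that triggers investment (GBM value, discount r, payout delta)."""
        a = 0.5 - (r - delta) / sigma**2
        beta1 = a + math.sqrt(a**2 + 2.0 * r / sigma**2)   # positive root of the fundamental quadratic
        return beta1 / (beta1 - 1.0) * cost                # V* > cost: waiting has option value

    for sigma in (0.1, 0.2, 0.4):   # more uncertainty -> higher trigger -> invest later
        v_star = investment_threshold(cost=1.0, r=0.05, delta=0.03, sigma=sigma)
        print(f"sigma = {sigma:.1f}  ->  invest once value reaches {v_star:.2f} x cost")
    ```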

  1. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    International Nuclear Information System (INIS)

    Gomes, Daniel S.; Teixeira, Antonio S.

    2017-01-01

Although regulatory agencies have shown a special interest in incorporating best estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those seen in 10 CFR 50 and, therefore, may yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis on the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, the mechanical model, and the fuel swelling formulations. (author)

  2. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Daniel S.; Teixeira, Antonio S., E-mail: dsgomes@ipen.br, E-mail: teixeira@ipen [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

Although regulatory agencies have shown a special interest in incorporating best estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those seen in 10 CFR 50 and, therefore, may yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis on the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, the mechanical model, and the fuel swelling formulations. (author)
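
    The Dakota-driven workflow summarized above boils down to sampling the uncertain fuel-rod inputs, running FRAPTRAN for each sample, and ranking inputs by their correlation with outputs such as the peak cladding temperature. A stand-alone sketch of that sampling-and-correlation step, with a toy surrogate in place of the actual fuel code and hypothetical parameter ranges:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200  # number of code runs / samples

    # Hypothetical uncertain inputs (uniform tolerance bands as stand-ins)
    clad_thickness = rng.uniform(0.55e-3, 0.65e-3, n)   # m
    gap_width      = rng.uniform(70e-6, 90e-6, n)       # m
    rod_power      = rng.uniform(38e3, 42e3, n)         # W/m

    def surrogate_pct(thickness, gap, power):
        """Toy stand-in for the fuel code's peak cladding temperature response (K)."""
        return 600.0 + 8e6 * gap + 0.02 * power - 2e5 * thickness + rng.normal(0.0, 5.0, n)

    pct = surrogate_pct(clad_thickness, gap_width, rod_power)
    for name, x in [("clad thickness", clad_thickness),
                    ("gap width", gap_width),
                    ("rod power", rod_power)]:
        r = np.corrcoef(x, pct)[0, 1]
        print(f"{name:15s} correlation with peak cladding temperature: {r:+.2f}")
    ```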

  3. Uncertainties in key low carbon power generation technologies - Implication for UK decarbonisation targets

    International Nuclear Information System (INIS)

    Kannan, R.

    2009-01-01

The UK government's economy-wide 60% carbon dioxide reduction target by 2050 requires a paradigm shift in the whole energy system. Numerous analytical studies have concluded that the power sector is a critical contributor to a low carbon energy system, and electricity generation has dominated the policy discussion on UK decarbonisation scenarios. However, a range of technical, social and market challenges, combined with alternative market investment strategies, means that large-scale deployment of key classes of low carbon electricity technologies is fraught with uncertainty. The UK MARKAL energy systems model has been used to investigate these long-term uncertainties in key electricity generation options. A range of power sector specific parametric sensitivities has been performed under a 'what-if' framework to provide a systematic exploration of least-cost energy system configurations under a broad, integrated set of input assumptions. In this paper, the results of six sensitivities are presented, covering restricted investment in key low carbon technologies to reflect their technical and political uncertainties, and alternative investment strategies arising from perceived risk and other barriers. (author)

  4. Evaluation of uncertainties of key neutron parameters of PWR-type reactors with slab fuel, application to neutronic conformity

    International Nuclear Information System (INIS)

    Bernard, D.

    2001-12-01

The aim of this thesis was to evaluate the uncertainties of key neutron parameters of slab reactors. The uncertainty sources have many origins: a technological origin for fabrication parameters and a physical origin for nuclear data. First, each contribution to the uncertainty is calculated and, finally, an uncertainty factor is associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation, and life-time. These uncertainty factors were computed by Generalized Perturbation Theory at step 0 and by direct calculations for irradiation problems. One application to neutronic conformity concerned the adjustment of precision targets for fabrication and nuclear data. Statistical (uncertainties) and deterministic (deviations) approaches were studied. The uncertainties of the key slab neutron parameters were then reduced, and the nuclear performance was thereby optimized. (author)

  5. Decision making under conditions of high complexity and uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Sherwell, J.

    1999-07-01

There is a trend to move environmental policy away from a command and control position to a more market based approach. Decision making under this new approach is made more difficult for both the regulators and the regulated, due in part to the constant conflict arising from the divergent expectations that participants may have for the outcome of policy deliberations and from the complexity and uncertainty inherent in the systems that are to be regulated. This change in policy reflects the maturing of environmental issues from a 'must do' posture towards maintenance, reasonable progress, and a condition of Sustainable Development. The emerging science of Complexity Theory and the established methods of Game Theory can provide theoretical tools that can act as an aid to decision-makers as they negotiate the perplexing landscape of conflicting needs and wants. The role of these methods in the development and implementation of policy on issues associated with Sustainable Development is of considerable importance. This paper presents a review of approaches to decision making under uncertainty, drawn from Game Theory and Complexity Theory. Data from simulations, such as the Iterated Prisoner's Dilemma and Controlled Chaos, are discussed as they relate the complexity of the underlying economic and ecological systems to natural resource use and exploitation, pollution control and carrying capacity. The important role of rules, and of their regular review and implementation, is highlighted.

  6. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    Science.gov (United States)

    Salbu, Brit

    2016-01-01

, ignoring sensitive life history stages of organisms and transgenerational effects. To link sources, ecosystem transfer and biological effects to future impact and risks, a series of models is usually interfaced, while uncertainty estimates are seldom given. The model predictions are, however, only valid within the boundaries of the overall uncertainties. Furthermore, the model predictions are only useful and relevant when uncertainties are estimated, communicated and understood. Among the key factors contributing most to uncertainties, the present paper focuses especially on structural uncertainties (model bias or discrepancies), as aspects such as particle releases, ecosystem dynamics, mixed exposure, sensitive life history stages and transgenerational effects are usually ignored in assessment models. Research focus on these aspects should significantly reduce the overall uncertainties in the impact and risk assessment of radioactively contaminated ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Environmental impact and risk assessments and key factors contributing to the overall uncertainties

    International Nuclear Information System (INIS)

    Salbu, Brit

    2016-01-01

, ignoring sensitive life history stages of organisms and transgenerational effects. To link sources, ecosystem transfer and biological effects to future impact and risks, a series of models is usually interfaced, while uncertainty estimates are seldom given. The model predictions are, however, only valid within the boundaries of the overall uncertainties. Furthermore, the model predictions are only useful and relevant when uncertainties are estimated, communicated and understood. Among the key factors contributing most to uncertainties, the present paper focuses especially on structural uncertainties (model bias or discrepancies), as aspects such as particle releases, ecosystem dynamics, mixed exposure, sensitive life history stages and transgenerational effects are usually ignored in assessment models. Research focus on these aspects should significantly reduce the overall uncertainties in the impact and risk assessment of radioactively contaminated ecosystems. - Highlights: • Source term uncertainties: ignoring radionuclide speciation and radioactive particles, the inventory can be underestimated. • Ecosystem transfer uncertainties: ignoring time-dependent interactions, transfer rates and pathways can be wrongly assessed. • Exposure uncertainties: ignoring stressor interactions, the combined effects cannot be judged. • Response uncertainties: ignoring sensitive stages and transgenerational effects, impact and risks can be underestimated.

  8. The Thermal Conductivity of Earth's Core: A Key Geophysical Parameter's Constraints and Uncertainties

    Science.gov (United States)

    Williams, Q.

    2018-05-01

The thermal conductivity of iron alloys at high pressures and temperatures is a critical parameter in governing (a) the present-day heat flow out of Earth's core, (b) the inferred age of Earth's inner core, and (c) the thermal evolution of Earth's core and lowermost mantle. It is, however, one of the least well-constrained important geophysical parameters, with current estimates for end-member iron under core-mantle boundary conditions varying by about a factor of 6. Here, the current state of calculations, measurements, and inferences that constrain thermal conductivity at core conditions are reviewed. The applicability of the Wiedemann-Franz law, commonly used to convert electrical resistivity data to thermal conductivity data, is probed: Here, whether the constant of proportionality, the Lorenz number, is constant at extreme conditions is of vital importance. Electron-electron inelastic scattering and increases in Fermi-liquid-like behavior may cause uncertainties in thermal conductivities derived from both first-principles-associated calculations and electrical conductivity measurements. Additional uncertainties include the role of alloying constituents and local magnetic moments of iron in modulating the thermal conductivity. Thus, uncertainties in thermal conductivity remain pervasive, and hence a broad range of core heat flows and inner core ages appear to remain plausible.
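
    The Wiedemann-Franz conversion mentioned above relates the electronic part of the thermal conductivity to the measured electrical conductivity (or resistivity) through the Lorenz number; the concern raised in the review is whether L stays near its ideal Sommerfeld value at core conditions. The relation, for reference:

    ```latex
    k_{\mathrm{el}} \;=\; L\,\sigma\,T \;=\; \frac{L\,T}{\rho},
    \qquad
    L_0 \;=\; \frac{\pi^{2}}{3}\left(\frac{k_B}{e}\right)^{2}
    \;\approx\; 2.44\times10^{-8}\ \mathrm{W\,\Omega\,K^{-2}} .
    ```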

  9. Investigation of the uncertainty of a validation experiment due to uncertainty in its boundary conditions

    International Nuclear Information System (INIS)

    Harris, J.; Nani, D.; Jones, K.; Khodier, M.; Smith, B.L.

    2011-01-01

    Elements contributing to uncertainty in experimental repeatability are quantified for data acquisition in a bank of cylinders. The cylinder bank resembles the lower plenum of a high temperature reactor with cylinders arranged on equilateral triangles with a pitch to diameter ratio of 1.7. The 3-D as-built geometry was measured by imaging reflections off the internal surfaces of the facility. This information is useful for building CFD grids for Validation studies. Time-averaged Particle Image Velocimetry (PIV) measurements were acquired daily over several months along with the pressure drop between two cylinders. The atmospheric pressure was measured along with the data set. The PIV data and pressure drop were correlated with atmospheric conditions and changes in experimental setup. It was found that atmospheric conditions play little role in the channel velocity, but impact the pressure drop significantly. The adjustments made to the experiment setup did not change the results. However, in some cases, the wake behind a cylinder was shifted significantly from one day to the next. These changes did not correlate with ambient pressure, room temperature, nor tear down/rebuilds of the facility. (author)

  10. CITRICULTURE ECONOMIC AND FINANCIAL EVALUATION UNDER CONDITIONS OF UNCERTAINTY

    Directory of Open Access Journals (Sweden)

    DANILO SIMÕES

    2015-12-01

Full Text Available ABSTRACT Citriculture involves several environmental risks, such as weather changes and pests, as well as considerable financial risk, mainly due to the period of return on the initial investment. This study was motivated by the need to assess the risks of a business activity such as citriculture. Our objective was to build a stochastic simulation model to carry out the economic and financial analysis of an orange producer in the Midwest region of the state of Sao Paulo, under conditions of uncertainty. The parameters used were the Net Present Value (NPV), the Modified Internal Rate of Return (MIRR), and the Discounted Payback. To evaluate the risk conditions we built a probabilistic model of pseudorandom numbers generated with the Monte Carlo method. The results showed that the activity analyzed carries a 42.8% risk of a negative NPV; however, the yield assessed by the MIRR was 7.7%, higher than the yield from the reinvestment of the positive cash flows. The financial investment pays for itself after the fourteenth year of activity.
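
    A reduced version of the stochastic simulation described above: draw pseudorandom yearly cash flows, compute the NPV for each draw, and report the probability of a negative outcome. The cash-flow distribution, horizon and discount rate below are illustrative placeholders, not the orchard data used in the article:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_sims, horizon = 10_000, 20            # simulated paths and project years (illustrative)
    investment, rate = 120_000.0, 0.10      # initial outlay and discount rate (placeholders)

    # Uncertain yearly net cash flows, here normal with weather/pest-driven spread
    cash_flows = rng.normal(loc=18_000.0, scale=9_000.0, size=(n_sims, horizon))
    discount = (1.0 + rate) ** -np.arange(1, horizon + 1)
    npv = cash_flows @ discount - investment

    print(f"mean NPV             : {npv.mean():12,.0f}")
    print(f"P(NPV < 0)           : {np.mean(npv < 0):12.1%}")
    print(f"5th / 95th percentile: {np.percentile(npv, 5):,.0f} / {np.percentile(npv, 95):,.0f}")
    ```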

  11. Uncertainties in extreme precipitation under climate change conditions

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia

of adaptation strategies, but these changes are subject to uncertainties. The focus of this PhD thesis is the quantification of uncertainties in changes in extreme precipitation. It addresses two of the main sources of uncertainty in climate change impact studies: regional climate models (RCMs) and statistical downscaling methods (SDMs). RCMs provide information on climate change at the regional scale. SDMs are used to bias-correct and downscale the outputs of the RCMs to the local scale of interest in adaptation strategies. In the first part of the study, a multi-model ensemble of RCMs from the European ENSEMBLES project was used to quantify the uncertainty in RCM projections over Denmark. Three aspects of the RCMs relevant for the uncertainty quantification were first identified and investigated. These are: the interdependency of the RCMs; the performance in current climate; and the change in the performance

  12. Intolerance of uncertainty and conditioned place preference in opioid addiction

    Directory of Open Access Journals (Sweden)

    Milen L. Radell

    2018-05-01

    Full Text Available Several personality factors have been implicated in vulnerability to addiction by impacting learning and decision making. One such factor is intolerance of uncertainty (IU, the tendency to perceive uncertain situations negatively and avoid them. Conditioned place preference (CPP, which compares preference for contexts paired with reward, has been used to examine the motivation for both drug and non-drug rewards. However, preference for locations associated with non-drug reward, as well as the potential influence of IU, has not been thoroughly studied in individuals with addiction. In the current study, we examined CPP using a computer-based task in a sample of addicted individuals undergoing opioid maintenance treatment and never-addicted controls. Patients were confirmed to have higher IU than controls. In the CPP task, the two groups did not differ in overall time spent in the previously-rewarded context. However, controls were more likely than patients to immediately return to this context. Contrary to our predictions, IU was not a significant predictor of preference for the previously-rewarded context, although higher IU in controls was associated with a higher number of rewards obtained in the task. No such relationship was found in patients.

  13. A risk-based evaluation of the impact of key uncertainties on the prediction of severe accident source terms - STU

    International Nuclear Information System (INIS)

    Ang, M.L.; Grindon, E.; Dutton, L.M.C.; Garcia-Sedano, P.; Santamaria, C.S.; Centner, B.; Auglaire, M.; Routamo, T.; Outa, S.; Jokiniemi, J.; Gustavsson, V.; Wennerstrom, H.; Spanier, L.; Gren, M.; Boschiero, M-H; Droulas, J-L; Friederichs, H-G; Sonnenkalb, M.

    2001-01-01

The purpose of this project is to address the key uncertainties associated with a number of fission product release and transport phenomena in a wider context and to assess their relevance to key severe accident sequences. This project is a broad-based analysis involving eight reactor designs that are representative of the reactors currently operating in the European Union (EU). In total, 20 accident sequences covering a wide range of conditions have been chosen to provide the basis for sensitivity studies. The appraisal is achieved through a systematic risk-based framework developed within this project. Specifically, this is a quantitative interpretation of the sensitivity calculations on the basis of 'significance indicators', applied above defined threshold values. These threshold values represent a good surrogate for 'large release', which is defined in a number of EU countries. In addition, the results are placed in the context of in-containment source term limits, for advanced light water reactor designs, as defined by international guidelines. Overall, despite the phenomenological uncertainties, the predicted source terms (both into the containment and, subsequently, into the environment) do not display a high degree of sensitivity to the individual fission product issues addressed in this project. This is due, mainly, to the substantial capacity for the attenuation of airborne fission products by the designed safety provisions and the natural fission product retention mechanisms within the containment

  14. Key drivers and economic consequences of high-end climate scenarios: uncertainties and risks

    DEFF Research Database (Denmark)

    Halsnæs, Kirsten; Kaspersen, Per Skougaard; Drews, Martin

    2015-01-01

The consequences of high-end climate scenarios and the risks of extreme events involve a number of critical assumptions and methodological challenges related to key uncertainties in climate scenarios and modelling, impact analysis, and economics. A methodological framework for integrated analysis... of extreme events increase beyond scaling, and in combination with economic assumptions we find a very wide range of risk estimates for urban precipitation events. A sensitivity analysis addresses 32 combinations of climate scenarios, damage cost curve approaches, and economic assumptions, including risk aversion and equity represented by discount rates. Major impacts of alternative assumptions are investigated. As a result, this study demonstrates that in terms of decision making the actual expectations concerning future climate scenarios and the economic assumptions applied are very important...

  15. A new Method for the Estimation of Initial Condition Uncertainty Structures in Mesoscale Models

    Science.gov (United States)

    Keller, J. D.; Bach, L.; Hense, A.

    2012-12-01

The estimation of fast-growing error modes of a system is a key interest of ensemble data assimilation when assessing uncertainty in initial conditions. Over the last two decades three methods (and variations of these methods) have evolved for global numerical weather prediction models: ensemble Kalman filter, singular vectors and breeding of growing modes (or now ensemble transform). While the former incorporates a priori model error information and observation error estimates to determine ensemble initial conditions, the latter two techniques directly address the error structures associated with Lyapunov vectors. However, in global models these structures are mainly associated with transient global wave patterns. When assessing initial condition uncertainty in mesoscale limited area models, several problems regarding the aforementioned techniques arise: (a) additional sources of uncertainty on the smaller scales contribute to the error and (b) error structures from the global scale may quickly move through the model domain (depending on the size of the domain). To address the latter problem, perturbation structures from global models are often included in the mesoscale predictions as perturbed boundary conditions. However, the initial perturbations (when used) are often generated with a variant of an ensemble Kalman filter which does not necessarily focus on the large scale error patterns. In the framework of the European regional reanalysis project of the Hans-Ertel-Center for Weather Research we use a mesoscale model with an implemented nudging data assimilation scheme which does not support ensemble data assimilation at all. In preparation for an ensemble-based regional reanalysis and for the estimation of three-dimensional atmospheric covariance structures, we implemented a new method for the assessment of fast-growing error modes for mesoscale limited area models. The so-called self-breeding is a development based on the breeding of growing modes technique.
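
    The breeding idea referenced above can be stated compactly: integrate a control run and a perturbed run, rescale their difference back to a fixed amplitude, add it to the new control state, and repeat, so the perturbation rotates toward the fastest-growing error directions. A minimal sketch on the Lorenz-63 system standing in for the mesoscale model (illustrative step sizes and cycle lengths):

    ```python
    import numpy as np

    def lorenz_step(x, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
        """One forward-Euler step of the Lorenz-63 system (toy 'forecast model')."""
        dx = np.array([s * (x[1] - x[0]),
                       x[0] * (r - x[2]) - x[1],
                       x[0] * x[1] - b * x[2]])
        return x + dt * dx

    def integrate(x, n_steps):
        for _ in range(n_steps):
            x = lorenz_step(x)
        return x

    rng = np.random.default_rng(0)
    control = integrate(np.array([1.0, 1.0, 20.0]), 1000)   # spin up onto the attractor
    amp = 1e-3
    pert = amp * rng.standard_normal(3)                     # random seed perturbation

    for cycle in range(20):                                 # breeding cycles
        ctrl_fcst = integrate(control, 50)
        pert_fcst = integrate(control + pert, 50)
        diff = pert_fcst - ctrl_fcst
        pert = amp * diff / np.linalg.norm(diff)            # rescale: the bred vector
        control = ctrl_fcst

    print("bred vector (unit norm):", pert / np.linalg.norm(pert))
    ```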

  16. Resolving Key Uncertainties in Subsurface Energy Recovery: One Role of In Situ Experimentation and URLs (Invited)

    Science.gov (United States)

    Elsworth, D.

    2013-12-01

    Significant uncertainties remain and influence the recovery of energy from the subsurface. These uncertainties include the fate and transport of long-lived radioactive wastes that result from the generation of nuclear power and have been the focus of an active network of international underground research laboratories dating back at least 35 years. However, other nascent carbon-free energy technologies including conventional and EGS geothermal methods, carbon-neutral methods such as carbon capture and sequestration and the utilization of reduced-carbon resources such as unconventional gas reservoirs offer significant challenges in their effective deployment. We illustrate the important role that in situ experiments may play in resolving behaviors at extended length- and time-scales for issues related to chemical-mechanical interactions. Significantly, these include the evolution of transport and mechanical characteristics of stress-sensitive fractured media and their influence of the long-term behavior of the system. Importantly, these interests typically relate to either creating reservoirs (hydroshearing in EGS reservoirs, artificial fractures in shales and coals) or maintaining seals at depth where the permeating fluids may include mixed brines, CO2, methane and other hydrocarbons. Critical questions relate to the interaction of these various fluid mixtures and compositions with the fractured substrate. Important needs are in understanding the roles of key processes (transmission, dissolution, precipitation, sorption and dynamic stressing) on the modification of effective stresses and their influence on the evolution of permeability, strength and induced seismicity on the resulting development of either wanted or unwanted fluid pathways. In situ experimentation has already contributed to addressing some crucial issues of these complex interactions at field scale. Important contributions are noted in understanding the fate and transport of long-lived wastes

  17. Key Process Uncertainties in Soil Carbon Dynamics: Comparing Multiple Model Structures and Observational Meta-analysis

    Science.gov (United States)

    Sulman, B. N.; Moore, J.; Averill, C.; Abramoff, R. Z.; Bradford, M.; Classen, A. T.; Hartman, M. D.; Kivlin, S. N.; Luo, Y.; Mayes, M. A.; Morrison, E. W.; Riley, W. J.; Salazar, A.; Schimel, J.; Sridhar, B.; Tang, J.; Wang, G.; Wieder, W. R.

    2016-12-01

    Soil carbon (C) dynamics are crucial to understanding and predicting C cycle responses to global change and soil C modeling is a key tool for understanding these dynamics. While first order model structures have historically dominated this area, a recent proliferation of alternative model structures representing different assumptions about microbial activity and mineral protection is providing new opportunities to explore process uncertainties related to soil C dynamics. We conducted idealized simulations of soil C responses to warming and litter addition using models from five research groups that incorporated different sets of assumptions about processes governing soil C decomposition and stabilization. We conducted a meta-analysis of published warming and C addition experiments for comparison with simulations. Assumptions related to mineral protection and microbial dynamics drove strong differences among models. In response to C additions, some models predicted long-term C accumulation while others predicted transient increases that were counteracted by accelerating decomposition. In experimental manipulations, doubling litter addition did not change soil C stocks in studies spanning as long as two decades. This result agreed with simulations from models with strong microbial growth responses and limited mineral sorption capacity. In observations, warming initially drove soil C loss via increased CO2 production, but in some studies soil C rebounded and increased over decadal time scales. In contrast, all models predicted sustained C losses under warming. The disagreement with experimental results could be explained by physiological or community-level acclimation, or by warming-related changes in plant growth. In addition to the role of microbial activity, assumptions related to mineral sorption and protected C played a key role in driving long-term model responses. In general, simulations were similar in their initial responses to perturbations but diverged over

  18. Uncertainties under emergency conditions and possible application of fuzzy theory for nuclear safety

    International Nuclear Information System (INIS)

    Nishiwaki, Y.

    1996-01-01

Various uncertainties involved in emergency conditions are discussed, and it is pointed out that the uncertainties in many factors are fuzzy. As a result, it is proposed to use fuzzy theory as an attempt at analysing causes and effects under emergency conditions such as Hiroshima, Nagasaki and other nuclear accidents, and for fuzzy failure analysis and diagnostics of nuclear power plants.

  19. Atmospheric Carbon Dioxide and the Global Carbon Cycle: The Key Uncertainties

    Science.gov (United States)

    Peng, T. H.; Post, W. M.; DeAngelis, D. L.; Dale, V. H.; Farrell, M. P.

    1987-12-01

The biogeochemical cycling of carbon between its sources and sinks determines the rate of increase in atmospheric CO2 concentrations. The observed increase in atmospheric CO2 content is less than the estimated release from fossil fuel consumption and deforestation. This discrepancy can be explained by interactions between the atmosphere and other global carbon reservoirs such as the oceans and the terrestrial biosphere, including soils. Undoubtedly, the oceans have been the most important sinks for CO2 produced by man. But the physical, chemical, and biological processes of the oceans are complex and, therefore, credible estimates of CO2 uptake can probably only come from mathematical models. Unfortunately, one- and two-dimensional ocean models do not allow for enough CO2 uptake to accurately account for known releases. Thus, they produce higher concentrations of atmospheric CO2 than was historically the case. More complex three-dimensional models, while currently being developed, may make better use of existing tracer data than do one- and two-dimensional models and will also incorporate climate feedback effects to provide a more realistic view of ocean dynamics and CO2 fluxes. The inability of current models to accurately estimate oceanic uptake of CO2 creates one of the key uncertainties in predictions of atmospheric CO2 increases and climate responses over the next 100 to 200 years.

  20. Local conditions and uncertainty bands for Semiscale Test S-02-9

    International Nuclear Information System (INIS)

    Varacalle, D.J. Jr.

    1979-01-01

    Analysis was performed to derive local conditions heat transfer parameters and their uncertainties using computer codes and experimentally derived boundary conditions for the Semiscale core for LOCA Test S-02-9. Calculations performed consisted of nominal code cases using best-estimate input parameters and cases where the specified input parameters were perturbed in accordance with the response surface method of uncertainty analysis. The output parameters of interest were those that are used in film boiling heat transfer correlations including enthalpy, pressure, quality, and coolant flow rate. Large uncertainty deviations occurred during low core mass flow periods where the relative flow uncertainties were large. Utilizing the derived local conditions and their associated uncertainties, a study was then made which showed the uncertainty in film boiling heat transfer coefficient varied between 5 and 250%

  1. Quantifying uncertainty in Gulf of Mexico forecasts stemming from uncertain initial conditions

    KAUST Repository

Iskandarani, Mohamed; Le Hénaff, Matthieu; Srinivasan, Ashwanth; Knio, Omar

    2016-01-01

    Polynomial Chaos (PC) methods are used to quantify the impacts of initial conditions uncertainties on oceanic forecasts of the Gulf of Mexico circulation. Empirical Orthogonal Functions are used as initial conditions perturbations with their modal

  2. Uncertainties

    Indian Academy of Sciences (India)

To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle, and its feedback into climate, to be ...

  3. Uncertainty

    International Nuclear Information System (INIS)

    Silva, T.A. da

    1988-01-01

A comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and the International Committee for Weights and Measures (CIPM) is presented for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt]

  4. Intelligent Approach to Inventory Control in Logistics under Uncertainty Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Więcek, P.

    2016-07-01

The article presents a proposal for a combined application of fuzzy logic and genetic algorithms to control the procurement process in an enterprise. The approach presented in this paper draws particular attention to the impact of external random factors in the form of demand and lead-time uncertainty. The model dynamically uses time-variable membership function parameters to describe the modelled output fuzzy values (sets). An additional element is the use of genetic algorithms for the optimisation of the fuzzy rule base in the proposed method. The approach presented in this paper was verified against four criteria in a computer simulation based on actual data from an enterprise. (Author)

  5. Propagation of uncertainties from basic data to key parameters of nuclear reactors

    International Nuclear Information System (INIS)

    Kodeli, I.

    2010-01-01

    The author reports the development of a computing software package (SUSD3D) and of libraries of nuclear data covariance matrices to assess the sensitivities of parameters to basic nuclear data and the corresponding uncertainties, notably for radiation-transport parameters such as reactivity coefficients or neutron and gamma-ray fluxes, whose uncertainty has various origins. Applications to fusion and fission reactors are reported.

  6. Quantifying Key Climate Parameter Uncertainties Using an Earth System Model with a Dynamic 3D Ocean

    Science.gov (United States)

    Olson, R.; Sriver, R. L.; Goes, M. P.; Urban, N.; Matthews, D.; Haran, M.; Keller, K.

    2011-12-01

    Climate projections hinge critically on uncertain climate model parameters such as climate sensitivity, vertical ocean diffusivity and anthropogenic sulfate aerosol forcings. Climate sensitivity is defined as the equilibrium global mean temperature response to a doubling of atmospheric CO2 concentrations. Vertical ocean diffusivity parameterizes sub-grid scale ocean vertical mixing processes. These parameters are typically estimated using Intermediate Complexity Earth System Models (EMICs) that lack a full 3D representation of the oceans, thereby neglecting the effects of mixing on ocean dynamics and meridional overturning. We improve on these studies by employing an EMIC with a dynamic 3D ocean model to estimate these parameters. We carry out historical climate simulations with the University of Victoria Earth System Climate Model (UVic ESCM) varying parameters that affect climate sensitivity, vertical ocean mixing, and effects of anthropogenic sulfate aerosols. We use a Bayesian approach whereby the likelihood of each parameter combination depends on how well the model simulates surface air temperature and upper ocean heat content. We use a Gaussian process emulator to interpolate the model output to an arbitrary parameter setting. We use a Markov chain Monte Carlo method to estimate the posterior probability distribution function (pdf) of these parameters. We explore the sensitivity of the results to prior assumptions about the parameters. In addition, we estimate the relative skill of different observations to constrain the parameters. We quantify the uncertainty in parameter estimates stemming from climate variability, model and observational errors. We explore the sensitivity of key decision-relevant climate projections to these parameters. We find that climate sensitivity and vertical ocean diffusivity estimates are consistent with previously published results. The climate sensitivity pdf is strongly affected by the prior assumptions, and by the scaling
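
    A minimal sketch of the emulator-plus-MCMC workflow outlined above, with a synthetic stand-in for the UVic ESCM runs and purely illustrative parameter ranges, observations, and error values:

```python
# Gaussian process emulator of an "expensive" model plus a Metropolis sampler
# over two uncertain parameters (climate sensitivity, vertical diffusivity).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

def toy_climate_model(cs, kv):
    """Stand-in for an expensive model run: warming as a function of cs and kv."""
    return 0.6 * cs - 0.3 * np.log(kv)

# "Ensemble" of model runs at sampled parameter settings.
X = np.column_stack([rng.uniform(1.5, 6.0, 30), rng.uniform(0.1, 4.0, 30)])
y = np.array([toy_climate_model(cs, kv) for cs, kv in X])
emulator = GaussianProcessRegressor(kernel=RBF(length_scale=[1.0, 1.0]),
                                    normalize_y=True).fit(X, y)

obs, obs_sigma = 1.1, 0.1  # pseudo-observation of warming and its error

def log_post(theta):
    cs, kv = theta
    if not (1.5 < cs < 6.0 and 0.1 < kv < 4.0):   # uniform priors
        return -np.inf
    pred = emulator.predict(np.array([theta]))[0]
    return -0.5 * ((pred - obs) / obs_sigma) ** 2  # Gaussian likelihood

# Simple Metropolis sampler over the two parameters.
theta = np.array([3.0, 1.0])
chain, lp = [], log_post(theta)
for _ in range(20000):
    prop = theta + rng.normal(0, [0.2, 0.1])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

chain = np.array(chain[5000:])  # discard burn-in
print("posterior climate sensitivity: %.2f +/- %.2f" % (chain[:, 0].mean(), chain[:, 0].std()))
```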

  7. Conditional Density Models Integrating Fuzzy and Probabilistic Representations of Uncertainty

    NARCIS (Netherlands)

    R.J. Almeida e Santos Nogueira (Rui Jorge)

    2014-01-01

    Conditional density estimation is an important problem in a variety of areas such as system identification, machine learning, artificial intelligence, empirical economics, macroeconomic analysis, quantitative finance and risk management. This work considers the

  8. Frequency-Domain Robust Performance Condition for Controller Uncertainty in SISO LTI Systems: A Geometric Approach

    Directory of Open Access Journals (Sweden)

    Vahid Raissi Dehkordi

    2009-01-01

    This paper deals with the robust performance problem of a linear time-invariant control system in the presence of controller uncertainty. Assuming that plant uncertainty is modeled as an additive perturbation, a geometrical approach is followed in order to find a necessary and sufficient condition for robust performance in the form of a bound on the magnitude of controller uncertainty. This frequency-domain bound is derived by converting the problem into an optimization problem, whose solution is shown to be more time-efficient than a conventional structured singular value calculation. The bound on controller uncertainty can be used in controller order reduction and implementation problems.

  9. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants’ uncertainties were quantified. • For core simulation, uncertainties of k{sub eff} and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and of the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.

  10. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Chenghui [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Cao, Liangzhi, E-mail: caolz@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Shen, Wei [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2017-04-15

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants’ uncertainties were quantified. • For core simulation, uncertainties of k{sub eff} and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and of the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.
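
    A minimal sketch of statistical-sampling uncertainty propagation in the spirit of the "two-step" scheme, using a toy two-group infinite-medium model in place of NECP-CACTI/NECP-VIOLET; the cross sections and covariances are illustrative only:

```python
# Sample multigroup data from an assumed covariance matrix and propagate the
# samples to a toy multiplication factor.
import numpy as np

rng = np.random.default_rng(2)

# Mean two-group data: [nu*Sigma_f1, nu*Sigma_f2, Sigma_a1, Sigma_a2, Sigma_s12]
mean = np.array([0.007, 0.14, 0.010, 0.10, 0.018])
rel_cov = np.diag([0.02, 0.015, 0.02, 0.015, 0.03]) ** 2   # relative covariance (diagonal here)
cov = rel_cov * np.outer(mean, mean)                       # absolute covariance

def k_inf(xs):
    """Toy "core solve": two-group infinite-medium multiplication factor."""
    nsf1, nsf2, a1, a2, s12 = xs
    return (nsf1 + nsf2 * s12 / a2) / (a1 + s12)

# Step 1: sample the nuclear data; step 2: propagate each sample to k_inf.
samples = rng.multivariate_normal(mean, cov, size=2000)
k = np.apply_along_axis(k_inf, 1, samples)

print(f"k_inf = {k.mean():.5f} +/- {k.std():.5f} ({k.std() / k.mean() * 1e5:.0f} pcm)")
```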

  11. Condition trees as a mechanism for communicating the meaning of uncertainties

    Science.gov (United States)

    Beven, Keith

    2015-04-01

    Uncertainty communication for environmental problems is fraught with difficulty for good epistemic reasons. The fact that most sources of uncertainty are subject to, and often dominated by, epistemic uncertainties means that the unthinking use of probability theory might actually be misleading and lead to false inference (even in some cases where the assumptions of a probabilistic error model might seem to be reasonably valid). This therefore creates problems in communicating the meaning of probabilistic uncertainties of model predictions to potential users (there are many examples in hydrology, hydraulics, climate change and other domains). It is suggested that one way of being more explicit about the meaning of uncertainties is to associate each type of application with a condition tree of assumptions that need to be made in producing an estimate of uncertainty. The condition tree then provides a basis for discussion and communication of assumptions about uncertainties with users. Agreement of assumptions (albeit generally at some institutional level) will provide some buy-in on the part of users, and a basis for commissioning of future studies. Even in some relatively well-defined problems, such as mapping flood risk, such a condition tree can be rather extensive, but by making each step in the tree explicit then an audit trail is established for future reference. This can act to provide focus in the exercise of agreeing more realistic assumptions.
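
    A minimal sketch of how a condition tree might be represented in software so that the chain of assumptions behind an uncertainty estimate can be recorded and audited; the flood-mapping assumptions shown are illustrative placeholders, not those proposed by the author:

```python
# A nested record of assumptions with an audit-trail printout.
from dataclasses import dataclass, field

@dataclass
class Condition:
    statement: str                 # assumption agreed with the users
    agreed_by: str = ""            # who signed off (institution, date, ...)
    children: list = field(default_factory=list)

    def add(self, child: "Condition") -> "Condition":
        self.children.append(child)
        return child

    def audit_trail(self, depth: int = 0):
        note = f"  [{self.agreed_by}]" if self.agreed_by else ""
        print("  " * depth + f"- {self.statement}" + note)
        for c in self.children:
            c.audit_trail(depth + 1)

root = Condition("Flood risk map, 1-in-100-year event")
rain = root.add(Condition("Design rainfall from regional frequency analysis", "agency X"))
rain.add(Condition("Stationarity of rainfall statistics assumed"))
model = root.add(Condition("2D hydraulic model with uniform roughness per land use"))
model.add(Condition("Roughness uncertainty: uniform prior on Manning's n"))

root.audit_trail()   # prints the explicit chain of assumptions for future reference
```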

  12. Influence of Met-Ocean Condition Forecasting Uncertainties on Weather Window Predictions for Offshore Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    2017-01-01

    The article briefly presents a novel methodology of weather window estimation for offshore operations and mainly focuses on the effects of met-ocean condition forecasting uncertainties on weather window predictions when using the proposed methodology. It is demonstrated that the proposed methodology can be extended to include stochastic variables representing met-ocean forecasting uncertainties, and the results of such a modification are given in terms of predicted weather windows for a selected test case.

  13. MODELS OF AIR TRAFFIC CONTROLLERS ERRORS PREVENTION IN TERMINAL CONTROL AREAS UNDER UNCERTAINTY CONDITIONS

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-03-01

    Purpose: the aim of this study is to research applied models of air traffic controllers’ errors prevention in terminal control areas (TMA) under uncertainty conditions. In this work a theoretical framework describing safety events and errors of air traffic controllers connected with operations in the TMA is proposed. Methods: optimisation of the terminal control area formal description based on the Threat and Error Management model and the TMA network model of air traffic flows. Results: the human factors variables associated with safety events in the work of air traffic controllers under uncertainty conditions were obtained. Principles for applying the Threat and Error Management model to air traffic controller operations and the TMA network model of air traffic flows were proposed. Discussion: the information processing context for preventing air traffic controller errors, and examples of threats in the work of air traffic controllers, which are relevant for TMA operations under uncertainty conditions.

  14. The uncertainty cascade in flood risk assessment under changing climatic conditions - the Biala Tarnowska case study

    Science.gov (United States)

    Doroszkiewicz, Joanna; Romanowicz, Renata

    2016-04-01

    Uncertainty in the results of a hydraulic model is not only associated with the limitations of that model and the shortcomings of data. An important factor that has a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as input to the hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EURO-CORDEX project. The study describes the cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability might be very computer-time consuming. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of the simulator substantially reduces the computer requirements related to the derivation of flood risk maps under future climatic conditions. Acknowledgements: This work was supported by the
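
    A minimal sketch of emulating a hydraulic model at a single cross-section with a first-order transfer function fitted by least squares; the "hydraulic model" output here is synthetic, and the model structure and delay are assumed for illustration:

```python
# Fit a discrete-time transfer function y_t = a*y_(t-1) + b*x_(t-d) as a cheap
# surrogate for a hydraulic model at one cross-section.
import numpy as np

rng = np.random.default_rng(3)

# Synthetic upstream inflow (a flood wave) and "hydraulic model" water levels;
# in the real application the levels come from the hydraulic model runs.
t = np.arange(500)
inflow = 50 + 300 * np.exp(-((t - 200) / 40.0) ** 2)
level = np.zeros_like(inflow, dtype=float)
for k in range(1, len(t)):
    level[k] = 0.9 * level[k - 1] + 0.004 * inflow[max(k - 5, 0)]
level += rng.normal(0, 0.01, len(t))   # small numerical noise

# Least-squares fit of the transfer-function parameters (delay d assumed known).
d = 5
k = np.arange(d + 1, len(t))
A = np.column_stack([level[k - 1], inflow[k - d]])
coef, *_ = np.linalg.lstsq(A, level[k], rcond=None)
a_hat, b_hat = coef
print(f"fitted a = {a_hat:.3f}, b = {b_hat:.5f} (true values 0.9 and 0.004)")
```

    Once fitted, the surrogate can be run thousands of times inside the climate-scenario Monte Carlo loop at negligible cost compared with the full hydraulic model.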

  15. An Optimization Method for Condition Based Maintenance of Aircraft Fleet Considering Prognostics Uncertainty

    Directory of Open Access Journals (Sweden)

    Qiang Feng

    2014-01-01

    An optimization method for condition based maintenance (CBM) of an aircraft fleet considering prognostics uncertainty is proposed. The CBM and dispatch process of the aircraft fleet is analyzed first, and the alternative strategy sets for a single aircraft are given. Then, the optimization problem of fleet CBM with lower maintenance cost and dispatch risk is translated into the combinatorial optimization problem of single-aircraft strategies. The remaining useful life (RUL) distribution of the key line replaceable module (LRM) is transformed into the failure probability of the aircraft, and the fleet health status matrix is established. The calculation method for the costs and risks of a mission, based on the health status matrix and the maintenance matrix, is given. Further, an optimization method for fleet dispatch and CBM under acceptable risk is proposed based on an improved genetic algorithm. Finally, a fleet of 10 aircraft is studied to verify the proposed method. The results show that it can realize optimization and control of the aircraft fleet oriented to mission success.
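
    A minimal sketch of the step that turns a prognostic RUL distribution into per-aircraft failure probabilities and a simple dispatch/maintain decision; the RUL parameters, mission length and risk threshold are illustrative assumptions, not values from the paper:

```python
# Convert RUL distributions into mission failure probabilities for a small fleet.
import numpy as np
from scipy.stats import norm

# Predicted RUL (flight hours) of one key LRM per aircraft: mean and std dev.
rul_mean = np.array([120.0, 60.0, 200.0, 35.0])
rul_std = np.array([30.0, 20.0, 50.0, 15.0])

mission_hours = 25.0

# Probability that the LRM fails before the mission ends: P(RUL < mission length).
p_fail = norm.cdf(mission_hours, loc=rul_mean, scale=rul_std)

# Simple dispatch rule: fly only if the failure probability is below an
# acceptable risk level, otherwise schedule condition-based maintenance.
acceptable_risk = 0.05
decision = np.where(p_fail < acceptable_risk, "dispatch", "maintain")

for i, (p, d) in enumerate(zip(p_fail, decision)):
    print(f"aircraft {i}: P(fail during mission) = {p:.3f} -> {d}")
```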

  16. Temperature acclimation of photosynthesis and respiration: A key uncertainty in the carbon cycle-climate feedback

    Science.gov (United States)

    Lombardozzi, Danica L.; Bonan, Gordon B.; Smith, Nicholas G.; Dukes, Jeffrey S.; Fisher, Rosie A.

    2015-10-01

    Earth System Models typically use static responses to temperature to calculate photosynthesis and respiration, but experimental evidence suggests that many plants acclimate to prevailing temperatures. We incorporated representations of photosynthetic and leaf respiratory temperature acclimation into the Community Land Model, the terrestrial component of the Community Earth System Model. These processes increased terrestrial carbon pools by 20 Pg C (22%) at the end of the 21st century under a business-as-usual (Representative Concentration Pathway 8.5) climate scenario. Including the less certain estimates of stem and root respiration acclimation increased terrestrial carbon pools by an additional 17 Pg C (~40% overall increase). High latitudes gained the most carbon with acclimation, and tropical carbon pools increased least. However, results from both of these regions remain uncertain; few relevant data exist for tropical and boreal plants or for extreme temperatures. Constraining these uncertainties will produce more realistic estimates of land carbon feedbacks throughout the 21st century.

  17. Amphetamine-induced sensitization and reward uncertainty similarly enhance incentive salience for conditioned cues.

    Science.gov (United States)

    Robinson, Mike J F; Anselme, Patrick; Suchomel, Kristen; Berridge, Kent C

    2015-08-01

    Amphetamine and stress can sensitize mesolimbic dopamine-related systems. In Pavlovian autoshaping, repeated exposure to uncertainty of reward prediction can enhance motivated sign-tracking or attraction to a discrete reward-predicting cue (lever-conditioned stimulus; CS+), as well as produce cross-sensitization to amphetamine. However, it remains unknown how amphetamine sensitization or repeated restraint stress interact with uncertainty in controlling CS+ incentive salience attribution reflected in sign-tracking. Here rats were tested in 3 successive phases. First, different groups underwent either induction of amphetamine sensitization or repeated restraint stress, or else were not sensitized or stressed as control groups (either saline injections only, or no stress or injection at all). All next received Pavlovian autoshaping training under either certainty conditions (100% CS-UCS association) or uncertainty conditions (50% CS-UCS association and uncertain reward magnitude). During training, rats were assessed for sign-tracking to the CS+ lever versus goal-tracking to the sucrose dish. Finally, all groups were tested for psychomotor sensitization of locomotion revealed by an amphetamine challenge. Our results confirm that reward uncertainty enhanced sign-tracking attraction toward the predictive CS+ lever, at the expense of goal-tracking. We also reported that amphetamine sensitization promoted sign-tracking even in rats trained under CS-UCS certainty conditions, raising them to sign-tracking levels equivalent to the uncertainty group. Combining amphetamine sensitization and uncertainty conditions did not add together to elevate sign-tracking further above the relatively high levels induced by either manipulation alone. In contrast, repeated restraint stress enhanced subsequent amphetamine-elicited locomotion, but did not enhance CS+ attraction.

  18. Condition monitoring a key component in the preventive maintenance

    International Nuclear Information System (INIS)

    Isar, C.

    2006-01-01

    Preventive maintenance programs are necessary to ensure that nuclear-safety-significant equipment will function when it is supposed to. Diesel generators, pumps, motor-operated valves and air-operated control valves are typically operated every three months. When you drive a car, you depend on a lot of sounds, the feel of the steering wheel and the gauges to determine if the car is running correctly. Similarly, with operating equipment in a power plant, if the sounds or vibration of the equipment, or the gauges and test equipment, indicate a problem or degradation, actions are taken to correct the deficiency. For safety and economic reasons, diagnostic and monitoring systems are of growing interest in all complex industrial production. Diagnostic systems are required to detect, diagnose and localize faulty operating conditions at an early stage in order to prevent severe failures and to enable predictive and condition-oriented maintenance. In this context there is a need for various on-line and off-line condition monitoring and diagnostics, non-destructive inspection techniques and surveillance. The condition monitoring techniques used at the Cernavoda nuclear power plant are presented in this paper. The selection of components and parameters to be monitored, and the monitoring and diagnostics techniques used, are incorporated into a preventive maintenance program. Modern measurement techniques, in combination with advanced computerized data processing and acquisition, open new ways in the field of machine surveillance. The diagnostic capabilities of predictive maintenance technologies have increased in recent years with advances made in sensor technologies. The paper focuses on the following condition monitoring techniques: oil analysis; acoustic leakage monitoring; thermography; valve diagnostics (motor-operated, air-operated and check valves); motor current signature analysis; and vibration monitoring and rotating machine monitoring and diagnostics. For each condition monitoring

  19. Amphetamine-induced sensitization and reward uncertainty similarly enhance incentive salience for conditioned cues

    Science.gov (United States)

    Robinson, Mike J.F.; Anselme, Patrick; Suchomel, Kristen; Berridge, Kent C.

    2015-01-01

    Amphetamine and stress can sensitize mesolimbic dopamine-related systems. In Pavlovian autoshaping, repeated exposure to uncertainty of reward prediction can enhance motivated sign-tracking or attraction to a discrete reward-predicting cue (lever CS+), as well as produce cross-sensitization to amphetamine. However, it remains unknown how amphetamine-sensitization or repeated restraint stress interact with uncertainty in controlling CS+ incentive salience attribution reflected in sign-tracking. Here rats were tested in three successive phases. First, different groups underwent either induction of amphetamine sensitization or repeated restraint stress, or else were not sensitized or stressed as control groups (either saline injections only, or no stress or injection at all). All next received Pavlovian autoshaping training under either certainty conditions (100% CS-UCS association) or uncertainty conditions (50% CS-UCS association and uncertain reward magnitude). During training, rats were assessed for sign-tracking to the lever CS+ versus goal-tracking to the sucrose dish. Finally, all groups were tested for psychomotor sensitization of locomotion revealed by an amphetamine challenge. Our results confirm that reward uncertainty enhanced sign-tracking attraction toward the predictive CS+ lever, at the expense of goal-tracking. We also report that amphetamine sensitization promoted sign-tracking even in rats trained under CS-UCS certainty conditions, raising them to sign-tracking levels equivalent to the uncertainty group. Combining amphetamine sensitization and uncertainty conditions together did not add together to elevate sign-tracking further above the relatively high levels induced by either manipulation alone. In contrast, repeated restraint stress enhanced subsequent amphetamine-elicited locomotion, but did not enhance CS+ attraction. PMID:26076340

  20. Multiple unit root tests under uncertainty over the initial condition : some powerful modifications

    NARCIS (Netherlands)

    Hanck, C.

    We modify the union-of-rejection unit root test of Harvey et al. "Unit Root Testing in Practice: Dealing with Uncertainty over the Trend and Initial Condition" (Harvey, Econom Theory 25:587-636, 2009). This test rejects if either of two different unit root tests rejects but controls the inherent

  1. Key Conditions for Successful Serial Entrepreneurship in Healthcare.

    Science.gov (United States)

    Piron, Cameron

    2017-01-01

    As a serial entrepreneur in the medical device industry, the author embraces Snowdon's (2017) effort to create and stimulate dialogue among experts in health system innovation in an effort to define and support Canada's innovation agenda. In this paper, he outlines some of the attributes and skills that companies need to launch their products and scale their companies. He also identifies the main conditions of an innovation ecosystem that create the necessary infrastructure to enable and support highly successful companies while allowing them to accelerate their growth.

  2. 3-D simulations of M9 earthquakes on the Cascadia Megathrust: Key parameters and uncertainty

    Science.gov (United States)

    Wirth, Erin; Frankel, Arthur; Vidale, John; Marafi, Nasser A.; Stephenson, William J.

    2017-01-01

    Geologic and historical records indicate that the Cascadia subduction zone is capable of generating large, megathrust earthquakes up to magnitude 9. The last great Cascadia earthquake occurred in 1700, and thus there is no direct measure on the intensity of ground shaking or specific rupture parameters from seismic recordings. We use 3-D numerical simulations to generate broadband (0-10 Hz) synthetic seismograms for 50 M9 rupture scenarios on the Cascadia megathrust. Slip consists of multiple high-stress drop subevents (~M8) with short rise times on the deeper portion of the fault, superimposed on a background slip distribution with longer rise times. We find a >4x variation in the intensity of ground shaking depending upon several key parameters, including the down-dip limit of rupture, the slip distribution and location of strong-motion-generating subevents, and the hypocenter location. We find that extending the down-dip limit of rupture to the top of the non-volcanic tremor zone results in a ~2-3x increase in peak ground acceleration for the inland city of Seattle, Washington, compared to a completely offshore rupture. However, our simulations show that allowing the rupture to extend to the up-dip limit of tremor (i.e., the deepest rupture extent in the National Seismic Hazard Maps), even when tapering the slip to zero at the down-dip edge, results in multiple areas of coseismic coastal uplift. This is inconsistent with coastal geologic evidence (e.g., buried soils, submerged forests), which suggests predominantly coastal subsidence for the 1700 earthquake and previous events. Defining the down-dip limit of rupture as the 1 cm/yr locking contour (i.e., mostly offshore) results in primarily coseismic subsidence at coastal sites. We also find that the presence of deep subevents can produce along-strike variations in subsidence and ground shaking along the coast. Our results demonstrate the wide range of possible ground motions from an M9 megathrust earthquake in

  3. The Efficacy of Blue-Green Infrastructure for Pluvial Flood Prevention under Conditions of Deep Uncertainty

    Science.gov (United States)

    Babovic, Filip; Mijic, Ana; Madani, Kaveh

    2017-04-01

    Urban areas around the world are growing in size and importance; however, cities experience elevated risks of pluvial flooding due to the prevalence of impermeable land surfaces within them. Urban planners and engineers encounter a great deal of uncertainty when planning adaptations to these flood risks, due to the interaction of multiple factors such as climate change and land use change. This leads to conditions of deep uncertainty. Blue-Green (BG) solutions utilise natural vegetation and processes to absorb and retain runoff while providing a host of other social, economic and environmental services. When utilised in conjunction with Decision Making under Deep Uncertainty (DMDU) methodologies, BG infrastructure provides a flexible and adaptable method of "no-regret" adaptation; resulting in a practical, economically efficient, and socially acceptable solution for flood risk mitigation. This work presents the methodology for analysing the impact of BG infrastructure in the context of the Adaptation Tipping Points approach to protect against pluvial flood risk in an iterative manner. An economic analysis of the adaptation pathways is also conducted in order to better inform decision-makers on the benefits and costs of the adaptation options presented. The methodology was applied to a case study in the Cranbrook Catchment in the North East of London. Our results show that BG infrastructure performs better under conditions of uncertainty than traditional grey infrastructure.

  4. Assessing the social sustainability contribution of an infrastructure project under conditions of uncertainty

    International Nuclear Information System (INIS)

    Sierra, Leonardo A.; Yepes, Víctor; Pellicer, Eugenio

    2017-01-01

    Assessing the viability of a public infrastructure includes economic, technical and environmental aspects; however, on many occasions the social aspects are not adequately considered. This article proposes a procedure to estimate the social sustainability of infrastructure projects under conditions of uncertainty, based on a multicriteria deterministic method. The variability of the method inputs is provided by the decision-makers. Uncertain inputs are treated through uniform and beta PERT distributions. The Monte Carlo method is used to propagate uncertainty through the method. A case study of a road infrastructure improvement in El Salvador is used to illustrate this treatment. The main results give the variability of the short- and long-term social improvement indices for each infrastructure alternative and the probability of each position in the prioritization of the alternatives. The proposed mechanism improves the reliability of decision making early in infrastructure projects, taking their social contribution into account. The results can complement environmental and economic sustainability assessments. - Highlights: •Estimates the social sustainability of infrastructure projects under conditions of uncertainty •The method uses multicriteria and Monte Carlo techniques and beta PERT distributions •Determines the variability of the short- and long-term social improvement •Determines the probability in the prioritization of alternatives •Improves the reliability of decision making considering the social contribution
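
    A minimal sketch of propagating elicited inputs through a weighted social index with Monte Carlo sampling of uniform and beta PERT distributions; the criteria, weights and ranges are hypothetical, not those of the El Salvador case study:

```python
# Monte Carlo propagation of uniform and beta-PERT inputs to a weighted index,
# then the probability that one alternative ranks above another.
import numpy as np

rng = np.random.default_rng(4)
n = 20000

def pert(a, m, b, size):
    """Sample a beta-PERT distribution with minimum a, mode m, maximum b."""
    alpha = 1 + 4 * (m - a) / (b - a)
    beta = 1 + 4 * (b - m) / (b - a)
    return a + (b - a) * rng.beta(alpha, beta, size)

# Two alternatives scored on two criteria (0-1 scale), with weights summing to 1.
weights = np.array([0.6, 0.4])
alt_A = np.column_stack([pert(0.4, 0.6, 0.9, n), rng.uniform(0.3, 0.7, n)])
alt_B = np.column_stack([pert(0.3, 0.7, 0.8, n), rng.uniform(0.5, 0.9, n)])

score_A, score_B = alt_A @ weights, alt_B @ weights
print(f"mean index A = {score_A.mean():.3f} +/- {score_A.std():.3f}")
print(f"mean index B = {score_B.mean():.3f} +/- {score_B.std():.3f}")
print(f"P(A ranked above B) = {(score_A > score_B).mean():.2f}")
```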

  5. Improving Chemical EOR Simulations and Reducing the Subsurface Uncertainty Using Downscaling Conditioned to Tracer Data

    KAUST Repository

    Torrealba, Victor A.

    2017-10-02

    Recovery mechanisms are more likely to be influenced by grid-block size and reservoir heterogeneity in Chemical EOR (CEOR) than in conventional Water Flood (WF) simulations. Grid upscaling based on single-phase flow is a common practice in WF simulation models, where simulation grids are coarsened to perform history matching and sensitivity analyses within affordable computational times. This coarse grid resolution (typically about 100 ft.) could be sufficient in WF, however, it usually fails to capture key physical mechanisms in CEOR. In addition to increased numerical dispersion in coarse models, these models tend to artificially increase the level of mixing between the fluids and may not have enough resolution to capture different length scales of geological features to which EOR processes can be highly sensitive. As a result of which, coarse models usually overestimate the sweep efficiency, and underestimate the displacement efficiency. Grid refinement (simple downscaling) can resolve artificial mixing but appropriately re-creating the fine-scale heterogeneity, without degrading the history-match conducted on the coarse-scale, remains a challenge. Because of the difference in recovery mechanisms involved in CEOR, such as miscibility and thermodynamic phase split, the impact of grid downscaling on CEOR simulations is not well understood. In this work, we introduce a geostatistical downscaling method conditioned to tracer data to refine a coarse history-matched WF model. This downscaling process is necessary for CEOR simulations when the original (fine) earth model is not available or when major disconnects occur between the original earth model and the history-matched coarse WF model. The proposed downscaling method is a process of refining the coarse grid, and populating the relevant properties in the newly created finer grid cells. The method considers the values of rock properties in the coarse grid as hard data, and the corresponding variograms and property

  6. Degradation and performance evaluation of PV module in desert climate conditions with estimate uncertainty in measuring

    Directory of Open Access Journals (Sweden)

    Fezzani Amor

    2017-01-01

    The performance of a photovoltaic (PV) module is affected by outdoor conditions. Outdoor testing consists of installing a module and collecting electrical performance data and climatic data over a certain period of time. It can also include the study of long-term performance under real working conditions. Tests were carried out at URAER, located in the desert region of Ghardaïa (Algeria), which is characterized by high irradiation and temperature levels. The degradation of a PV module with temperature and time of exposure to sunlight contributes significantly to the final output from the module, as the output decreases each year. This paper presents a comparative study of different methods to evaluate the degradation of a PV module after long-term exposure of more than 12 years in a desert region, and calculates the uncertainties in the measurements. First, the evaluation uses three methods: visual inspection, data given by the Solmetric PVA-600 analyzer translated to Standard Test Conditions (STC), and the translation equations of IEC 60891. Second, the degradation rates are calculated for all methods. Finally, a comparison is made between the degradation rates given by the Solmetric PVA-600 analyzer, calculated by a simulation model, and calculated by two methods (IEC 60891 procedures 1 and 2). A detailed uncertainty study was carried out in order to improve the procedure and the measurement instrument.
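
    A minimal sketch of estimating a degradation rate from initial and STC-translated measured peak power and propagating the two measurement uncertainties with a first-order (GUM-style) combination; the numbers are illustrative, not the Ghardaïa test data:

```python
# Degradation rate (%/year) from two power measurements, with uncertainty.
import math

p0, u_p0 = 55.0, 1.5   # initial Pmax at STC (W) and its standard uncertainty
p_t, u_pt = 46.0, 1.2  # Pmax translated to STC after exposure (W) and its uncertainty
years = 12.0

rd = (1.0 - p_t / p0) / years * 100.0   # degradation rate, %/year

# First-order propagation of the two measurement uncertainties:
# Rd = (100/years) * (1 - p_t/p0), so dRd/dp_t = -100/(years*p0)
# and dRd/dp0 = 100*p_t/(years*p0**2).
d_dpt = -100.0 / (years * p0)
d_dp0 = 100.0 * p_t / (years * p0 ** 2)
u_rd = math.hypot(d_dpt * u_pt, d_dp0 * u_p0)

print(f"degradation rate = {rd:.2f} +/- {u_rd:.2f} %/year")
```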

  7. ENTERPRISE OPERATION PLANNING IN THE CONDITIONS OF RISK AND UNCERTAINTY IN THE EXTERNAL AND INTERNAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Titov V. V.

    2017-09-01

    Optimization of enterprise activity planning taking into account the risk and uncertainty of the external and internal environment is a complex scientific and methodological problem. Its solution is important for planning practice, so the relevance of this research topic is beyond doubt. Planning is based on the use of a multilevel system of models. At the top level, the achievement of key strategic indicators is ensured by the development and implementation of innovations, mainly related to planning the release of new high-tech products. However, it is at this level that risks and uncertainties have the greatest impact on the planning processes for the development, production and marketing of new products. In the scientific literature it is proposed to use stochastic graphs with returns for this purpose. This idea is also supported in this work; however, the implementation of such an idea requires additional methodological developments and quantitative calculations. The coordination of strategic decisions with tactical plans is based on the idea of eliminating, in tactical planning, the economic and other risks associated with the economic activity of the enterprise by creating stochastic reserves based on the implementation of additional innovations that ensure above-target sales volumes, profits and other indicators of the strategic plan. The organization of operational management of production is represented as an iterative, sliding process (reducing risks in production), which is realized taking into account the limitations of tactical control.

  8. Figural properties are prioritized for search under conditions of uncertainty: Setting boundary conditions on claims that figures automatically attract attention.

    Science.gov (United States)

    Peterson, Mary A; Mojica, Andrew J; Salvagio, Elizabeth; Kimchi, Ruth

    2017-01-01

    Nelson and Palmer (2007) concluded that figures/figural properties automatically attract attention, after they found that participants were faster to detect/discriminate targets appearing where a portion of a familiar object was suggested in an otherwise ambiguous display. We investigated whether these effects are truly automatic and whether they generalize to another figural property-convexity. We found that Nelson and Palmer's results do generalize to convexity, but only when participants are uncertain regarding when and where the target will appear. Dependence on uncertainty regarding target location/timing was also observed for familiarity. Thus, although we could replicate and extend Nelson and Palmer's results, our experiments showed that figures do not automatically draw attention. In addition, our research went beyond Nelson and Palmer's, in that we were able to separate figural properties from perceived figures. Because figural properties are regularities that predict where objects lie in the visual field, our results join other evidence that regularities in the environment can attract attention. More generally, our results are consistent with Bayesian theories in which priors are given more weight under conditions of uncertainty.

  9. Directed International Technological Change and Climate Policy: New Methods for Identifying Robust Policies Under Conditions of Deep Uncertainty

    Science.gov (United States)

    Molina-Perez, Edmundo

    It is widely recognized that international environmental technological change is key to reducing the rapidly rising greenhouse gas emissions of emerging nations. In 2010, the United Nations Framework Convention on Climate Change (UNFCCC) Conference of the Parties (COP) agreed to the creation of the Green Climate Fund (GCF). This new multilateral organization has been created with the collective contributions of COP members, and has been tasked with directing over USD 100 billion per year towards investments that can enhance the development and diffusion of clean energy technologies in both advanced and emerging nations (Helm and Pichler, 2015). The landmark agreement reached at COP 21 has reaffirmed the key role that the GCF plays in enabling climate mitigation, as it is now necessary to align large-scale climate financing efforts with the long-term goals agreed at Paris 2015. This study argues that, because of the incomplete understanding of the mechanics of international technological change, the multiplicity of policy options and ultimately the presence of climate and technological deep uncertainty, climate financing institutions such as the GCF require new analytical methods for designing long-term robust investment plans. Motivated by these challenges, this dissertation shows that the application of new analytical methods, such as Robust Decision Making (RDM) and Exploratory Modeling (Lempert, Popper and Bankes, 2003), to the study of international technological change and climate policy provides useful insights that can be used for designing a robust architecture of international technological cooperation for climate change mitigation. For this study I developed an exploratory dynamic integrated assessment model (EDIAM) which is used as the scenario generator in a large computational experiment. The scope of the experimental design considers an ample set of climate and technological scenarios. These scenarios combine five sources of uncertainty

  10. Effects of Uncertainties in Electric Field Boundary Conditions for Ring Current Simulations

    Science.gov (United States)

    Chen, Margaret W.; O'Brien, T. Paul; Lemon, Colby L.; Guild, Timothy B.

    2018-01-01

    Physics-based simulation results can vary widely depending on the applied boundary conditions. As a first step toward assessing the effect of boundary conditions on ring current simulations, we analyze the uncertainty of the cross-polar cap potential (CPCP) electric field boundary conditions applied to the Rice Convection Model-Equilibrium (RCM-E). The empirical Weimer model of CPCP is chosen as the reference model and Defense Meteorological Satellite Program CPCP measurements as the reference data. Using temporal correlations from a statistical analysis of the "errors" between the reference model and data, we construct a Monte Carlo CPCP discrete time series model that can be generalized to other model boundary conditions. RCM-E simulations using electric field boundary conditions from the reference model and from 20 randomly generated Monte Carlo discrete time series of CPCP are performed for two large storms. During the 10 August 2000 storm main phase, the proton density at 10 RE at midnight was observed to be low, and the observed Dst index is bounded by the simulated Dst values. In contrast, the simulated Dst values during the recovery phases of the 10 August 2000 and 31 August 2005 storms tend to systematically underestimate the observed late Dst recovery. This suggests a need to improve the accuracy of particle loss calculations in the RCM-E model. Application of this technique can help modelers make efficient choices between investing more effort in improving the specification of boundary conditions and improving descriptions of physical processes.
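
    A minimal sketch of building Monte Carlo boundary-condition time series as a reference CPCP series plus temporally correlated errors (here a simple AR(1) process); the reference series, error statistics and correlation are assumed for illustration:

```python
# Generate an ensemble of CPCP boundary-condition time series from a reference
# model plus AR(1)-correlated "errors".
import numpy as np

rng = np.random.default_rng(5)

hours = np.arange(0, 72)
cpcp_ref = 60 + 40 * np.exp(-((hours - 24) / 10.0) ** 2)   # reference CPCP (kV)

sigma_err = 15.0   # standard deviation of the CPCP "error" (kV), assumed
rho = 0.8          # lag-1 autocorrelation of the error, assumed

def monte_carlo_cpcp(n_members):
    members = []
    for _ in range(n_members):
        err = np.zeros_like(cpcp_ref)
        err[0] = rng.normal(0, sigma_err)
        for k in range(1, len(hours)):
            err[k] = rho * err[k - 1] + rng.normal(0, sigma_err * np.sqrt(1 - rho**2))
        members.append(np.clip(cpcp_ref + err, 10.0, None))   # keep CPCP physical
    return np.array(members)

ensemble = monte_carlo_cpcp(20)
print("ensemble spread at storm peak (kV):", round(float(ensemble[:, 24].std()), 1))
```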

  11. Quantifying uncertainty in Gulf of Mexico forecasts stemming from uncertain initial conditions

    KAUST Repository

    Iskandarani, Mohamed

    2016-06-09

    Polynomial Chaos (PC) methods are used to quantify the impacts of initial conditions uncertainties on oceanic forecasts of the Gulf of Mexico circulation. Empirical Orthogonal Functions are used as initial conditions perturbations with their modal amplitudes considered as uniformly distributed uncertain random variables. These perturbations impact primarily the Loop Current system and several frontal eddies located in its vicinity. A small ensemble is used to sample the space of the modal amplitudes and to construct a surrogate for the evolution of the model predictions via a nonintrusive Galerkin projection. The analysis of the surrogate yields verification measures for the surrogate's reliability and statistical information for the model output. A variance analysis indicates that the sea surface height predictability in the vicinity of the Loop Current is limited to about 20 days.
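
    A minimal sketch of a non-intrusive polynomial chaos surrogate in a single uniformly distributed EOF amplitude, built from a small ensemble of runs of a synthetic stand-in model; the projection here uses a least-squares fit of Legendre polynomials rather than the Galerkin quadrature of the study:

```python
# Build a Legendre-polynomial surrogate from a few "model" runs and use it to
# estimate forecast statistics cheaply.
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(6)

def model_run(xi):
    """Stand-in for an ocean forecast quantity (e.g., SSH at a point) as a
    function of the uniformly distributed EOF modal amplitude xi in [-1, 1]."""
    return 0.3 + 0.15 * xi + 0.05 * xi**2 + 0.01 * np.sin(3 * xi)

# Small ensemble at fixed nodes, then projection onto Legendre polynomials
# (orthogonal for a uniform variable on [-1, 1]).
xi_nodes = np.linspace(-1, 1, 9)
runs = model_run(xi_nodes)
coeffs = legendre.legfit(xi_nodes, runs, deg=4)

# The surrogate is cheap to sample: estimate the forecast mean and spread.
xi_mc = rng.uniform(-1, 1, 100000)
surrogate = legendre.legval(xi_mc, coeffs)
print(f"forecast mean = {surrogate.mean():.4f}, std = {surrogate.std():.4f}")
```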

  12. Application of uncertainty analysis method for calculations of accident conditions for RP AES-2006

    International Nuclear Information System (INIS)

    Zajtsev, S.I.; Bykov, M.A.; Zakutaev, M.O.; Siryapin, V.N.; Petkevich, I.G.; Siryapin, N.V.; Borisov, S.L.; Kozlachkov, A.N.

    2015-01-01

    An analysis of some accidents using uncertainty assessment methods is given. The list of variable parameters incorporated the model parameters of the computer codes, the initial and boundary conditions of the reactor plant, and the neutronics data. On the basis of the calculations of the accident conditions performed using the statistical method, an assessment of the errors in the determination of the main parameters compared with the acceptance criteria is presented. It was shown that in the investigated accidents the values of the calculated parameters, with account for their errors obtained from the TRAP-KS and KORSAR/GP codes, do not exceed the established acceptance criteria. Moreover, these values do not exceed the values obtained in the conservative calculations. The possibility in principle of the practical application of the uncertainty estimation method to justify the safety of the WWER AES-2006, using the thermal-physical codes KORSAR/GP and TRAP-KS and the PANDA and SUSA programs, was demonstrated [ru]

  13. Recent developments in predictive uncertainty assessment based on the model conditional processor approach

    Directory of Open Access Journals (Sweden)

    G. Coccia

    2011-10-01

    The work aims at discussing the role of predictive uncertainty in flood forecasting and flood emergency management, its relevance to improving the decision-making process, and the techniques to be used for its assessment.

    Real-time flood forecasting requires taking into account predictive uncertainty for a number of reasons. Deterministic hydrological/hydraulic forecasts give useful information about real future events, but their predictions, as usually done in practice, cannot be taken and used as real future occurrences; rather, they should be used as pseudo-measurements of future occurrences in order to reduce the uncertainty of decision makers. Predictive Uncertainty (PU) is in fact defined as the probability of occurrence of a future value of a predictand (such as water level, discharge or water volume) conditional upon prior observations and knowledge as well as on all the information we can obtain on that specific future value from model forecasts. When dealing with commensurable quantities, as in the case of floods, PU must be quantified in terms of a probability distribution function which will be used by the emergency managers in their decision process in order to improve the quality and reliability of their decisions.

    After introducing the concept of PU, the presently available processors are introduced and discussed in terms of their benefits and limitations. In this work the Model Conditional Processor (MCP) has been extended to allow the use of two joint Truncated Normal Distributions (TNDs), in order to improve adaptation to low and high flows.

    The paper concludes by showing the results of the application of the MCP on two case studies, the Po river in Italy and the Baron Fork river, OK, USA. In the Po river case the data provided by the Civil Protection of the Emilia Romagna region have been used to implement an operational example, where the predicted variable is the observed water level. In the Baron Fork River
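
    A minimal sketch of the conditioning idea behind a model conditional processor: fit a joint Gaussian to historical observations and model forecasts (in normal space) and condition on a new forecast to obtain the predictive distribution; the data are synthetic, and the normal quantile transform and truncated-normal extension used in the study are omitted for brevity:

```python
# Bivariate-Gaussian conditioning of observations on a model forecast.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

# Historical "observed" levels and model forecasts (already in normal space here
# for simplicity; in practice a normal quantile transform of the raw series is
# applied first).
obs = rng.normal(0, 1, 1000)
fcst = 0.8 * obs + rng.normal(0, 0.5, 1000)   # imperfect model

# Fit the joint Gaussian and condition on a new forecast value.
mu_o, mu_f = obs.mean(), fcst.mean()
s_o, s_f = obs.std(), fcst.std()
r = np.corrcoef(obs, fcst)[0, 1]

f_new = 1.5                                      # today's model forecast
mu_cond = mu_o + r * s_o / s_f * (f_new - mu_f)  # predictive mean
s_cond = s_o * np.sqrt(1 - r**2)                 # predictive std (reduced PU)

print(f"P(level > 1.0 | forecast) = {1 - norm.cdf(1.0, mu_cond, s_cond):.2f}")
```

    The reduction of the predictive standard deviation by the factor sqrt(1 - r^2) is what makes a skilful forecast valuable to the decision maker.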

  14. Collaborative testing for key-term definitions under representative conditions: Efficiency costs and no learning benefits.

    Science.gov (United States)

    Wissman, Kathryn T; Rawson, Katherine A

    2018-01-01

    Students are expected to learn key-term definitions across many different grade levels and academic disciplines. Thus, investigating ways to promote understanding of key-term definitions is of critical importance for applied purposes. A recent survey showed that learners report engaging in collaborative practice testing when learning key-term definitions, with outcomes also shedding light on the way in which learners report engaging in collaborative testing in real-world contexts (Wissman & Rawson, 2016, Memory, 24, 223-239). However, no research has directly explored the effectiveness of engaging in collaborative testing under representative conditions. Accordingly, the current research evaluates the costs (with respect to efficiency) and the benefits (with respect to learning) of collaborative testing for key-term definitions under representative conditions. In three experiments (ns = 94, 74, 95), learners individually studied key-term definitions and then completed retrieval practice, which occurred either individually or collaboratively (in dyads). Two days later, all learners completed a final individual test. Results from Experiments 1-2 showed a cost (with respect to efficiency) and no benefit (with respect to learning) of engaging in collaborative testing for key-term definitions. Experiment 3 evaluated a theoretical explanation for why collaborative benefits do not emerge under representative conditions. Collectively, outcomes indicate that collaborative testing versus individual testing is less effective and less efficient when learning key-term definitions under representative conditions.

  15. Predictive Uncertainty Estimation in Water Demand Forecasting Using the Model Conditional Processor

    Directory of Open Access Journals (Sweden)

    Amos O. Anele

    2018-04-01

    In a previous paper, a number of potential models for short-term water demand (STWD) prediction were analysed to find the ones with the best fit. The results obtained in Anele et al. (2017) showed that hybrid models may be considered as accurate and appropriate forecasting models for STWD prediction. However, such a best single-valued forecast does not guarantee reliable and robust decisions, which can be properly obtained via model uncertainty processors (MUPs). MUPs provide an estimate of the full predictive densities and not only the single-valued expected prediction. Amongst other MUPs, the purpose of this paper is to use the multi-variate version of the model conditional processor (MCP), proposed by Todini (2008), to demonstrate how the estimation of the predictive probability conditional on a number of relatively good predictive models may improve our knowledge, thus reducing the predictive uncertainty (PU) when forecasting into the unknown future. Through the MCP approach, the probability distribution of the future water demand can be assessed depending on the forecast provided by one or more deterministic forecasting models. Based on average weekly data of 168 h, the probability density of the future demand is built conditional on three models’ predictions, namely the autoregressive-moving average (ARMA), feed-forward back propagation neural network (FFBP-NN) and a hybrid model (i.e., combined forecast from ARMA and FFBP-NN). The results obtained show that MCP may be effectively used for real-time STWD prediction since it brings out the PU connected to its forecast, and such information could help water utilities estimate the risk connected to a decision.

  16. Optimizing an Investment Solution in Conditions of Uncertainty and Risk as a Multicriterial Task

    Directory of Open Access Journals (Sweden)

    Kotsyuba Oleksiy S.

    2017-10-01

    The article is concerned with the methodology for optimizing investment decisions under conditions of uncertainty and risk. The subject area of the study relates, first of all, to real investment. The problem of modeling an optimal investment solution is considered as a multicriteria task. The constructive part of the publication is based on the position that the multicriteria nature of the objectives of investment planning is the result, first, of the complex nature of the category of economic attractiveness (efficiency) of real investment and, second, of the need to take into account the risk factor, which is a vector measure, in the preparation of an investment solution. An attempt has been made to develop a toolkit for optimizing investment decisions in a situation of uncertainty and the risk it engenders, based on the aggregation (roll-up) of local criteria. As a result, a model has been proposed whose advantage is that it takes into account, to a greater extent than standardized roll-up options, the substantive and formal features of the local (detailed) criteria.

  17. Automatic Threshold Setting and Its Uncertainty Quantification in Wind Turbine Condition Monitoring System

    DEFF Research Database (Denmark)

    Marhadi, Kun Saptohartyadi; Skrimpas, Georgios Alexandros

    2015-01-01

    Setting optimal alarm thresholds in a vibration-based condition monitoring system is inherently difficult. There are no established thresholds for many vibration-based measurements. Most of the time, the thresholds are set based on the statistics of the collected data available. Often the underlying probability distribution that describes the data is not known. Choosing an incorrect distribution to describe the data and then setting up thresholds based on the chosen distribution could result in sub-optimal thresholds. Moreover, in wind turbine applications the collected data available may not represent the whole range of operating conditions of a turbine, which results in uncertainty in the parameters of the fitted probability distribution and in the thresholds calculated. In this study, Johnson, Normal, and Weibull distributions are investigated to determine which distribution can best fit vibration data collected
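
    A minimal sketch of fitting candidate distributions to a vibration feature, comparing the fits, and quantifying the uncertainty of a high-percentile alarm threshold by bootstrapping; the data are synthetic and the Johnson family is omitted for brevity:

```python
# Compare candidate distribution fits and bootstrap the 99th-percentile threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
data = rng.weibull(2.0, 300) * 5.0   # synthetic vibration amplitudes

# Compare candidate fits with the Kolmogorov-Smirnov statistic.
candidates = {"normal": stats.norm, "weibull": stats.weibull_min}
for name, dist in candidates.items():
    params = dist.fit(data)
    ks = stats.kstest(data, dist.cdf, args=params).statistic
    print(f"{name:8s} KS statistic = {ks:.3f}")

# Alarm threshold = 99th percentile of the chosen fit, with bootstrap uncertainty.
thresholds = []
for _ in range(500):
    resample = rng.choice(data, size=len(data), replace=True)
    params = stats.weibull_min.fit(resample, floc=0)
    thresholds.append(stats.weibull_min.ppf(0.99, *params))
thresholds = np.array(thresholds)
print(f"threshold = {thresholds.mean():.2f} +/- {thresholds.std():.2f}")
```

    The bootstrap spread makes explicit how the limited and possibly unrepresentative data translate into uncertainty in the alarm threshold itself.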

  18. Theoretical outline of supplier relationship management in conditions of economic uncertainty

    Directory of Open Access Journals (Sweden)

    Vladut Iacob

    2012-12-01

    Full Text Available Internet facilities have created new ways to identify, negotiate and engage suppliers and partners worldwide. Adding value to organizations, supply chain management aims at streamlining all processes and communication channels between them and their main suppliers, to facilitate effective interactions and flawless. Critical conditions generated by the current crisis grew and deepened the importance of supply chain management concept and the leading organizations realized that business partners might be a key element for their success. Meanwhile, global exchange of information can improve business processes for better access to resources.

  19. Determination of a PWR key neutron parameters uncertainties and conformity studies applications; Determination des incertitudes liees aux grandeurs neutroniques d'interet des reacteurs a eau pressurisee a plaques combustible et applications aux etudes de conformite

    Energy Technology Data Exchange (ETDEWEB)

    Bernard, D

    2002-07-01

    The aim of this thesis was to evaluate the uncertainties of key neutron parameters of slab reactors. The uncertainty sources have many origins: technological origins for the fabrication parameters and physical origins for the nuclear data. First, each contribution to the uncertainty is calculated and, finally, an uncertainty factor is associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation, and lifetime. These uncertainty factors were computed by Generalized Perturbation Theory for the step-0 case and by direct calculations for the irradiation problems. One of the neutronic conformity applications concerned the adjustment of the precision targets for fabrication and nuclear data. Statistical (uncertainties) and deterministic (deviations) approaches were studied. The uncertainties of the key slab neutron parameters were thereby reduced and the nuclear performances optimised. (author)

  1. Investing in Uncertainty: Young Adults with Life-Limiting Conditions Achieving Their Developmental Goals.

    Science.gov (United States)

    Cook, Karen A; Jack, Susan M; Siden, Hal; Thabane, Lehana; Browne, Gina

    2016-08-01

    With improvements in pediatric care and technology, more young adults (YAs) with life-limiting conditions (LLCs) are surviving into adulthood. However, they have limited expectations to live beyond the first decade of adulthood. This study describes the monumental efforts required for YAs with LLCs to achieve their goals in an abbreviated life. The experiences and aspirations of YAs with LLCs to achieve their goals are relatively unknown. This report focuses on their experiences of living with uncertainty and its impact on achieving developmental goals. This study is one component of a larger descriptive study using an innovative bulletin board focus group to examine life experiences of YAs with LLCs. YAs with LLCs share the aspirations and goals of all YAs. Some participants demonstrated a striking capacity to navigate system barriers and achieve their goals, whereas others "got stuck" resulting in lost opportunities. Successful personal life investments were possible if resources were made available, coordinated, navigable, and responsive to new and special requests. Transformative changes to health, social care, and community services are necessary to support their YA ambitions. This study gave voice to those who were previously unheard and demonstrates the monumental hurdles YAs with LLCs face to achieve their goals. A palliative approach to care can mitigate unnecessary hardships and support their goals.

  2. Uncertainty evaluation of EnPIs in industrial applications as a key factor in setting improvement actions

    Science.gov (United States)

    D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.

    2015-11-01

    A methodology is proposed assuming high-level Energy Performance Indicators (EnPIs) uncertainty as a quantitative indicator of the evolution of an Energy Management System (EMS). Motivations leading to the selection of the EnPIs, uncertainty evaluation techniques and criteria supporting decision-making are discussed, in order to plan and pursue reliable measures for energy performance improvement. In this paper, problems, priorities, operative possibilities and reachable improvement limits are examined, starting from the measurement uncertainty assessment. Two different industrial cases are analysed with reference to the following aspects: absence/presence of an energy management policy and action plans; responsibility level for energy issues; employees’ training and motivation with respect to energy problems; absence/presence of adequate infrastructures for monitoring and sharing of energy information; level of standardization and integration of methods and procedures linked to the energy activities; economic and financial resources for the improvement of energy efficiency. A critical and comparative analysis of the obtained results is carried out. The methodology, experimentally validated, allows useful considerations to be developed for effective, realistic and economically feasible improvement plans, depending on the specific situation. Recursive application of the methodology provides a reliable and well-resolved assessment of the EMS status, also in dynamic industrial contexts.

  3. Uncertainties in predicting rice yield by current crop models under a wide range of climatic conditions

    NARCIS (Netherlands)

    Li, T.; Hasegawa, T.; Yin, X.; Zhu, Y.; Boote, K.; Adam, M.; Bregaglio, S.; Buis, S.; Confalonieri, R.; Fumoto, T.; Gaydon, D.; Marcaida III, M.; Nakagawa, H.; Oriol, P.; Ruane, A.C.; Ruget, F.; Singh, B.; Singh, U.; Tang, L.; Yoshida, H.; Zhang, Z.; Bouman, B.

    2015-01-01

    Predicting rice (Oryza sativa) productivity under future climates is important for global food security. Ecophysiological crop models in combination with climate model outputs are commonly used in yield prediction, but uncertainties associated with crop models remain largely unquantified. We

  4. Key Parameters for Operator Diagnosis of BWR Plant Condition during a Severe Accident

    Energy Technology Data Exchange (ETDEWEB)

    Clayton, Dwight A [ORNL; Poore III, Willis P [ORNL

    2015-01-01

    The objective of this research is to examine the key information needed from nuclear power plant instrumentation to guide severe accident management and mitigation for boiling water reactor (BWR) designs (specifically, a BWR/4-Mark I), estimate environmental conditions that the instrumentation will experience during a severe accident, and identify potential gaps in existing instrumentation that may require further research and development. This report notes the key parameters that instrumentation needs to measure to help operators respond to severe accidents. A follow-up report will assess severe accident environmental conditions as estimated by severe accident simulation model analysis for a specific US BWR/4-Mark I plant for those instrumentation systems considered most important for accident management purposes.

  5. Uncertainties in radioactivity release from LWR plants under LOCA conditions - magnitude and consequences

    International Nuclear Information System (INIS)

    Mattila, L.J.

    1977-01-01

    Standardized, deterministic, and supposedly conservative calculation methods and parameter values are applied in radiological safety analyses required for licensing individual nuclear power plants. Analyses that are as realistic and comprehensive as possible are, however, absolutely necessary for many purposes, such as developing improved designs, comparisons between nuclear and non-nuclear power plant alternatives or entire energy production strategies, and also formulating improved acceptance criteria for plant licensing. A specific type of LOCA, called design basis accident (DBA), has obtained an exceptionally important status in the licensing procedure of light water reactor nuclear power plants. This postulated accident has a decisive influence on plant siting and on the design of the various engineered safety features. To avoid certain potential negative effects of the highly standardized guideline-based accident analysis procedure - such as introduction of apparent design ''improvements'', wrong prioritization of research efforts, etc. - and to provide a realistic view about the safety of light water reactors to supplement the conservative results from regulatory analyses, a comprehensive understanding of the radiological consequences of LOCAs is indispensable. Estimates of fission product release from LWR plants under different LOCA conditions are associated with uncertainties due to deficient knowledge and truly random variability. The following steps of the fission product transport chain are discussed: generation of activity, fission product release from fuel to fuel pin voids prior to the accident, fuel rod puncturing and fission product release from punctured rods during the accident, further release from fuel during the transient, transport to the containment and finally removal in and leakage from the containment. Numerical examples are given by comparing assumptions, parameter values, and results from the following four analyses: the present guideline

  6. Which key properties controls the preferential transport in the vadose zone under transient hydrological conditions

    Science.gov (United States)

    Groh, J.; Vanderborght, J.; Puetz, T.; Gerke, H. H.; Rupp, H.; Wollschlaeger, U.; Stumpp, C.; Priesack, E.; Vereecken, H.

    2015-12-01

    % arrival time and potential key soil properties, site factors and boundary conditions will be presented in order to identify key properties which control the preferential transport in the vadose zone under transient hydrological conditions.

  7. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.
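
    As a minimal, hedged illustration of one of the strategies referred to (a GUM-style "bottom-up" budget; the component names and values are invented for the example, not taken from the chapter), independent standard uncertainty contributions are combined in quadrature and then expanded with a coverage factor of about 2 for roughly 95 % coverage:

        import math

        # Hypothetical standard uncertainty contributions for one analytical result,
        # all expressed in the same unit as the result (e.g. mg/L).
        u_components = {"repeatability": 0.012, "calibration": 0.008, "volumetric": 0.005}

        u_combined = math.sqrt(sum(u ** 2 for u in u_components.values()))  # combined standard uncertainty
        U_expanded = 2.0 * u_combined  # expanded uncertainty, coverage factor k = 2 (~95 % confidence)

        print(f"u_c = {u_combined:.4f}, U(k=2) = {U_expanded:.4f}")

    The expanded uncertainty U is what would then be compared against a specification limit when deciding whether a result is fit for purpose.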

  8. Economic-mathematical substantiation of optimizing the use of technical means, to perform tasks in conditions of uncertainty

    Directory of Open Access Journals (Sweden)

    I. V. Kuksova

    2017-01-01

    Full Text Available This article presents an economic-mathematical substantiation of an approach to choosing technical means for the survey of airfields, and a mechanism for applying multiple statistical criteria of optimality and usefulness to the decisions taken in this matter when operating under conditions of uncertainty. In many socio-economic areas today, managerial decisions must frequently be taken in an environment of conflict and competition, in which several broadly rational actors make collective decisions and the benefit of each depends not only on its own chosen strategy but also on the management decisions of the other partners and on the outcomes of experiments. It is therefore necessary to develop and substantiate optimal variants for choosing the forces and means needed to perform tasks under conditions of uncertainty, an approach that is also applicable to military units. A current practical problem is the optimization of the management of engineering and airfield support, whose components perform their tasks under conditions of uncertainty. Analysis of the capabilities of technical means (unmanned aerial vehicles) shows that, if they are equipped with the appropriate instrumentation, their use can be considered as part of a complex of technical means for inspecting airfields, in particular the runway, after enemy action. The scientific goal of this article is therefore to examine the possibilities of using such technical means for the inspection tasks of the engineering and airfield services, and the aim of the study is to use mathematical methods to justify the choice of the most effective means, from the point of view of the economic cost of its introduction and use, when performing tasks under conditions of uncertainty.

  9. Establishing key components of yoga interventions for musculoskeletal conditions: a Delphi survey

    Science.gov (United States)

    2014-01-01

    Background Evidence suggests yoga is a safe and effective intervention for the management of physical and psychosocial symptoms associated with musculoskeletal conditions. However, heterogeneity in the components and reporting of clinical yoga trials impedes both the generalization of study results and the replication of study protocols. The aim of this Delphi survey was to address these issues of heterogeneity, by developing a list of recommendations of key components for the design and reporting of yoga interventions for musculoskeletal conditions. Methods Recognised experts involved in the design, conduct, and teaching of yoga for musculoskeletal conditions were identified from a systematic review, and invited to contribute to the Delphi survey. Forty-one of the 58 experts contacted, representing six countries, agreed to participate. A three-round Delphi was conducted via electronic surveys. Round 1 presented an open-ended question, allowing panellists to individually identify components they considered key to the design and reporting of yoga interventions for musculoskeletal conditions. Thematic analysis of Round 1 identified items for quantitative rating in Round 2; items not reaching consensus were forwarded to Round 3 for re-rating. Results Thirty-six panellists (36/41; 88%) completed the three rounds of the Delphi survey. Panellists provided 348 comments to the Round 1 question. These comments were reduced to 49 items, grouped under five themes, for rating in subsequent rounds. A priori group consensus of ≥80% was reached on 28 items related to five themes concerning defining the yoga intervention, types of yoga practices to include in an intervention, delivery of the yoga protocol, domains of outcome measures, and reporting of yoga interventions for musculoskeletal conditions. Additionally, a priori consensus of ≥50% was reached on five items relating to minimum values for intervention parameters. Conclusions Expert consensus has provided a non

  10. A simulation study of organizational decision making under conditions of uncertainty and ambiguity.

    OpenAIRE

    Athens, Arthur J.

    1983-01-01

    Approved for public release; distribution is unlimited. The usual frameworks applied to the analysis of military decision making describe the decision process according to the rational model. The assumptions inherent in this model, however, are not consistent with the reality of warfare's inherent uncertainty and complexity. A better model is needed to address the ambiguity actually confronting the combat commander. The garbage can model of organizational choice, a nonrational approach to...

  11. Uncertainties under emergency conditions in Hiroshima and Nagasaki in 1945 and Bikini accident in 1954

    International Nuclear Information System (INIS)

    Nishiwaki, Y.; Kawai, H.; Shono, N.; Fujita, S.; Matsuoka, H.; Fujiwara, S.; Hosoda, T.

    2000-01-01

    When an atomic bomb is exploded, in addition to ionizing radiation, strong non-ionizing radiation such as infrared, ultraviolet and visible light and electromagnetic pulse radiation, as well as heat and shock waves, are produced. The survivors and those who visited Hiroshima immediately after the atomic bombing could have been subjected to a number of other possible noxious effects in addition to atomic radiation. Hospitals, laboratories, drugstores, pharmaceutical works, storehouses of chemicals, factories, etc. that were situated close to the hypocenter were all completely destroyed and various mutagenic, carcinogenic or teratogenic substances must have been released; many doctors, nurses and chemists were killed. There was no medical care and no food in the region of high dose exposure and the drinking water was contaminated. There would have been various possibilities of infection. Mental stress would also have been much higher in the survivors closer to the hypocenter. It is therefore difficult to determine which factor played the dominant role. In addition, there would be problems in accurately identifying the position of the exposed persons at the time of the atomic bombing and also in estimating the shielding factors. There may be considerable uncertainty in human memory under such conditions. It is also possible that there was a large store of gasoline for the transportation of the army corps in Hiroshima. Therefore there is a possibility that various toxic substances, mutagenic or carcinogenic agents such as benzopyrene and other radiomimetic substances, and chemical weapons (Yperit, Lewisite, etc.) could have been released from the various facilities destroyed at the time of the atomic bombing. After the German surrender in May 1945, it was reported in Japan in June that the USA might attempt a landing on the Japanese mainland and might be planning massive use of chemical weapons all over Japan on that occasion. In preparation for such a case, chemical officers

  12. Conditional inevitability: Expert perceptions of carbon capture and storage uncertainties in the UK context

    International Nuclear Information System (INIS)

    Evar, Benjamin

    2011-01-01

    This paper presents findings on expert perceptions of uncertainty in carbon capture and storage (CCS) technology and policy in the UK, through survey data and semi-structured interviews with 19 individual participants. Experts were interviewed in industry, research, and non-governmental organisations (NGOs) in the summer of 2009 and were asked to comment on a range of technical processes as well as policy concerns. The survey revealed that perceptions of the technology conform to a 'certainty trough' with users expressing the lowest level of uncertainty, and outsiders expressing the highest level of uncertainty. The interviews revealed that experts express certitude in the prospects for deploying large-scale CCS technology in the UK, all the while questioning several underlying technical and policy premises that are necessary to ensure this goal. - Highlights: → Expert perceptions of CCS in the UK are reported in interviews and a survey. → Surveyed perceptions conform to a 'certainty trough'. → Experts express certitude in prospect of large-scale CCS deployment in the UK. → Experts state that several technical and policy premises are necessary to ensure this goal. → Prospect of large-scale CCS deployment is observed to be highly belief-based.

  13. Conditional inevitability: Expert perceptions of carbon capture and storage uncertainties in the UK context

    Energy Technology Data Exchange (ETDEWEB)

    Evar, Benjamin, E-mail: ben.evar@ed.ac.uk [Scottish Carbon Capture and Storage, School of Geosciences, University of Edinburgh, Drummond Street, Edinburgh EH8 9XP (United Kingdom)

    2011-06-15

    This paper presents findings on expert perceptions of uncertainty in carbon capture and storage (CCS) technology and policy in the UK, through survey data and semi-structured interviews with 19 individual participants. Experts were interviewed in industry, research, and non-governmental organisations (NGOs) in the summer of 2009 and were asked to comment on a range of technical processes as well as policy concerns. The survey revealed that perceptions of the technology conform to a 'certainty trough' with users expressing the lowest level of uncertainty, and outsiders expressing the highest level of uncertainty. The interviews revealed that experts express certitude in the prospects for deploying large-scale CCS technology in the UK, all the while questioning several underlying technical and policy premises that are necessary to ensure this goal. - Highlights: > Expert perceptions of CCS in the UK are reported in interviews and a survey. > Surveyed perceptions conform to a 'certainty trough'. > Experts express certitude in prospect of large-scale CCS deployment in the UK. > Experts state that several technical and policy premises are necessary to ensure this goal. > Prospect of large-scale CCS deployment is observed to be highly belief-based.

  14. Influence of postharvest processing and storage conditions on key antioxidants in pūhā (Sonchus oleraceus L.)

    DEFF Research Database (Denmark)

    Ou, Zong-Quan; Schmierer, David M; Strachan, Clare J

    2014-01-01

    To investigate effects of different postharvest drying processes and storage conditions on key antioxidants in Sonchus oleraceus L. leaves.

  15. The fertility response to the Great Recession in Europe and the United States: Structural economic conditions and perceived economic uncertainty

    Directory of Open Access Journals (Sweden)

    Chiara Ludovica Comolli

    2017-05-01

    Full Text Available Background: This study further develops Goldstein et al.'s (2013) analysis of the fertility response to the Great Recession in western economies. Objective: The purpose of this paper is to shed light on the fertility reaction to different indicators of the crisis. Beyond the structural labor market conditions, I investigate the dependence of fertility rates on economic policy uncertainty, government financial risk, and consumer confidence. Methods: Following Goldstein et al. (2013), I use log-log models to assess the elasticity of age-, parity-, and education-specific fertility rates to an array of indicators. Besides the inclusion of a wider set of explanatory variables, I include more recent data (2000−2013) and I enlarge the sample to 31 European countries plus the United States. Results: Fertility response to unemployment in some age- and parity-specific groups has been, in more recent years, larger than estimated by Goldstein et al. (2013). Female unemployment has also been significantly reducing fertility rates. Among uncertainty measures, the drop in consumer confidence is strongly related to fertility decline and in Southern European countries the fertility response to sovereign debt risk is comparable to that of unemployment. Economic policy uncertainty is negatively related to TFR even when controlling for unemployment. Conclusions: Theoretical and empirical investigation is needed to develop more tailored measures of economic and financial insecurity and their impact on birth rates. Contribution: The study shows the nonnegligible influence of economic and financial uncertainty on birth rates during the Great Recession in Western economies, over and above that of structural labor market conditions.
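
    As a hedged illustration of the log-log specification referred to above (the variable names and fixed-effect structure are generic placeholders, not the paper's exact regression), the coefficient on the logged indicator is read directly as an elasticity:

        \[
        \ln f_{a,p,c,t} = \alpha + \beta \,\ln u_{c,t-1} + \gamma_{c} + \delta_{t} + \varepsilon_{a,p,c,t}
        \]

    where f is an age- (a) and parity- (p) specific fertility rate in country c and year t, u is the lagged unemployment rate or another crisis indicator, gamma and delta are country and year effects, and beta is the estimated percentage change in fertility for a one-percent change in the indicator.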

  16. Assessing River Low-Flow Uncertainties Related to Hydrological Model Calibration and Structure under Climate Change Conditions

    Directory of Open Access Journals (Sweden)

    Mélanie Trudel

    2017-03-01

    Full Text Available Low-flow is the flow of water in a river during prolonged dry weather. This paper investigated the uncertainty originating from hydrological model calibration and structure in low-flow simulations under climate change conditions. Two hydrological models of contrasting complexity, GR4J and SWAT, were applied to four sub-watersheds of the Yamaska River, Canada. The two models were calibrated using seven different objective functions including the Nash-Sutcliffe coefficient (NSEQ) and six other objective functions more related to low flows. The uncertainty in the model parameters was evaluated using a PARAmeter SOLutions procedure (PARASOL). Twelve climate projections from different combinations of General Circulation Models (GCMs) and Regional Circulation Models (RCMs) were used to simulate low-flow indices in a reference (1970–2000) and future (2040–2070) horizon. Results indicate that the NSEQ objective function does not properly represent low-flow indices for either model. The NSE objective function applied to the log of the flows shows the lowest total variance for all sub-watersheds. In addition, these hydrological models should be used with care for low-flow studies, since they both show some inconsistent results. The uncertainty is higher for SWAT than for GR4J. With GR4J, the uncertainties in the simulations for the 7Q2 index (the 7-day low-flow value with a 2-year return period) are lower for the future period than for the reference period. This can be explained by the analysis of hydrological processes. In the future horizon, a significant worsening of low-flow conditions was projected.
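
    For reference, the two calibration objectives contrasted in the abstract can be written out explicitly (standard textbook definitions; the paper's exact variants are not reproduced here). The Nash-Sutcliffe efficiency on raw flows (NSEQ) and on log-transformed flows, the latter weighting low-flow errors more heavily, are

        \[
        \mathrm{NSE}_{Q} = 1 - \frac{\sum_{t}\bigl(Q^{\mathrm{obs}}_{t}-Q^{\mathrm{sim}}_{t}\bigr)^{2}}
                                    {\sum_{t}\bigl(Q^{\mathrm{obs}}_{t}-\overline{Q^{\mathrm{obs}}}\bigr)^{2}},
        \qquad
        \mathrm{NSE}_{\ln Q} = 1 - \frac{\sum_{t}\bigl(\ln Q^{\mathrm{obs}}_{t}-\ln Q^{\mathrm{sim}}_{t}\bigr)^{2}}
                                        {\sum_{t}\bigl(\ln Q^{\mathrm{obs}}_{t}-\overline{\ln Q^{\mathrm{obs}}}\bigr)^{2}}.
        \]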

  17. Transient flow conditions in probabilistic wellhead protection: importance and ways to manage spatial and temporal uncertainty in capture zone delineation

    Science.gov (United States)

    Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.

    2012-12-01

    "From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction that any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This will provide water managers additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of super-positioned flow components with time-variable coefficients. We assume an instantaneous development of steady-state flow conditions after each temporal change in driving forces, following

  18. Investment and Decommissioning Decisions under Conditions of Uncertainty: An Application to the Electricity Sector

    International Nuclear Information System (INIS)

    Chaton, Corinne

    2001-01-01

    The purpose of this study is to use real options theory to answer the following question: Is it necessary, in France, to invest in new nuclear power units or should some of the existing units be decommissioned? The theoretical model developed establishes two price thresholds which determine investment or decommissioning rules for a regulated risk-neutral firm which does not know the future price of its input. It also provides an empirical reading of past choices in the construction of French nuclear power plants. The main finding is that, under a certain number of theoretical and empirical assumptions, it is optimal at present to leave French nuclear power capacity unchanged. Other more general findings follow from the theoretical model. Thus an increase in uncertainty facilitates investment, defers decommissioning and extends the range of input prices for which there is no change in capacity.

  19. Life prediction of different commercial dental implants as influence by uncertainties in their fatigue material properties and loading conditions.

    Science.gov (United States)

    Pérez, M A

    2012-12-01

    Probabilistic analyses allow the effect of uncertainty in system parameters to be determined. In the literature, many researchers have investigated static loading effects on dental implants. However, the intrinsic variability and uncertainty of most of the main problem parameters are not accounted for. The objective of this research was to apply a probabilistic computational approach to predict the fatigue life of three different commercial dental implants considering the variability and uncertainty in their fatigue material properties and loading conditions. For one of the commercial dental implants, the influence of its diameter in the fatigue life performance was also studied. This stochastic technique was based on the combination of a probabilistic finite element method (PFEM) and a cumulative damage approach known as B-model. After 6 million of loading cycles, local failure probabilities of 0.3, 0.4 and 0.91 were predicted for the Lifecore, Avinent and GMI implants, respectively (diameter of 3.75mm). The influence of the diameter for the GMI implant was studied and the results predicted a local failure probability of 0.91 and 0.1 for the 3.75mm and 5mm, respectively. In all cases the highest failure probability was located at the upper screw-threads. Therefore, the probabilistic methodology proposed herein may be a useful tool for performing a qualitative comparison between different commercial dental implants. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  20. Representational uncertainty in the brain during threat conditioning and the link with psychopathic traits

    NARCIS (Netherlands)

    Brazil, I.A.; Mathys, C.D.; Popma, A.; Hoppenbrouwers, S.S.; Cohn, M.D.

    2017-01-01

    Background: Psychopathy has repeatedly been linked to disturbed associative learning from aversive events (i.e., threat conditioning). Optimal threat conditioning requires the generation of internal representations of stimulus-outcome contingencies and the rate with which these may change. Because

  1. Modelling management process of key drivers for economic sustainability in the modern conditions of economic development

    Directory of Open Access Journals (Sweden)

    Pishchulina E.S.

    2017-01-01

    Full Text Available The article addresses the management of the key drivers of a manufacturing enterprise's economic sustainability and the assessment of that sustainability as the central aspect of such management. These issues have become topical as new requirements emerge for the methods of managing manufacturing enterprises in the modern market economy. The economic sustainability model considered in the article integrates enterprise economic growth, the economic balance of the external and internal environment, and economic sustainability itself. The proposed method for assessing the economic sustainability of a manufacturing enterprise makes it possible to reveal weaknesses in enterprise performance and untapped reserves, which can then be used to improve the economic sustainability and efficiency of the enterprise. Managing the economic sustainability of a manufacturing enterprise is one of the most important factors of business functioning and development in a modern market economy, and its relevance is growing in accordance with the objective requirements of increasing volumes of production and sales, the increasing complexity of economic relations, and the changing external environment of the enterprise.

  2. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)

  3. Improving Chemical EOR Simulations and Reducing the Subsurface Uncertainty Using Downscaling Conditioned to Tracer Data

    KAUST Repository

    Torrealba, Victor A.; Hoteit, Hussein; Chawathe, Adwait

    2017-01-01

    and thermodynamic phase split, the impact of grid downscaling on CEOR simulations is not well understood. In this work, we introduce a geostatistical downscaling method conditioned to tracer data to refine a coarse history-matched WF model. This downscaling process

  4. Effective Strategy Formation Models for Inventory Management under the Conditions of Uncertainty

    Science.gov (United States)

    Kosorukov, Oleg Anatolyevich; Sviridova, Olga Alexandrovna

    2015-01-01

    The article deals with the problem of modeling the commodity flows management of a trading company under the conditions of uncertain demand and long supply. The Author presents an analysis of modifications of diversified inventory management system with random demand, for which one can find the optimal inventory control strategies, including those…

  5. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NARCIS (Netherlands)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS

  6. Khorasan wheat population researching (Triticum turgidum, ssp. turanicum McKey) in the minimum tillage conditions

    Directory of Open Access Journals (Sweden)

    Ikanović Jela

    2014-01-01

    Full Text Available Khorasan wheat occupies a special place in the group of new-old cereals (Triticum turgidum ssp. turanicum McKey). It is an ancient species, native to eastern Persia, that is morphologically very close to durum wheat. Investigations were carried out under the agro-ecological conditions of eastern Srem, with two wheat populations, with dark and with bright awns, as the objects of study. The following morphological and productive characteristics were investigated: plant height (PH), spike length (SH), number of spikelets per spike (NSS), absolute weight (AW), grain weight per spike (GW), seed germination (G) and grain yield (YG). Field micro-experiments were set up on carbonate chernozem soil on a loess plateau in 2011 and 2012. Wheat was sown by hand in early March with a drill row spacing of 12 cm. The experiment was established as a complete randomized block design with four replications. Crop tending measures were not applied during the growing season. Plants were grown without the use of NPK mineral nutrients. Chemical crop protection measures were not applied, although powdery mildew (Erysiphe graminis) appeared to a small extent before spike formation. The results showed that both populations have genetic yield potential. In general, both populations showed satisfactory tolerance to lodging and there was no seed dispersal. Plants from the bright-awned population were taller and had longer spikes and a larger number of spikelets per spike. However, plants from the dark-awned population had a higher absolute weight and grain weight per spike, as well as a higher grain yield per plant. Strong correlations were identified among the investigated characteristics. Determining the correlations, as well as the direct and indirect effects, made it easier to understand the mutual relationships and to balance them in order to improve the yield per unit area. [Project of the Ministry of Science of the Republic of Serbia, No. TR 31078 and No. TR 31022]

  7. Key risk indicators for accident assessment conditioned on pre-crash vehicle trajectory.

    Science.gov (United States)

    Shi, X; Wong, Y D; Li, M Z F; Chai, C

    2018-08-01

    Accident events are generally unexpected and occur rarely. Pre-accident risk assessment by surrogate indicators is an effective way to identify risk levels and thus boost accident prediction. Herein, the concept of Key Risk Indicator (KRI) is proposed, which assesses risk exposures using hybrid indicators. Seven metrics are shortlisted as the basic indicators in KRI, with evaluation in terms of risk behaviour, risk avoidance, and risk margin. A typical real-world chain-collision accident and its antecedent (pre-crash) road traffic movements are retrieved from surveillance video footage, and a grid remapping method is proposed for data extraction and coordinates transformation. To investigate the feasibility of each indicator in risk assessment, a temporal-spatial case-control is designed. By comparison, Time Integrated Time-to-collision (TIT) performs better in identifying pre-accident risk conditions; while Crash Potential Index (CPI) is helpful in further picking out the severest ones (the near-accident). Based on TIT and CPI, the expressions of KRIs are developed, which enable us to evaluate risk severity with three levels, as well as the likelihood. KRI-based risk assessment also reveals predictive insights about a potential accident, including at-risk vehicles, locations and time. Furthermore, straightforward thresholds are defined flexibly in KRIs, since the impact of different threshold values is found not to be very critical. For better validation, another independent real-world accident sample is examined, and the two results are in close agreement. Hierarchical indicators such as KRIs offer new insights about pre-accident risk exposures, which is helpful for accident assessment and prediction. Copyright © 2018 Elsevier Ltd. All rights reserved.
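
    As a hedged illustration of one of the basic indicators behind the KRIs (Time Integrated Time-to-collision in its commonly used surrogate-safety form; the threshold, sampling interval and trace below are invented for the example, not the paper's calibration):

        import numpy as np

        def time_integrated_ttc(ttc, dt, ttc_threshold=3.0):
            """TIT: area between the threshold and the TTC profile while TTC stays
            below the threshold, accumulated over the observation period."""
            ttc = np.asarray(ttc, dtype=float)
            below = ttc < ttc_threshold
            return float(np.sum((ttc_threshold - ttc[below]) * dt))

        # Hypothetical time-to-collision trace (seconds) sampled every 0.1 s
        # for one vehicle pair extracted from video trajectories.
        ttc_trace = [8.0, 5.2, 3.4, 2.6, 1.9, 1.4, 1.8, 2.7, 4.0]
        print(f"TIT = {time_integrated_ttc(ttc_trace, dt=0.1):.2f} s^2")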

  8. THE TAKING OF MARKETING DECISIONS IN CONDITIONS OF UNCERTAINTY AND RISK

    OpenAIRE

    Vladimir GROSU

    2011-01-01

    The choice of method of developing marketing decision alternatives and their evaluation depends significantly on the availability of initial information. The decision maker may not possess all the information or possess insufficiently precise information about external environmental conditions and their changes in the future. Such a situation is often observed in the case of macro-environment factors, but can also refer to the company's microenvironment. Given these circumstances, in differen...

  9. The improvement in management of material and production reserves of the enterprise in conditions of uncertainty using the fuzzy sets

    Directory of Open Access Journals (Sweden)

    Raskatova M.I.

    2017-01-01

    Full Text Available The article suggests a method for managing the material and production reserves of industrial enterprises under conditions of uncertainty. The method proposes that some of the benchmark quantities in the economic-mathematical models used to determine the volume and completion time of orders for raw materials be represented as fuzzy numbers. Applying fuzzy-set theory makes it possible to take the uncertainty of the external environment into account without resorting to probability theory, whose use is often difficult in practice. The criterion for choosing the optimal management strategy is the minimization of the objective function of the total cost connected with reserves management. It is suggested that the benchmarks expressed as fuzzy numbers be obtained by expert evaluation. Using the results of the suggested approach, the decision-maker will be able to form a concrete strategy for reserves management in a constantly changing market situation. The suggested technique is universal and can be applied in various industries.
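
    A minimal sketch of the kind of fuzzy representation such a method relies on (a triangular fuzzy number with centroid defuzzification; the class, values and the lead-time example are illustrative assumptions, not the article's model):

        from dataclasses import dataclass

        @dataclass
        class TriangularFuzzy:
            """Triangular fuzzy number (low, mode, high), e.g. an expert estimate such as
            'the order completion time is about 14 days, somewhere between 10 and 21'."""
            low: float
            mode: float
            high: float

            def membership(self, x: float) -> float:
                # Degree to which a crisp value x belongs to the fuzzy estimate.
                if self.low < x <= self.mode:
                    return (x - self.low) / (self.mode - self.low)
                if self.mode < x < self.high:
                    return (self.high - x) / (self.high - self.mode)
                return 1.0 if x == self.mode else 0.0

            def defuzzify(self) -> float:
                # Centroid of the triangle: a common crisp substitute for use in cost models.
                return (self.low + self.mode + self.high) / 3.0

        lead_time = TriangularFuzzy(low=10, mode=14, high=21)  # hypothetical expert estimate, days
        print(lead_time.membership(16), lead_time.defuzzify())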

  10. Uncertainty of the Soil–Water Characteristic Curve and Its Effects on Slope Seepage and Stability Analysis under Conditions of Rainfall Using the Markov Chain Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    Weiping Liu

    2017-10-01

    Full Text Available It is important to determine the soil–water characteristic curve (SWCC) for analyzing slope seepage and stability under the conditions of rainfall. However, SWCCs exhibit high uncertainty because of complex influencing factors, which has not been previously considered in slope seepage and stability analysis under conditions of rainfall. This study aimed to evaluate the uncertainty of the SWCC and its effects on the seepage and stability analysis of an unsaturated soil slope under conditions of rainfall. The SWCC model parameters were treated as random variables. An uncertainty evaluation of the parameters was conducted based on the Bayesian approach and the Markov chain Monte Carlo (MCMC) method. Observed data from granite residual soil were used to test the uncertainty of the SWCC. Then, different confidence intervals for the model parameters of the SWCC were constructed. The slope seepage and stability analysis under conditions of rainfall with the SWCC of different confidence intervals was investigated using finite element software (SEEP/W and SLOPE/W). The results demonstrated that SWCC uncertainty had significant effects on slope seepage and stability. In general, the larger the percentile value, the greater the reduction of negative pore-water pressure in the soil layer and the lower the safety factor of the slope. Uncertainties in the model parameters of the SWCC can lead to obvious errors in predicted pore-water pressure profiles and the estimated safety factor of the slope under conditions of rainfall.
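
    A compact sketch of the Bayesian/MCMC step described above, using a random-walk Metropolis sampler over van Genuchten parameters (the observations, priors, step sizes and chain length are invented for illustration; the study's actual data and sampler settings are not reproduced here):

        import numpy as np

        def van_genuchten(psi, theta_r, theta_s, alpha, n):
            """Volumetric water content as a function of suction psi (van Genuchten form)."""
            m = 1.0 - 1.0 / n
            return theta_r + (theta_s - theta_r) / (1.0 + (alpha * psi) ** n) ** m

        def log_posterior(params, psi, theta_obs, sigma=0.01):
            theta_r, theta_s, alpha, n = params
            if not (0.0 <= theta_r < theta_s <= 0.6 and alpha > 0.0 and n > 1.0):
                return -np.inf  # flat prior with physical bounds
            resid = theta_obs - van_genuchten(psi, theta_r, theta_s, alpha, n)
            return -0.5 * np.sum((resid / sigma) ** 2)

        # Hypothetical laboratory SWCC observations (suction in kPa, water content).
        psi = np.array([1.0, 10.0, 30.0, 100.0, 300.0, 1000.0])
        theta_obs = np.array([0.45, 0.42, 0.37, 0.30, 0.24, 0.20])

        rng = np.random.default_rng(1)
        current = np.array([0.10, 0.45, 0.05, 1.6])   # theta_r, theta_s, alpha, n
        step = np.array([0.01, 0.01, 0.005, 0.05])    # random-walk proposal widths
        chain = []
        for _ in range(20000):
            proposal = current + rng.normal(0.0, step)
            if np.log(rng.random()) < log_posterior(proposal, psi, theta_obs) - log_posterior(current, psi, theta_obs):
                current = proposal
            chain.append(current.copy())
        chain = np.array(chain[5000:])                # discard burn-in
        print("posterior 2.5/50/97.5 percentiles:", np.percentile(chain, [2.5, 50, 97.5], axis=0))

    Percentiles of the retained chain give the parameter confidence intervals that are then propagated through the seepage and stability models.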

  11. Robust Production Planning in Fashion Apparel Industry under Demand Uncertainty via Conditional Value at Risk

    Directory of Open Access Journals (Sweden)

    Abderrahim Ait-Alla

    2014-01-01

    Full Text Available This paper presents a mathematical model for robust production planning. The model helps fashion apparel suppliers in making decisions concerning allocation of production orders to different production plants characterized by different lead times and production costs, and in proper time scheduling and sequencing of these production orders. The model aims at optimizing these decisions concerning objectives of minimal production costs and minimal tardiness. It considers several factors such as the stochastic nature of customer demand, differences in production and transport costs and transport times between production plants in different regions. Finally, the model is applied to a case study. The results of numerical computations are presented. The implications of the model results on different fashion related product types and delivery strategies, as well as the model’s limitations and potentials for expansion, are discussed. Results indicate that the production planning model using conditional value at risk (CVaR) as the risk measure performs robustly and provides flexibility in decision analysis between different scenarios.
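
    As a hedged illustration of the risk measure used (a generic sample-based CVaR estimator over cost scenarios; the scenario distribution and numbers are placeholders, not the paper's case-study data):

        import numpy as np

        def var_cvar(losses, alpha=0.95):
            """Value at Risk and Conditional Value at Risk estimated from scenario losses:
            CVaR is the mean loss over roughly the worst (1 - alpha) share of the scenarios."""
            losses = np.sort(np.asarray(losses, dtype=float))
            var_index = int(np.ceil(alpha * len(losses))) - 1
            var = losses[var_index]
            cvar = losses[var_index:].mean()
            return float(var), float(cvar)

        # Hypothetical total-cost scenarios (production cost plus tardiness penalties)
        # for one candidate allocation of production orders.
        rng = np.random.default_rng(7)
        scenario_costs = rng.normal(100_000, 15_000, size=1_000)
        var95, cvar95 = var_cvar(scenario_costs, alpha=0.95)
        print(f"VaR95 = {var95:,.0f}, CVaR95 = {cvar95:,.0f}")

    In a robust planning model, the objective would minimize a weighted combination of expected cost and this CVaR term rather than the expected cost alone.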

  12. SIMULATION OF CARS ACCUMULATION PROCESSES FOR SOLVING TASKS OF OPERATIONAL PLANNING IN CONDITIONS OF INITIAL INFORMATION UNCERTAINTY

    Directory of Open Access Journals (Sweden)

    О. A. Tereshchenko

    2017-06-01

    Full Text Available Purpose. The article develops a methodological basis for simulating car accumulation processes when solving operational planning problems under conditions of uncertain initial information, in order to assess the sustainability of the adopted planning scenario and to calculate the associated technological risks. Methodology. The solution of the problem under investigation is based on general scientific approaches, the apparatus of probability theory and the theory of fuzzy sets. To this end, the factors influencing the entropy of operational plans are systematized. It is established that, when planning the operational work of railway stations, sections and nodes, the most significant factors causing uncertainty in the initial information are: (a) conditions external to the railway facility in question, expressed as uncertainty in the timing of car arrivals; and (b) external, hard-to-identify goals of other participants in the logistics chain (primarily customers), expressed as uncertainty in the completion time of operations with the freight cars. It is suggested that these factors be taken into account in automated planning through statistical analysis, i.e. the establishment and study of the remaining-time (prediction) errors. As a result, analytical dependencies are proposed for a rational representation of the probability density functions of the time-residual distribution in the form of point, piecewise-defined and continuous analytic models. The developed car accumulation models, whose application depends on the identified states of the predicted incoming car flow to the accumulation system, are then presented. The last proposed model is a general case of models of accumulation processes with an arbitrary level of reliability of the initial information for any structure of the incoming car flow. In conclusion, a technique for estimating the results of

  13. Effects of Heterogeneity and Uncertainties in Sources and Initial and Boundary Conditions on Spatiotemporal Variations of Groundwater Levels

    Science.gov (United States)

    Zhang, Y. K.; Liang, X.

    2014-12-01

    Effects of aquifer heterogeneity and uncertainties in source/sink, and initial and boundary conditions in a groundwater flow model on the spatiotemporal variations of groundwater level, h(x,t), were investigated. Analytical solutions for the variance and covariance of h(x, t) in an unconfined aquifer described by a linearized Boussinesq equation with a white noise source/sink and a random transmissivity field were derived. It was found that in a typical aquifer the error in h(x,t) in early time is mainly caused by the random initial condition and the error reduces as time goes to reach a constant error in later time. The duration during which the effect of the random initial condition is significant may last a few hundred days in most aquifers. The constant error in groundwater in later time is due to the combined effects of the uncertain source/sink and flux boundary: the closer to the flux boundary, the larger the error. The error caused by the uncertain head boundary is limited in a narrow zone near the boundary but it remains more or less constant over time. The effect of the heterogeneity is to increase the variation of groundwater level and the maximum effect occurs close to the constant head boundary because of the linear mean hydraulic gradient. The correlation of groundwater level decreases with temporal interval and spatial distance. In addition, the heterogeneity enhances the correlation of groundwater level, especially at larger time intervals and small spatial distances.

  14. Determination of uncertainty of automated emission measuring systems under field conditions using a second method as a reference

    Energy Technology Data Exchange (ETDEWEB)

    Puustinen, H.; Aunela-Tapola, L.; Tolvanen, M.; Vahlman, T. [VTT Chemical Technology, Espoo (Finland). Environmental Technology; Kovanen, K. [VTT Building Technology, Espoo (Finland). Building Physics, Building Services and Fire Technology

    1999-09-01

    This report presents a procedure to determine the uncertainty of an automated emission measuring system (AMS) by comparing the results with a second method (REF). The procedure determines the uncertainty of AMS by comparing the final concentration and emission results of AMS and REF. In this way, the data processing of the plant is included in the result evaluation. This procedure assumes that the uncertainty of REF is known and determined in due form. The uncertainty determination has been divided into two cases: varying and nearly constant concentration. The suggested procedure calculates the uncertainty of AMS at the 95 % confidence level by a tabulated t-value. A minimum of three data pairs is required. However, a higher amount of data pairs is desirable, since a low amount of data pairs results in a higher uncertainty of AMS. The uncertainty of AMS is valid only within the range of concentrations at which the tests were carried out. Statistical data processing shows that the uncertainty of the reference method has a significant effect on the uncertainty of AMS, which always becomes larger than the uncertainty of REF. This should be taken into account when testing whether AMS fulfils the given uncertainty limits. Practical details, concerning parallel measurements at the plant, and the costs of the measurement campaign, have been taken into account when suggesting alternative ways for implementing the comparative measurements. (orig.) 6 refs.
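
    A minimal sketch of the comparison calculation for the nearly constant concentration case (one plausible formulation combining the mean deviation and its t-based confidence band; the numbers are invented and the exact formula in the report may differ):

        import numpy as np
        from scipy import stats

        # Hypothetical paired results (mg/m3) from the automated measuring system (AMS)
        # and the reference method (REF), taken simultaneously at the stack.
        ams = np.array([51.2, 49.8, 50.6, 52.1, 50.0])
        ref = np.array([50.0, 49.5, 50.9, 51.0, 49.7])

        d = ams - ref                       # paired deviations of AMS against REF
        n = len(d)
        t95 = stats.t.ppf(0.975, df=n - 1)  # two-sided tabulated t-value, 95 % level
        u_ams = abs(d.mean()) + t95 * d.std(ddof=1) / np.sqrt(n)
        print(f"mean deviation = {d.mean():.2f} mg/m3, 95 % uncertainty estimate = {u_ams:.2f} mg/m3")

    As the report notes, more than the minimum of three data pairs is desirable, since a small number of pairs inflates both the tabulated t-value and the standard error.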

  15. Uncertainty of Blood Alcohol Concentration (BAC) Results as Related to Instrumental Conditions: Optimization and Robustness of BAC Analysis Headspace Parameters

    Directory of Open Access Journals (Sweden)

    Haleigh A. Boswell

    2015-12-01

    Full Text Available Analysis of blood alcohol concentration is a routine analysis performed in many forensic laboratories. This analysis commonly utilizes static headspace sampling, followed by gas chromatography combined with flame ionization detection (GC-FID). Studies have shown several “optimal” methods for instrumental operating conditions, which are intended to yield accurate and precise data. Given that different instruments, sampling methods, application specific columns and parameters are often utilized, it is much less common to find information on the robustness of these reported conditions. A major problem can arise when these “optimal” conditions may not also be robust, thus producing data with higher than desired uncertainty or potentially inaccurate results. The goal of this research was to incorporate the principles of quality by design (QBD) in the adjustment and determination of BAC (blood alcohol concentration) instrumental headspace parameters, thereby ensuring that minor instrumental variations, which occur as a matter of normal work, do not appreciably affect the final results of this analysis. This study discusses both the QBD principles as well as the results of the experiments, which allow for determination of more favorable instrumental headspace conditions. Additionally, method detection limits will also be reported in order to determine a reporting threshold and the degree of uncertainty at the common threshold value of 0.08 g/dL. Furthermore, the comparison of two internal standards, n-propanol and t-butanol, will be investigated. The study showed that an altered parameter of 85 °C headspace oven temperature and 15 psi headspace vial pressurization produces the lowest percent relative standard deviation of 1.3% when t-butanol is implemented as an internal standard, at least for one very common platform. The study also showed that an altered parameter of 100 °C headspace oven temperature and 15-psi headspace vial pressurization
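
    A small sketch of the precision figure quoted above (percent relative standard deviation of replicate response ratios against the internal standard; the peak-area ratios below are invented for illustration, not the study's data):

        import statistics

        # Hypothetical replicate peak-area ratios (ethanol area / internal-standard area)
        # from repeated headspace injections of the same blood-alcohol control.
        ratios = [1.052, 1.041, 1.060, 1.048, 1.055, 1.044]

        mean_ratio = statistics.mean(ratios)
        rsd_percent = 100 * statistics.stdev(ratios) / mean_ratio
        print(f"mean response ratio = {mean_ratio:.3f}, %RSD = {rsd_percent:.2f} %")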

  16. An Innovative Continuing Nursing Education Program Targeting Key Geriatric Conditions for Hospitalized Older People in China

    Science.gov (United States)

    Xiao, Lily Dongxia; Shen, Jun; Wu, Haifeng; Ding, Fu; He, Xizhen; Zhu, Yueping

    2013-01-01

    A lack of knowledge in registered nurses about geriatric conditions is one of the major factors that contribute to these conditions being overlooked in hospitalized older people. In China, an innovative geriatric continuing nursing education program aimed at developing registered nurses' understanding of the complex care needs of hospitalized…

  17. Robust and predictive fuzzy key performance indicators for condition-based treatment of squats in railway infrastructures

    NARCIS (Netherlands)

    Jamshidi, A.; Nunez Vicencio, Alfredo; Dollevoet, R.P.B.J.; Li, Z.

    2017-01-01

    This paper presents a condition-based treatment methodology for a type of rail surface defect called squat. The proposed methodology is based on a set of robust and predictive fuzzy key performance indicators. A fuzzy Takagi-Sugeno interval model is used to predict squat evolution for different

  18. 48th Annual meeting on nuclear technology (AMNT 2017). Key topic / Enhanced safety and operation excellence. Focus session: Uncertainty analyses in reactor core simulations

    Energy Technology Data Exchange (ETDEWEB)

    Zwermann, Winfried [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany). Forschungszentrum

    2017-12-15

    The supplementation of reactor simulations by uncertainty analyses is becoming increasingly important internationally due to the fact that the reliability of simulation calculations can be significantly increased by the quantification of uncertainties in comparison to the use of so-called conservative methods (BEPU- ''Best-Estimate plus Uncertainties''). While systematic uncertainty analyses for thermo-hydraulic calculations have been performed routinely for a long time, methods for taking into account uncertainties in nuclear data, which are the basis for neutron transport calculations, are under development. The Focus Session Uncertainty Analyses in Reactor Core Simulations was intended to provide an overview of international research and development with respect to supplementing reactor core simulations with uncertainty and sensitivity analyses, in research institutes as well as within the nuclear industry. The presented analyses not only focused on light water reactors, but also on advanced reactor systems. Particular emphasis was put on international benchmarks in the field. The session was chaired by Winfried Zwermann (Gesellschaft fuer Anlagen- und Reaktorsicherheit).

  19. [Assessment on the changing conditions of ecosystems in key ecological function zones in China].

    Science.gov (United States)

    Huang, Lin; Cao, Wei; Wu, Dan; Gong, Guo-li; Zhao, Guo-song

    2015-09-01

    In this paper, the dynamics of ecosystem macrostructure, quality and core services between 2000 and 2010 were analyzed for the key ecological function zones of China, which were classified into four types: water conservation, soil conservation, wind prevention and sand fixation, and biodiversity maintenance. In the water conservation ecological function zones, the areas of forest and grassland ecosystems decreased whereas water bodies and wetland increased over the 11 years, and the water conservation volume of the forest, grassland and wetland ecosystems increased by 2.9%; this region needs to reverse the decreasing trends of its forest and grassland ecosystems. In the soil conservation ecological function zones, the area of farmland ecosystem decreased and the areas of forest, grassland, water bodies and wetland ecosystems increased; the total amount of soil erosion was reduced by 28.2%, while the soil conservation amount of the ecosystems increased by 38.1%. In the wind prevention and sand fixation ecological function zones, the areas of grassland, water bodies and wetland ecosystems decreased, but forest and farmland ecosystems increased; the unit amount of soil wind erosion was reduced and the sand fixation amount of the ecosystems increased slightly. In this kind of region, located in arid and semiarid areas, ecological conservation needs to reduce the farmland area and give priority to protecting the original ecological system. In the biodiversity maintenance ecological function zones, the areas of grassland and desert ecosystems decreased and the other types increased; human disturbances showed a weak upward trend and need to be reduced. Assessment of the key ecological function zones should be aimed at the core services and the objects of protection, in order to evaluate quantitatively the effectiveness of ecosystem conservation and improvement.

  20. Estimating the Economic Attractiveness of Investment Projects in Conditions of Uncertainty and Risk with the Use of Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Kotsyuba Oleksiy S.

    2018-02-01

    Full Text Available The article is concerned with the methodology for the economic substantiation of real investments when information on the possible fluctuations of the initial parameters, and hence on the resulting risk, is largely lacking. Sensitivity analysis, as the main instrument for accounting for risk in this problem situation, is the focus of the presented research. On the basis of the apparatus of interval mathematics, a set of models for the comparative estimation of the economic attractiveness (efficiency) of alternative investment projects under conditions of uncertainty and risk is formulated using sensitivity analysis. The developed instrumentarium covers both mono- and poly-interval versions of the sensitivity analysis. The risk component in the constructed models is represented in some cases by the values of a specially developed sensitivity coefficient and in others by the worst values obtained from the interval estimates of the partial efficiency criteria. The sensitivity coefficient, according to the approach proposed in the publication, is the ratio of the target semi-range of variation to the increase (economy) in efficiency that is obtained when the basic level of the analyzed partial criterion of economic attractiveness is reached in comparison with some threshold (limit) value of it.
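
    One possible reading of the sensitivity coefficient defined above, written out as a formula (a hedged interpretation of the verbal definition, not an equation quoted from the article): for a partial efficiency criterion x given as an interval [x_low, x_high] with basic level x_base and threshold value x_lim,

        \[
        k_{\mathrm{sens}} = \frac{(x_{\mathrm{high}} - x_{\mathrm{low}})/2}{\,x_{\mathrm{base}} - x_{\mathrm{lim}}\,},
        \]

    so that larger values indicate a project whose margin over its limit value is small relative to the assumed fluctuations, i.e. a riskier alternative.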

  1. The Personality Trait of Intolerance to Uncertainty Affects Behavior in a Novel Computer-Based Conditioned Place Preference Task

    Directory of Open Access Journals (Sweden)

    Milen Radell

    2016-08-01

    Full Text Available Recent work has found that personality factors that confer vulnerability to addiction can also affect learning and economic decision making. One personality trait which has been implicated in vulnerability to addiction is intolerance to uncertainty (IU), i.e. a preference for familiar over unknown (possibly better) options. In animals, the motivation to obtain drugs is often assessed through conditioned place preference (CPP), which compares preference for contexts where drug reward was previously received. It is an open question whether participants with high IU also show heightened preference for previously-rewarded contexts. To address this question, we developed a novel computer-based CPP task for humans in which participants guide an avatar through a paradigm in which one room contains frequent reward and one contains less frequent reward. Following exposure to both contexts, subjects are assessed for preference to enter the previously-rich and previously-poor room. Individuals with low IU showed little bias to enter the previously-rich room first, and instead entered both rooms at about the same rate. By contrast, those with high IU showed a strong bias to enter the previously-rich room first. This suggests an increased tendency to chase reward in the intolerant group, consistent with previously observed behavior in opioid-addicted individuals. Thus, high IU may represent a pre-existing cognitive bias that provides a mechanism to promote decision-making processes that increase vulnerability to addiction.

  2. Phosphatidylinositol 3-kinase is a key mediator of central sensitization in painful inflammatory conditions

    Science.gov (United States)

    Pezet, Sophie; Marchand, Fabien; D'Mello, Richard; Grist, John; Clark, Anna K.; Malcangio, Marzia; Dickenson, Anthony H.; Williams, Robert J.; McMahon, Stephen B.

    2010-01-01

    Here we show that phosphatidylinositol 3-kinase (PI3K) is a key player in the establishment of central sensitization, the spinal cord phenomenon associated with persistent afferent inputs and contributing to chronic pain states. We demonstrated electrophysiologically that PI3K is required for the full expression of spinal neuronal wind-up. In an inflammatory pain model, intrathecal administration of LY294002, a potent PI3K inhibitor, dose-dependently inhibited pain-related behavior. This effect was correlated with a reduction of the phosphorylation of extracellular signal-regulated kinase (ERK) and CaMKinase II. In addition, we observed a significant decrease in the phosphorylation of the NMDA receptor subunit NR2B, decreased translocation to the plasma membrane of the GluR1 AMPA receptor subunit in the spinal cord and a reduction of evoked neuronal activity as measured using c-Fos immunohistochemistry. Our study suggests that PI3K is a major factor in the expression of central sensitization after noxious inflammatory stimuli. PMID:18417706

  3. Reducing conditions are the key for efficient production of active ribonuclease inhibitor in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Neubauer Peter

    2011-05-01

    Full Text Available Abstract Background The eukaryotic RNase ribonuclease/angiogenin inhibitors (RI) are a protein group distinguished by a unique structure - they are composed of hydrophobic leucine-rich repeat motifs (LRR) and contain a high number of reduced cysteine residues. The members of this group are difficult to produce in E. coli and other recombinant hosts due to their high aggregation tendency. Results In this work dithiothreitol (DTT) was successfully applied for improving the yield of correctly folded ribonuclease/angiogenin inhibitor in the E. coli K12 periplasmic and cytoplasmic compartments. The feasibility of the in vivo folding concepts for cytoplasmic and periplasmic production was demonstrated in batch and fed-batch cultivation modes in shake flasks and at the bioreactor scale. Firstly, the best conditions for secretion of RI into the periplasmic space were evaluated using a high-throughput multifactorial screening approach with a vector library, directly in the Enbase fed-batch production mode in 96-well plates. Secondly, the effect of the redox environment was evaluated in isogenic dsbA+ and dsbA- strains under various cultivation conditions with reducing agents in the cultivation medium. Despite the fusion to the signal peptide, the highest activities were found in the cytoplasmic fraction. Thus, by removing the signal peptide, the positive effect of the reducing agent DTT was clearly proven also for the cytoplasmic compartment. Finally, optimal periplasmic and cytoplasmic RI fed-batch production processes involving externally added DTT were developed in shake flasks and scaled up to the bioreactor scale. Conclusions DTT greatly improved both periplasmic and cytoplasmic accumulation and activity of RI at a low synthesis rate, i.e., in constructs harbouring genetic elements that stipulate a weak recombinant synthesis rate, together with cultivation at low temperature. In a stirred bioreactor environment RI folding was strongly improved by repeated pulse addition

  4. Increased fitness of a key appendicularian zooplankton species under warmer, acidified seawater conditions.

    Directory of Open Access Journals (Sweden)

    Jean-Marie Bouquet

    Full Text Available Ocean warming and acidification (OA) may alter the fitness of species in marine pelagic ecosystems through community effects or direct physiological impacts. We used the zooplanktonic appendicularian, Oikopleura dioica, to assess temperature and pH effects at mesocosm and microcosm scales. In mesocosms, both OA and warming positively impacted O. dioica abundance over successive generations. In microcosms, the positive impact of OA was observed to result from increased fecundity. In contrast, increased pH, observed for example during phytoplankton blooms, reduced fecundity. Oocyte fertility and juvenile development were equivalent under all pH conditions, indicating that the positive effect of lower pH on O. dioica abundance was principally due to increased egg number. This effect was influenced by food quantity and quality, supporting possibly improved digestion and assimilation at lowered pH. Higher temperature resulted in more rapid growth, faster maturation and earlier reproduction. Thus, increased temperature and reduced pH had significant positive impacts on O. dioica fitness through increased fecundity and shortened generation time, suggesting that predicted future ocean conditions may favour this zooplankton species.

  5. The Key Role of Government in Addressing the Pandemic of Micronutrient Deficiency Conditions in Southeast Asia

    Directory of Open Access Journals (Sweden)

    Theodore H. Tulchinsky

    2015-04-01

    Full Text Available Micronutrient deficiency conditions are a major global public health problem. While the private sector has an important role in addressing this problem, the main responsibility lies with national governments, in cooperation with international agencies and donors. Mandatory fortification of basic foods provides a basic necessary intake for the majority and needs to be supported by provision of essential vitamin and mineral supplements for mothers and children and other high risk groups. Fortification by government mandate and regulation is essential with cooperation by private sector food manufacturers, and in the context of broader policies for poverty reduction, education and agricultural reform. Iron, iodine, vitamin A, vitamin B complex, folic acid, zinc, vitamin D and vitamin B12 are prime examples of international fortification experience achieved by proactive governmental nutrition policies. These are essential to achieve the Millennium Development Goals and their follow-up sustainable global health targets. National governmental policies for nutritional security and initiatives are essential to implement both food fortification and targeted supplementation policies to reduce the huge burden of micronutrient deficiency conditions in Southeast Asia and other parts of the world.

  6. Development and Application of the Key Technologies for the Quality Control and Inspection of National Geographical Conditions Survey Products

    Science.gov (United States)

    Zhao, Y.; Zhang, L.; Ma, W.; Zhang, P.; Zhao, T.

    2018-04-01

    The First National Geographical Conditions Survey is a foundational task undertaken to dynamically master the basic situation of nature, ecology and human activities on the earth's surface, and it is a brand-new mapping and geographic information engineering project. In order to ensure comprehensive, true and accurate survey results and to achieve the quality management targets of a 100 % qualified rate and a yield of more than 80 %, it is necessary to carry out quality control and result inspection for the national geographical conditions survey on a national scale. To ensure that the quality of the results meets these targets, this paper develops a "five-in-one" key quality control technology, constituted by the quality control system of the national geographical conditions survey, the quality inspection technology system, the quality evaluation system, the quality inspection information management system and the nationally linked quality control institutions. The method is designed for the large scale, wide coverage, many undertaking units, many management levels, ongoing technical updating, many production processes and obvious regional differences of the national geographical conditions survey, combined with the novel forms of the results, their complicated dependencies, the many special reference data sets and the large data volume. Fully considering related domestic and foreign research results and production practice experience, and combining technological development with the needs of production, the project stipulates the inspection methods and technical requirements for each stage of the quality inspection of the geographical conditions survey results, extends the traditional inspection and acceptance technology, and solves the key technical problems urgently facing the first national geographical conditions survey.

  7. Effective regulation under conditions of scientific uncertainty: how collaborative networks contribute to occupational health and safety regulation for nanomaterials

    NARCIS (Netherlands)

    Reichow, Aline

    2015-01-01

    This thesis seeks to understand and evaluate the contribution of business associations within the United States (US) and German chemical sectors to the effective regulation of nanomaterials. In the effective regulation of new technologies characterized by high scientific uncertainty, with

  8. Field measurements of key parameters associated with nocturnal OBT formation in vegetables grown under Canadian conditions

    International Nuclear Information System (INIS)

    Kim, S.B.; Workman, W.G.; Korolevych, V.; Davis, P.A.

    2012-01-01

    The objective of this study was to provide the parameter values required to model OBT formation in the edible parts of plants following a hypothetical accidental tritium release to the atmosphere at night. The parameters considered were leaf area index, stomatal resistance, photosynthesis rate, the photosynthetic production rate of starch, the nocturnal hydrolysis rate of starch, the fraction of starch produced daily by photosynthesis that appears in the fruits, and the mass of the fruit. Values of these parameters were obtained in the summer of 2002 for lettuce, radishes and tomatoes grown under typical Canadian environmental conditions. Based on the maximum observed photosynthetic rate and growth rate, the fraction of starch translocated to the fruit was calculated to be 17% for tomato fruit and 14% for radish root. - Highlights: ► Plant physiological parameters affecting nocturnal OBT formation have been investigated. ► The fraction of starch produced daily by photosynthesis in the leaves that appears in the fruit was calculated. ► Realistic estimates of OBT concentrations following a nighttime accidental HTO release to the atmosphere.

  9. Plant cell death and cellular alterations induced by ozone: Key studies in Mediterranean conditions

    International Nuclear Information System (INIS)

    Faoro, Franco; Iriti, Marcello

    2009-01-01

    An account of histo-cytological and ultrastructural studies on the effects of ozone on crop and forest species in Italy is given, with emphasis on induced cell death and the underlying mechanisms. Cell death phenomena possibly due to ambient O3 were recorded in crop and forest species. In contrast, visible O3 effects on Mediterranean vegetation are often unclear. Microscopy is thus suggested as an effective tool to validate and evaluate O3 injury to Mediterranean vegetation. DAB-Evans blue staining was proposed to validate O3 symptoms at the microscopic level and for a pre-visual diagnosis of O3 injury. The method has been positively tested in some of the most important crop species, such as wheat, tomato, bean and onion and, with some restrictions, in forest species, and it also provides useful insight into the mechanisms underlying O3 sensitivity or tolerance. - Ozone-induced cell death is a frequent phenomenon under Mediterranean conditions, not only in the most sensitive crops but also in forest species.

  10. Effects of Different Drying Conditions on Key Quality Parameters of Pink Peppercorns (Schinus terebinthifolius Raddi

    Directory of Open Access Journals (Sweden)

    Bruno Guzzo Silva

    2017-01-01

    Full Text Available Pink peppercorns are among the most sophisticated condiments in the international cuisine. This culinary spice is obtained from dried fruits of Schinus terebinthifolius Raddi, a species native to South America. In this work, a methodology for the assessment of pink peppercorn quality under various drying conditions was defined. Experiments were performed in a pilot tray dryer, which ensured integrity of the product. A central composite rotatable design with 11 experiments was devised to study the influence of drying air temperature (35–75°C) and air velocity (0.3–0.9 m/s) on product quality, assessed by moisture content, color (CIELAB system), and volatile compounds. The essential oils of fresh and dried fruits were extracted by hydrodistillation and analyzed by gas chromatography coupled to mass spectrometry. Air temperature had the greatest influence on the quality parameters under study, while air velocity had no statistically significant effect. Considering all quality criteria, temperatures between 40 and 55°C provided the best compromise, yielding an adequate moisture content in the dried product without dramatic degradation of color and essential oil.

  11. Uncertainties in the Antarctic Ice Sheet Contribution to Sea Level Rise: Exploration of Model Response to Errors in Climate Forcing, Boundary Conditions, and Internal Parameters

    Science.gov (United States)

    Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.

    2017-12-01

    The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusion about which parameters and/or forcing have the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites Glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
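
    The sampling-and-spread workflow behind this kind of ensemble uncertainty quantification can be sketched in a few lines. The two-parameter surrogate below stands in for the ice-sheet model and is purely illustrative; only the general pattern (Latin hypercube sampling within stated bounds, then inspection of the spread of a diagnostic) reflects the approach described above.

```python
# Minimal sketch of sampling uncertain inputs within bounds and inspecting the
# spread of a diagnostic; the surrogate "model" is invented for illustration.
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=2, seed=42)
unit_samples = sampler.random(n=200)

# Hypothetical bounds: basal melt scaling [0.5, 2.0], bedrock error [-50, 50] m.
lo, hi = [0.5, -50.0], [2.0, 50.0]
samples = qmc.scale(unit_samples, lo, hi)

def toy_sea_level_contribution(melt_scale, bed_error_m):
    """Stand-in surrogate for a 100-yr sea-level contribution (mm)."""
    return 120.0 * melt_scale + 0.4 * bed_error_m

slc = toy_sea_level_contribution(samples[:, 0], samples[:, 1])
print(f"mean = {slc.mean():.1f} mm, 5-95% spread = "
      f"[{np.percentile(slc, 5):.1f}, {np.percentile(slc, 95):.1f}] mm")
```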

  12. Evaluation of uncertainties of key neutron parameters of PWR-type reactors with slab fuel, application to neutronic conformity; Determination des incertitudes liees aux grandeurs neutroniques d'interet des reacteurs a eau pressurisee a plaques combustibles et application aux etudes de conformite

    Energy Technology Data Exchange (ETDEWEB)

    Bernard, D

    2001-12-01

    The aim of this thesis was to evaluate the uncertainties of key neutron parameters of slab-fuel reactors. The uncertainty sources have several origins: technological, for the fabrication parameters, and physical, for the nuclear data. First, each uncertainty contribution is calculated and, finally, an uncertainty factor is associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation and lifetime. These uncertainty factors were computed by Generalized Perturbation Theory for the beginning-of-life (step 0) case and by direct calculations for irradiation problems. One application to neutronic conformity concerned the adjustment of precision targets for fabrication parameters and nuclear data. Statistical (uncertainty) and deterministic (deviation) approaches were studied. The uncertainties of the key slab neutron parameters were thereby reduced and the nuclear performance optimized. (author)
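
    A minimal sketch of the kind of first-order propagation involved is given below: input uncertainties are combined through sensitivity coefficients with the standard sandwich rule. The sensitivities and covariances are invented for illustration and are not results from the thesis.

```python
# Minimal sketch of first-order uncertainty propagation ("sandwich rule")
# for an integral parameter such as reactivity; values are illustrative.
import numpy as np

# Relative sensitivities of the integral parameter to three inputs
# (e.g., one fabrication tolerance and two nuclear-data values).
S = np.array([0.8, -0.3, 0.5])

# Covariance matrix of the relative input uncertainties (1-sigma, relative).
V = np.diag([0.01, 0.02, 0.015]) ** 2

variance = S @ V @ S          # sandwich rule: S^T V S
print(f"combined relative uncertainty = {np.sqrt(variance) * 100:.2f} %")
```

    Correlated inputs would simply populate the off-diagonal terms of the covariance matrix.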

  14. DEVELOPMENT AND APPLICATION OF THE KEY TECHNOLOGIES FOR THE QUALITY CONTROL AND INSPECTION OF NATIONAL GEOGRAPHICAL CONDITIONS SURVEY PRODUCTS

    Directory of Open Access Journals (Sweden)

    Y. Zhao

    2018-04-01

    Full Text Available The First National Geographical Conditions Survey is a foundational task undertaken to dynamically master the basic situation of nature, ecology and human activities on the earth's surface, and it is a brand-new mapping and geographic information engineering project. In order to ensure comprehensive, true and accurate survey results and to achieve the quality management targets of a 100 % qualified rate and a yield of more than 80 %, it is necessary to carry out quality control and result inspection for the national geographical conditions survey on a national scale. To ensure that the quality of the results meets these targets, this paper develops a "five-in-one" key quality control technology, constituted by the quality control system of the national geographical conditions survey, the quality inspection technology system, the quality evaluation system, the quality inspection information management system and the nationally linked quality control institutions. The method is designed for the large scale, wide coverage, many undertaking units, many management levels, ongoing technical updating, many production processes and obvious regional differences of the national geographical conditions survey, combined with the novel forms of the results, their complicated dependencies, the many special reference data sets and the large data volume. Fully considering related domestic and foreign research results and production practice experience, and combining technological development with the needs of production, the project stipulates the inspection methods and technical requirements for each stage of the quality inspection of the geographical conditions survey results, extends the traditional inspection and acceptance technology, and solves the key technical problems urgently facing the first national geographical conditions survey.

  15. Hydrogeological boundary settings in SR 97. Uncertainties in regional boundary settings and transfer of boundary conditions to site-scale models

    International Nuclear Information System (INIS)

    Follin, S.

    1999-06-01

    The SR 97 project presents a performance assessment (PA) of the overall safety of a hypothetical deep repository at three sites in Sweden arbitrarily named Aberg, Beberg and Ceberg. One component of this PA assesses the uncertainties in the hydrogeological modelling. This study focuses on uncertainties in boundary settings (size of model domain and boundary conditions) in the regional and site-scale hydrogeological modelling of the three sites used to simulate the possible transport of radionuclides from the emplacement waste packages through the host rock to the accessible environment. Model uncertainties associated with, for instance, parameter heterogeneity and structural interpretations are addressed in other studies. This study concludes that the regional modelling of the SR 97 project addresses uncertainties in the choice of boundary conditions and size of model domain differently at each site, although the overall handling is acceptable and in accordance with common modelling practice. For example, the treatment of uncertainties with regard to the ongoing post-glacial flushing of the Baltic Shield is creditably addressed, although not exhaustively, from a modelling point of view. A significant contribution of the performed modelling is the study of nested numerical models, i.e., the numerical interplay between regional and site-scale numerical models. In the site-scale modelling, great efforts are made to address problems associated with (i) the telescopic mesh refinement (TMR) technique with regard to the stochastic continuum approach, and (ii) the transfer of boundary conditions between variable-density flow systems and flow systems that are constrained to treat uniform density flow. This study concludes that the efforts made to handle these problems are acceptable with regard to the objectives of the SR 97 project.

  16. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    Full Text Available A growing body of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; and (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  17. An interdisciplinary approach to volcanic risk reduction under conditions of uncertainty: a case study of Tristan da Cunha

    Science.gov (United States)

    Hicks, A.; Barclay, J.; Simmons, P.; Loughlin, S.

    2014-07-01

    The uncertainty brought about by intermittent volcanic activity is fairly common at volcanoes worldwide. While better knowledge of any one volcano's behavioural characteristics has the potential to reduce this uncertainty, the subsequent reduction of risk from volcanic threats is only realised if that knowledge is pertinent to stakeholders and effectively communicated to inform good decision making. Success requires integration of methods, skills and expertise across disciplinary boundaries. This research project develops and trials a novel interdisciplinary approach to volcanic risk reduction on the remote volcanic island of Tristan da Cunha (South Atlantic). For the first time, volcanological techniques, probabilistic decision support and social scientific methods were integrated in a single study. New data were produced that (1) established no spatio-temporal pattern to recent volcanic activity; (2) quantified the high degree of scientific uncertainty around future eruptive scenarios; (3) analysed the physical vulnerability of the community as a consequence of their geographical isolation and exposure to volcanic hazards; (4) evaluated social and cultural influences on vulnerability and resilience; and (5) evaluated the effectiveness of a scenario planning approach, both as a method for integrating the different strands of the research and as a way of enabling on-island decision makers to take ownership of risk identification and management, and capacity building within their community. The paper provides empirical evidence of the value of an innovative interdisciplinary framework for reducing volcanic risk. It also provides evidence for the strength that comes from integrating social and physical sciences with the development of effective, tailored engagement and communication strategies in volcanic risk reduction.

  18. An interdisciplinary approach to volcanic risk reduction under conditions of uncertainty: a case study of Tristan da Cunha

    Science.gov (United States)

    Hicks, A.; Barclay, J.; Simmons, P.; Loughlin, S.

    2013-12-01

    This research project adopted an interdisciplinary approach to volcanic risk reduction on the remote volcanic island of Tristan da Cunha (South Atlantic). New data were produced that: (1) established no spatio-temporal pattern to recent volcanic activity; (2) quantified the high degree of scientific uncertainty around future eruptive scenarios; (3) analysed the physical vulnerability of the community as a consequence of their geographical isolation and exposure to volcanic hazards; (4) evaluated social and cultural influences on vulnerability and resilience. Despite their isolation and prolonged periods of hardship, islanders have demonstrated an ability to cope with and recover from adverse events. This resilience is likely a function of remoteness, strong kinship ties, bonding social capital, and persistence of shared values and principles established at community inception. While there is good knowledge of the styles of volcanic activity on Tristan, given the high degree of scientific uncertainty about the timing, size and location of future volcanism, a qualitative scenario planning approach was used as a vehicle to convey this information to the islanders. This deliberative, anticipatory method allowed on-island decision makers to take ownership of risk identification, management and capacity building within their community. This paper demonstrates the value of integrating social and physical sciences with development of effective, tailored communication strategies in volcanic risk reduction.

  19. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  20. Elements of a pragmatic approach for dealing with bias and uncertainty in experiments through predictions: experiment design and data conditioning; "real space" model validation and conditioning; hierarchical modeling and extrapolative prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose

    2011-11-01

    This report explores some important considerations in devising a practical and consistent framework and methodology for utilizing experiments and experimental data to support modeling and prediction. A pragmatic and versatile 'Real Space' approach is outlined for confronting experimental and modeling bias and uncertainty to mitigate risk in modeling and prediction. The elements of experiment design and data analysis, data conditioning, model conditioning, model validation, hierarchical modeling, and extrapolative prediction under uncertainty are examined. An appreciation can be gained for the constraints and difficulties at play in devising a viable end-to-end methodology. Rationale is given for the various choices underlying the Real Space end-to-end approach. The approach adopts and refines some elements and constructs from the literature and adds pivotal new elements and constructs. Crucially, the approach reflects a pragmatism and versatility derived from working many industrial-scale problems involving complex physics and constitutive models, steady-state and time-varying nonlinear behavior and boundary conditions, and various types of uncertainty in experiments and models. The framework benefits from a broad exposure to integrated experimental and modeling activities in the areas of heat transfer, solid and structural mechanics, irradiated electronics, and combustion in fluids and solids.

  1. Effect of culturing conditions on the expression of key enzymes in the proteolytic system of Lactobacillus bulgaricus *

    Science.gov (United States)

    Hou, Jun-cai; Liu, Fei; Ren, Da-xi; Han, Wei-wei; Du, Yue-ou

    2015-01-01

    The proteolytic system of Lactobacillus bulgaricus breaks down milk proteins into peptides and amino acids, which are essential for the growth of the bacteria. The aim of this study was to determine the expressions of seven key genes in the proteolytic system under different culturing conditions (different phases, initial pH values, temperatures, and nitrogen sources) using real-time polymerase chain reaction (RT-PCR). The transcriptions of the seven genes were reduced by 30-fold on average in the stationary phase compared with the exponential growth phase. The transcriptions of the seven genes were reduced by 62.5-, 15.0-, and 59.0-fold in the strains KLDS 08006, KLDS 08007, and KLDS 08012, respectively, indicating that the expressions of the seven genes were significantly different among strains. In addition, the expressions of the seven genes were repressed in the MRS medium containing casein peptone. The effect of peptone supply on PepX transcription was the weakest compared with the other six genes, and the impact on OppD transcription was the strongest. Moreover, the expressions of the seven genes were significantly different among different strains. These results provide insight into the expression of the proteolytic system of Lactobacillus bulgaricus at the transcription level. PMID:25845365

  2. ngVLA Key Science Goal 2: Probing the Initial Conditions for Planetary Systems and Life with Astrochemistry

    Science.gov (United States)

    McGuire, Brett; ngVLA Science Working Group 1

    2018-01-01

    One of the most challenging aspects in understanding the origin and evolution of planets and planetary systems is tracing the influence of chemistry on the physical evolution of a system from a molecular cloud to a solar system. Existing facilities have already shown the stunning degree of molecular complexity present in these systems. The unique combination of sensitivity and spatial resolution offered by the ngVLA will permit the observation of both highly complex and very low-abundance chemical species that are exquisitely sensitive to the physical conditions and evolutionary history of their sources, which are out of reach of current observatories. In turn, by understanding the chemical evolution of these complex molecules, unprecedentedly detailed astrophysical insight can be gleaned from these astrochemical observations. This poster will give an overview of a number of key science goals in astrochemistry which will be enabled by the ngVLA, including: (1) imaging of the deepest, densest regions in protoplanetary disks and unveiling the physical history through isotopic ratios; (2) probing the ammonia snow line in these disks, thought to be the only viable tracer of the water snowline; (3) observations of the molecular content of giant planet atmospheres; (4) detections of new, complex molecules, potentially including the simplest amino acids and sugars; and (5) tracing the origin of chiral excess in star-forming regions.

  3. Influence of postharvest processing and storage conditions on key antioxidants in pūhā (Sonchus oleraceus L.).

    Science.gov (United States)

    Ou, Zong-Quan; Schmierer, David M; Strachan, Clare J; Rades, Thomas; McDowell, Arlene

    2014-07-01

    To investigate effects of different postharvest drying processes and storage conditions on key antioxidants in Sonchus oleraceus L. leaves. Fresh leaves were oven-dried (60°C), freeze-dried or air-dried (∼25°C) for 6 h, 24 h and 3 days, respectively. Design of experiments (DOE) was applied to study the stability of antioxidants (caftaric, chlorogenic and chicoric acids) in S. oleraceus leaves and leaf extracts stored at different temperatures (4, 25 and 50°C) and relative humidities (15%, 43% and 75%) for 180 days. The concentration of antioxidants was quantified by a HPLC-2,2'-diphenylpicrylhydrazyl post-column derivatisation method. Antioxidant activity was assessed by a cellular antioxidant activity assay. The three antioxidants degraded to unquantifiable levels after oven-drying. More than 90% of the antioxidants were retained by freeze-drying and air-drying. Both leaf and extract samples retained >90% of antioxidants, except those stored at 75% relative humidity. Leaf material had higher antioxidant concentrations and greater cellular antioxidant activity than corresponding extract samples. Freeze-drying and air-drying preserved more antioxidants in S. oleraceus than oven-drying. From DOE analysis, humidity plays an important role in degradation of antioxidants during storage. To preserve antioxidant activity, it is preferable to store S. oleraceus as dried leaf material. © 2014 Royal Pharmaceutical Society.
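
    The design-of-experiments layout used for the storage study can be sketched as follows. The factor levels follow the abstract, while the retention values and the simple linear model are illustrative assumptions only.

```python
# Minimal sketch of a storage-condition factorial design and a linear fit to
# antioxidant retention (retention values are invented for illustration).
import itertools
import numpy as np

temps = [4.0, 25.0, 50.0]          # deg C
humidities = [15.0, 43.0, 75.0]    # % relative humidity
design = np.array(list(itertools.product(temps, humidities)))

# Hypothetical % retention of chicoric acid after 180 days for each run.
retention = np.array([98, 95, 80, 96, 92, 70, 93, 85, 55], dtype=float)

# Fit retention ~ b0 + b1*T + b2*RH by ordinary least squares.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, retention, rcond=None)
print(f"intercept={coef[0]:.1f}, per-degC effect={coef[1]:.2f}, "
      f"per-%RH effect={coef[2]:.2f}")
```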

  4. A Narrow-Linewidth Atomic Line Filter for Free Space Quantum Key Distribution under Daytime Atmospheric Conditions

    Science.gov (United States)

    Brown, Justin; Woolf, David; Hensley, Joel

    2016-05-01

    Quantum key distribution can provide secure optical data links using the established BB84 protocol, though solar backgrounds severely limit the performance through free space. Several approaches to reduce the solar background include time-gating the photon signal, limiting the field of view through geometrical design of the optical system, and spectral rejection using interference filters. Despite optimization of these parameters, the solar background continues to dominate under daytime atmospheric conditions. We demonstrate an improved spectral filter by replacing the interference filter (Δν ~ 50 GHz) with an atomic line filter (Δν ~ 1 GHz) based on optical rotation of linearly polarized light through a warm Rb vapor. By controlling the magnetic field and the optical depth of the vapor, a spectrally narrow region can be transmitted between crossed polarizers. We find that the transmission is more complex than a single peak and evaluate peak transmission as well as a ratio of peak transmission to average transmission of the local spectrum. We compare filters containing a natural abundance of Rb with those containing isotopically pure 87Rb and 85Rb. A filter providing > 95 % transmission and Δν ~ 1.1 GHz is achieved.
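
    The figure of merit mentioned above, peak transmission and its ratio to the average transmission of the local spectrum, can be computed from a transmission curve as in the sketch below; the Gaussian-plus-pedestal curve is a stand-in for a measured atomic-line-filter spectrum, not data from this work.

```python
# Minimal sketch: peak transmission and peak-to-average ratio of a filter
# spectrum (the transmission curve is an illustrative stand-in).
import numpy as np

freq_ghz = np.linspace(-10.0, 10.0, 2001)                         # detuning from line center
transmission = 0.95 * np.exp(-freq_ghz**2 / (2 * 0.47**2)) + 0.01  # ~1.1 GHz FWHM peak

peak = transmission.max()
local_avg = transmission.mean()            # average over the local spectral window
print(f"peak transmission = {peak:.2f}, peak/average ratio = {peak / local_avg:.1f}")
```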

  5. Effect of culturing conditions on the expression of key enzymes in the proteolytic system of Lactobacillus bulgaricus.

    Science.gov (United States)

    Hou, Jun-cai; Liu, Fei; Ren, Da-xi; Han, Wei-wei; Du, Yue-ou

    2015-04-01

    The proteolytic system of Lactobacillus bulgaricus breaks down milk proteins into peptides and amino acids, which are essential for the growth of the bacteria. The aim of this study was to determine the expressions of seven key genes in the proteolytic system under different culturing conditions (different phases, initial pH values, temperatures, and nitrogen sources) using real-time polymerase chain reaction (RT-PCR). The transcriptions of the seven genes were reduced by 30-fold on average in the stationary phase compared with the exponential growth phase. The transcriptions of the seven genes were reduced by 62.5-, 15.0-, and 59.0-fold in the strains KLDS 08006, KLDS 08007, and KLDS 08012, respectively, indicating that the expressions of the seven genes were significantly different among strains. In addition, the expressions of the seven genes were repressed in the MRS medium containing casein peptone. The effect of peptone supply on PepX transcription was the weakest compared with the other six genes, and the impact on OppD transcription was the strongest. Moreover, the expressions of the seven genes were significantly different among different strains. These results provide insight into the expression of proteolytic system genes in Lactobacillus bulgaricus at the transcription level.

  6. Characterization of XR-RV3 GafChromic® films in standard laboratory and in clinical conditions and means to evaluate uncertainties and reduce errors

    Energy Technology Data Exchange (ETDEWEB)

    Farah, J., E-mail: jad.farah@irsn.fr; Clairand, I.; Huet, C. [External Dosimetry Department, Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP-17, 92260 Fontenay-aux-Roses (France); Trianni, A. [Medical Physics Department, Udine University Hospital S. Maria della Misericordia (AOUD), p.le S. Maria della Misericordia, 15, 33100 Udine (Italy); Ciraj-Bjelac, O. [Vinca Institute of Nuclear Sciences (VINCA), P.O. Box 522, 11001 Belgrade (Serbia); De Angelis, C. [Department of Technology and Health, Istituto Superiore di Sanità (ISS), Viale Regina Elena 299, 00161 Rome (Italy); Delle Canne, S. [Fatebenefratelli San Giovanni Calibita Hospital (FBF), UOC Medical Physics - Isola Tiberina, 00186 Rome (Italy); Hadid, L.; Waryn, M. J. [Radiology Department, Hôpital Jean Verdier (HJV), Avenue du 14 Juillet, 93140 Bondy Cedex (France); Jarvinen, H.; Siiskonen, T. [Radiation and Nuclear Safety Authority (STUK), P.O. Box 14, 00881 Helsinki (Finland); Negri, A. [Veneto Institute of Oncology (IOV), Via Gattamelata 64, 35124 Padova (Italy); Novák, L. [National Radiation Protection Institute (NRPI), Bartoškova 28, 140 00 Prague 4 (Czech Republic); Pinto, M. [Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (ENEA-INMRI), C.R. Casaccia, Via Anguillarese 301, I-00123 Santa Maria di Galeria (RM) (Italy); Knežević, Ž. [Ruđer Bošković Institute (RBI), Bijenička c. 54, 10000 Zagreb (Croatia)

    2015-07-15

    Purpose: To investigate the optimal use of XR-RV3 GafChromic® films to assess patient skin dose in interventional radiology while addressing the means to reduce uncertainties in dose assessment. Methods: XR-Type R GafChromic films have been shown to represent the most efficient and suitable solution to determine patient skin dose in interventional procedures. As film dosimetry can be associated with high uncertainty, this paper presents the EURADOS WG 12 initiative to carry out a comprehensive study of film characteristics with a multisite approach. The considered sources of uncertainties include scanner, film, and fitting-related errors. The work focused on studying film behavior with clinical high-dose-rate pulsed beams (previously unavailable in the literature) together with reference standard laboratory beams. Results: First, the performance analysis of six different scanner models has shown that scan uniformity perpendicular to the lamp motion axis and long-term stability are the main sources of scanner-related uncertainties. These could induce errors of up to 7% on the film readings unless regularly checked and corrected. Typically, scan uniformity correction matrices and reading normalization to the scanner-specific and daily background reading should be applied. In addition, the analysis on multiple film batches has shown that XR-RV3 films generally have good uniformity within one batch (<1.5%), require 24 h to stabilize after irradiation, and that their response is roughly independent of dose rate (<5%). However, XR-RV3 films showed large variations (up to 15%) with radiation quality both in standard laboratory and in clinical conditions. As such, and prior to conducting patient skin dose measurements, it is mandatory to choose the appropriate calibration beam quality depending on the characteristics of the x-ray systems that will be used clinically. In addition, yellow side film irradiations should be preferentially used since they showed a lower
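
    Two of the corrections recommended above, division by a scan-uniformity correction matrix and normalization to the scanner's daily background reading, can be sketched as follows; the array shapes and numerical values are illustrative and not part of the published protocol.

```python
# Minimal sketch of scan-uniformity correction and daily background
# normalization for a scanned film image (illustrative values only).
import numpy as np

scan = np.random.default_rng(0).normal(30000.0, 300.0, size=(50, 50))   # raw film scan
flat_field = np.linspace(0.96, 1.04, 50)[None, :].repeat(50, axis=0)     # lamp non-uniformity map
background_today, background_reference = 64500.0, 65000.0                # unexposed readings

corrected = scan / flat_field                          # uniformity correction
corrected *= background_reference / background_today   # daily background normalization
print(f"mean corrected pixel value = {corrected.mean():.0f}")
```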

  7. "What time is my next meal?" delay-discounting individuals choose smaller portions under conditions of uncertainty.

    Science.gov (United States)

    Zimmerman, Annie R; Ferriday, Danielle; Davies, Sarah R; Martin, Ashley A; Rogers, Peter J; Mason, Alice; Brunstrom, Jeffrey M

    2017-09-01

    'Dietary' delay discounting is typically framed as a trade-off between immediate rewards and long-term health concerns. Our contention is that prospective thinking also occurs over shorter periods, and is engaged to select portion sizes based on the interval between meals (inter-meal interval; IMI). We sought to assess the extent to which the length of an IMI influences portion-size selection. We predicted that delay discounters would show 'IMI insensitivity' (relative lack of concern about hunger or fullness between meals). In particular, we were interested in participants' sensitivity to an uncertain IMI. We hypothesized that when meal times were uncertain, delay discounters would be less responsive and select smaller portion sizes. Participants (N = 90) selected portion sizes for lunch. In different trials, they were told to expect dinner at 5pm, 9pm, and either 5pm or 9pm (uncertain IMI). Individual differences in future-orientation were measured using a monetary delay-discounting task. Participants chose larger portions when the IMI was longer, and there was a relationship between BMI and smaller portion selection under uncertainty (p < 0.05). This is the first study to report an association between delay discounting and IMI insensitivity. We reason that delay discounters selected smaller portions because they were less sensitive to the uncertain IMI, and overlooked concerns about potential future hunger. These findings are important because they illustrate that differences in discounting are expressed in short-term portion-size decisions and suggest that IMI insensitivity increases when meal timings are uncertain. Further research is needed to confirm whether these findings generalise to other populations. Copyright © 2017. Published by Elsevier Ltd.
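
    Although the abstract does not detail the scoring of the monetary delay-discounting task, such tasks are commonly summarized by fitting a hyperbolic discounting model to indifference points; the following sketch illustrates that idea with invented data.

```python
# Minimal sketch of fitting the common hyperbolic discounting model
# V = A / (1 + k*D) to indifference points (data and model are illustrative;
# the study's own scoring procedure is not described in the abstract).
import numpy as np
from scipy.optimize import curve_fit

delays_days = np.array([1, 7, 30, 90, 180, 365], dtype=float)
indifference = np.array([95, 85, 70, 55, 45, 35], dtype=float)  # value of a delayed $100

def hyperbolic(delay, k):
    return 100.0 / (1.0 + k * delay)

(k_hat,), _ = curve_fit(hyperbolic, delays_days, indifference, p0=[0.01])
print(f"estimated discounting rate k = {k_hat:.4f} per day")
```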

  8. Culture conditions for equine bone marrow mesenchymal stem cells and expression of key transcription factors during their differentiation into osteoblasts

    Science.gov (United States)

    2013-01-01

    Background The use of equine bone marrow mesenchymal stem cells (BMSC) is a novel method to improve fracture healing in horses. However, additional research is needed to identify optimal culture conditions and to determine the mechanisms involved in regulating BMSC differentiation into osteoblasts. The objectives of the experiments were to determine: 1) if autologous or commercial serum is better for proliferation and differentiation of equine BMSC into osteoblasts, and 2) the expression of key transcription factors during the differentiation of equine BMSC into osteoblasts. Equine BMSC were isolated from the sterna of 3 horses, treated with purchased fetal bovine serum (FBS) or autologous horse serum (HS), and cell proliferation determined. To induce osteoblast differentiation, cells were incubated with L-ascorbic acid-2-phosphate and glycerol-2-phosphate in the presence or absence of human bone morphogenetic protein2 (BMP2), dexamethasone (DEX), or combination of the two. Alkaline phosphatase (ALP) activity, a marker of osteoblast differentiation, was determined by ELISA. Total RNA was isolated from differentiating BMSC between d 0 to 18 to determine expression of runt-related transcription factor2 (Runx2), osterix (Osx), and T-box3 (Tbx3). Data were analyzed by ANOVA. Results Relative to control, FBS and HS increased cell number (133 ± 5 and 116 ± 5%, respectively; P  0.8). Runt-related transcription factor2 expression increased 3-fold (P equine BMSC into osteoblasts. In addition, expression of Runx2 and osterix increased and expression of Tbx3 is reduced during differentiation. PMID:24169030

  9. Development of a Dynamic Lidar Uncertainty Framework

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, Andrew [WindForS; Bonin, Timothy [CIRES/NOAA ESRL; Choukulkar, Aditya [CIRES/NOAA ESRL; Brewer, W. Alan [NOAA ESRL; Delgado, Ruben [University of Maryland Baltimore County

    2017-08-07

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict

  10. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  11. A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer, F.; Clifton, Andrew; Bonin, Timothy A.; Churchfield, Matthew J.

    2017-03-24

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards discuss uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device. However, real-world experience has shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we propose the development of a new lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from an operational wind farm to assess the ability of the framework to predict errors in lidar-measured wind speed.
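
    The condition-dependent idea can be sketched as follows: individual uncertainty components are modeled as functions of the current flow conditions and combined in quadrature, instead of being fixed for the whole campaign. The component formulas below are invented placeholders, not the framework's actual models.

```python
# Minimal sketch of condition-dependent lidar wind-speed uncertainty:
# components vary with the flow conditions and are combined in quadrature.
# The component models are illustrative placeholders only.
import math

def lidar_speed_uncertainty(wind_shear_exp, turbulence_intensity):
    u_calibration = 0.10                              # m/s, fixed calibration term
    u_shear = 0.50 * abs(wind_shear_exp - 0.14)       # grows away from a reference shear profile
    u_turbulence = 0.80 * turbulence_intensity        # grows with turbulence intensity
    return math.sqrt(u_calibration**2 + u_shear**2 + u_turbulence**2)

print(f"low-turbulence case : {lidar_speed_uncertainty(0.14, 0.05):.2f} m/s")
print(f"high-turbulence case: {lidar_speed_uncertainty(0.30, 0.20):.2f} m/s")
```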

  12. Stabilization of burn conditions in an ITER FEAT like Tokamak with uncertainties in the helium ash confinement time

    International Nuclear Information System (INIS)

    Vitela, J.E.

    2004-01-01

    In this work we demonstrate, using a two-temperature volume-averaged 0-D model, that robust stabilization of the burn conditions of a tokamak reactor with the ITER FEAT design parameters, with regard to the helium ash confinement time, can be achieved using Radial Basis Neural Networks (RBNN). The alpha particle thermalization time delay is taken into account in this model. The control actions, implemented by means of an RBNN, include the modulation of the D-T (deuterium and tritium) refueling rate, a neutral He-4 injection beam and the auxiliary heating powers to ions and electrons, all of them constrained to lie within allowable ranges. Here we assume that the tokamak follows the IPB98(y,2) scaling for the energy confinement time, while the helium ash confinement time is assumed to be independently estimated on-line. The D-T and helium ash particle confinement times are assumed to keep a constant relationship at all times. A noisy on-line estimate of the helium ash confinement time is simulated by corrupting it with pseudo-Gaussian noise. (author)

  13. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Disturbed conditions

    International Nuclear Information System (INIS)

    HELTON, JON CRAIG; BEAN, J.E.; ECONOMY, K.; GARNER, J.W.; MACKINNON, ROBERT J.; MILLER, JOEL D.; SCHREIBER, J.D.; VAUGHN, PALMER

    2000-01-01

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) are presented for two-phase flow in the vicinity of the repository under disturbed conditions resulting from drilling intrusions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformations are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure and brine flow from the repository to the Culebra Dolomite are potentially the most important in PA for the WIPP. Subsequent to a drilling intrusion, repository pressure was dominated by borehole permeability and was generally below the level (i.e., 8 MPa) that could potentially produce spallings and direct brine releases. Brine flow from the repository to the Culebra Dolomite tended to be small or nonexistent, with its occurrence and size also dominated by borehole permeability.
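
    The rank-transformation step used in these sensitivity analyses can be sketched as follows; the two-input toy response standing in for the repository model is purely illustrative, although it mimics the reported dominance of borehole permeability.

```python
# Minimal sketch of rank-transformed sensitivity analysis: correlate the ranks
# of sampled inputs with the ranks of an output to see which input dominates.
# The toy "repository pressure" response is invented for illustration.
import numpy as np
from scipy.stats import pearsonr, rankdata

rng = np.random.default_rng(1)
borehole_perm = 10 ** rng.uniform(-14, -11, 300)       # log-uniform permeability (m^2)
gas_gen_rate = rng.uniform(0.1, 2.0, 300)              # arbitrary units
pressure = 2.0e6 / (borehole_perm * 1e13) + 1.0e6 * gas_gen_rate  # toy response (Pa)

for name, x in [("borehole permeability", borehole_perm),
                ("gas generation rate", gas_gen_rate)]:
    # Pearson correlation of the ranks equals the Spearman rank correlation.
    r, _ = pearsonr(rankdata(x), rankdata(pressure))
    print(f"{name}: rank correlation with pressure = {r:+.2f}")
```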

  14. Application of data representation by fuzzy conditional propositions in the modeling of measurement uncertainty; Aplicacao da representacao de dados por proposicoes condicionais difusas na modelagem da incerteza de medicao

    Energy Technology Data Exchange (ETDEWEB)

    Magalhaes, A.N. de; Lambert-Torres, G.; Rissino, S.; Silva, M.F. da; Silva, L.E. Borges da; Carvalho, L.M.R. de

    2009-07-01

    It is not an easy task to frame measurement uncertainty problems by means of differential equations quickly and satisfactorily. Therefore, it is necessary to adapt the method of data representation by conditional fuzzy propositions for modeling measurement uncertainties and their propagation. This method provides a parametric adjustment of the fuzzy sets of the premises and of the consequence function of each rule, in the form of a parabola. The paper introduces the sources of error in measurements, the fundamentals of fuzzy logic, a description of the algorithmic method, and its application to error detection and the representation of global uncertainty.
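
    A minimal sketch of a fuzzy conditional proposition applied to a measurement reading is shown below; the membership function, the rules and the crude blending used as defuzzification are illustrative assumptions, not the paper's algorithm.

```python
# Minimal sketch of a fuzzy conditional proposition for measurement data
# (illustrative membership parameters and rules, not the paper's algorithm).

def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Rule 1: IF the reading is near 10.0 V THEN the error contribution is "small".
# Rule 2 (complement): otherwise the error contribution is "large".
reading = 10.03
activation = triangular(reading, 9.9, 10.0, 10.1)   # degree to which rule 1 fires

small_bound, large_bound = 0.05, 0.20               # V, rule consequences
blended_bound = activation * small_bound + (1.0 - activation) * large_bound
print(f"rule 1 activation = {activation:.2f}, blended error bound = {blended_bound:.3f} V")
```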

  15. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  16. Identification of key factors in Accelerated Low Water Corrosion through experimental simulation of tidal conditions: influence of stimulated indigenous microbiota

    NARCIS (Netherlands)

    Marty, F.; Gueuné, H.; Malard, E.; Sánchez-Amaya, J.M.; Sjögren, L.; Abbas, B.; Quillet, L.; van Loosdrecht, M.C.M.; Muyzer, G.

    2014-01-01

    Biotic and abiotic factors favoring Accelerated Low Water Corrosion (ALWC) on harbor steel structures remain unclear, warranting their study under controlled experimental tidal conditions. Initial stimulation of marine microbial consortia by a pulse of organic matter resulted in localized corrosion

  17. Development of Evaluation Code for MUF Uncertainty

    International Nuclear Information System (INIS)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan

    2015-01-01

    Material Unaccounted For (MUF) is the material balance evaluated from the measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of those measurements. Evaluating the MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce the radioactive waste from spent fuel. An evaluation code for analyzing the MUF uncertainty has been developed and verified using a sample problem from an IAEA reference. The MUF uncertainty can be calculated simply and quickly with this code, which is built on a graphical user interface for user-friendliness. It is also expected that the code will make sensitivity analyses of the MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new system for facilities under development.

  18. Development of Evaluation Code for MUF Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Material Unaccounted For (MUF) is the material balance evaluated from measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, there are many measurements made with different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of all of those measurements. Evaluating the MUF uncertainty is therefore essential for developing a safeguards system, including the nuclear measurement system for pyroprocessing, which is being developed to reduce radioactive waste from spent fuel at the Korea Atomic Energy Research Institute (KAERI). An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from the IAEA reference. The MUF uncertainty can be calculated simply and quickly with this evaluation code, which is built on a graphical user interface for ease of use. It is also expected that the code will make sensitivity analyses of the MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new system for facilities under development.

  19. Updated numerical model with uncertainty assessment of 1950-56 drought conditions on brackish-water movement within the Edwards aquifer, San Antonio, Texas

    Science.gov (United States)

    Brakefield, Linzy K.; White, Jeremy T.; Houston, Natalie A.; Thomas, Jonathan V.

    2015-01-01

    In 2010, the U.S. Geological Survey, in cooperation with the San Antonio Water System, began a study to assess the brackish-water movement within the Edwards aquifer (more specifically the potential for brackish-water encroachment into wells near the interface between the freshwater and brackish-water transition zones, referred to in this report as the transition-zone interface) and effects on spring discharge at Comal and San Marcos Springs under drought conditions using a numerical model. The quantitative targets of this study are to predict the effects of higher-than-average groundwater withdrawals from wells and drought-of-record rainfall conditions of 1950–56 on (1) dissolved-solids concentration changes at production wells near the transition-zone interface, (2) total spring discharge at Comal and San Marcos Springs, and (3) the groundwater head (head) at Bexar County index well J-17. The predictions of interest, and the parameters implemented into the model, were evaluated to quantify their uncertainty so the results of the predictions could be presented in terms of a 95-percent credible interval.

  20. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  1. Technical Requirements and Principles for the Standards Development of the Key Parts for Rotor Air-conditioning Compressors

    Institute of Scientific and Technical Information of China (English)

    Sun Min; Wen Yun; Fan Zhangzeng

    2011-01-01

    Introduction: Since 2000, air-conditioning sales have continued to grow, and the development of the air-conditioning market has created a booming market for compressors. At present, compressor production keeps rising and sales continue to reach new levels; the trend is shown in figure 1. The rotor compressor, with its simple structure, small volume, light weight, easily machined mechanical parts, reliable operation and other excellent characteristics, occupies the dominant position in the market. Compared with a reciprocating compressor in the same application, its size and weight are each reduced by about 40%~50%. It also has disadvantages, mainly a large friction loss: friction consumes about 10% of the compressor's total power input.

  2. ESPRIT-Like Two-Dimensional DOA Estimation for Monostatic MIMO Radar with Electromagnetic Vector Received Sensors under the Condition of Gain and Phase Uncertainties and Mutual Coupling.

    Science.gov (United States)

    Zhang, Dong; Zhang, Yongshun; Zheng, Guimei; Feng, Cunqian; Tang, Jun

    2017-10-26

    In this paper, we focus on the problem of two-dimensional direction of arrival (2D-DOA) estimation for monostatic MIMO radar with electromagnetic vector received sensors (MIMO-EMVSs) under the condition of gain and phase uncertainties (GPU) and mutual coupling (MC). GPU spoils the invariance property of the EMVSs in MIMO-EMVSs, so the effective ESPRIT algorithm cannot be used directly. We therefore put forward a C-SPD ESPRIT-like algorithm, which estimates the 2D-DOA and polarization station angle (PSA) based on the instrumental sensors method (ISM). The C-SPD ESPRIT-like algorithm obtains good angle estimation accuracy without knowledge of the GPU. Furthermore, it can be applied to arbitrary array configurations and has low complexity because it avoids the angle searching procedure. When MC and GPU exist together between the elements of the EMVSs, we derive a class of separated electromagnetic vector receivers and give the S-SPD ESPRIT-like algorithm to keep the approach feasible. It solves the problem of GPU and MC efficiently, and the array configuration can be arbitrary. The effectiveness of the proposed algorithms is verified by simulation results.

  3. ESPRIT-Like Two-Dimensional DOA Estimation for Monostatic MIMO Radar with Electromagnetic Vector Received Sensors under the Condition of Gain and Phase Uncertainties and Mutual Coupling

    Directory of Open Access Journals (Sweden)

    Dong Zhang

    2017-10-01

    Full Text Available In this paper, we focus on the problem of two-dimensional direction of arrival (2D-DOA) estimation for monostatic MIMO radar with electromagnetic vector received sensors (MIMO-EMVSs) under the condition of gain and phase uncertainties (GPU) and mutual coupling (MC). GPU spoils the invariance property of the EMVSs in MIMO-EMVSs, so the effective ESPRIT algorithm cannot be used directly. We therefore put forward a C-SPD ESPRIT-like algorithm, which estimates the 2D-DOA and polarization station angle (PSA) based on the instrumental sensors method (ISM). The C-SPD ESPRIT-like algorithm obtains good angle estimation accuracy without knowledge of the GPU. Furthermore, it can be applied to arbitrary array configurations and has low complexity because it avoids the angle searching procedure. When MC and GPU exist together between the elements of the EMVSs, we derive a class of separated electromagnetic vector receivers and give the S-SPD ESPRIT-like algorithm to keep the approach feasible. It solves the problem of GPU and MC efficiently, and the array configuration can be arbitrary. The effectiveness of the proposed algorithms is verified by simulation results.

  4. Business incubation in a university as a key condition for the formation of innovational micro entrepreneurship in a region

    Directory of Open Access Journals (Sweden)

    Anatoliy Viktorovich Grebenkin

    2012-09-01

    Full Text Available This paper substantiates the hypothesis of the special role of universities in creating an environment of innovational micro entrepreneurship in a region. The role of business incubators is highlighted, and the algorithm for selecting projects is described. The results of a three-year organizational and economic experiment (with changing conditions) on the functioning of the student business incubator at the Ural State University are shown. Various models of the selection of ideas and projects for different cycles of incubation are implemented. A decision on the establishment of the Entrepreneurship Center in the Institute of Management and Entrepreneurship is made. The Center's main task is to form a series of events to support the continuous generation of students' business ideas, finding a resonant response with the University experts and representatives of the business environment in the region. A student in the business incubation system plays a new role for a Russian university, that of a catalyst, i.e., the student directly acts as an element of positive feedback in the innovational system. It is shown that the catalytic path of the establishment and development of small high-tech business, Science to Business (StB), leads to the phenomenon of resonance, i.e., a sustainable innovation flow generated by the business incubator of the University. A poll of USU students in 2009-2011 (a sample from 660 to 854 respondents) confirmed their positive attitude towards entrepreneurship and allowed us to estimate the structure of the factors that hamper greater student participation in innovational business. Three blocks of factors were identified: the reluctance to take risks, the inaccessibility of material and financial resources, and the turbulence of the environment. A system of monitoring students' attitudes towards entrepreneurship, which allows adjusting the curriculum and creating institutional conditions for activation of innovative entrepreneurship

  5. Uncertainties in the Value of Bill Savings from Behind-the-Meter, Residential Photovoltaic Systems: The Roles of Electricity Market Conditions, Retail Rate Design, and Net Metering

    Science.gov (United States)

    Darghouth, Naim Richard

    Net metering has become a widespread policy mechanism in the U.S. for supporting customer adoption of distributed photovoltaics (PV), allowing customers with PV systems to reduce their electric bills by offsetting their consumption with PV generation, independent of the timing of the generation relative to consumption. Although net metering is one of the principal drivers for the residential PV market in the U.S., the academic literature on this policy has been sparse and this dissertation contributes to this emerging body of literature. This dissertation explores the linkages between the availability of net metering, wholesale electricity market conditions, retail rates, and the residential bill savings from behind-the-meter PV systems. First, I examine the value of the bill savings that customers receive under net metering and alternatives to net metering, and the associated role of retail rate design, based on current rates and a sample of approximately two hundred residential customers of California's two largest electric utilities. I find that the bill savings per kWh of PV electricity generated varies greatly, largely attributable to the increasing block structure of the California utilities' residential retail rates. I also find that net metering provides significantly greater bill savings than alternative compensation mechanisms based on avoided costs. However, retail electricity rates may shift as wholesale electricity market conditions change. I then investigate a potential change in market conditions -- increased solar PV penetrations -- on wholesale prices in the short-term based on the merit-order effect. This demonstrates the potential price effects of changes in market conditions, but also points to a number of methodological shortcomings of this method, motivating my usage of a long-term capacity investment and economic dispatch model to examine wholesale price effects of various wholesale market scenarios in the subsequent analysis. By developing

  6. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
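
    The following sketch illustrates the general idea of the approach described above (it is not McKay's code; the toy model, sample size and binning are assumptions): inputs are drawn with Latin hypercube sampling and an importance indicator for each input is formed from the ratio of the variance of the conditional prediction means to the total prediction variance.

      # Latin hypercube sampling of two inputs and a binned variance-ratio importance indicator.
      import random
      random.seed(0)

      def lhs(n):
          """One Latin hypercube sample of size n on [0, 1]."""
          cells = list(range(n))
          random.shuffle(cells)
          return [(c + random.random()) / n for c in cells]

      n = 2000
      x1, x2 = lhs(n), lhs(n)
      y = [3.0 * a + 0.5 * b + 0.2 * random.gauss(0.0, 1.0) for a, b in zip(x1, x2)]  # toy model

      def variance(v):
          m = sum(v) / len(v)
          return sum((u - m) ** 2 for u in v) / len(v)

      def importance(x, y, bins=20):
          """Variance of the conditional means of y given binned x, relative to the total variance of y."""
          groups = [[] for _ in range(bins)]
          for xi, yi in zip(x, y):
              groups[min(int(xi * bins), bins - 1)].append(yi)
          cond_means = [sum(g) / len(g) for g in groups if g]
          return variance(cond_means) / variance(y)

      print("importance of x1:", round(importance(x1, y), 2))   # dominant input
      print("importance of x2:", round(importance(x2, y), 2))   # minor input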

  7. Thiazolidine-2,4-dione derivatives: programmed chemical weapons for key protein targets of various pathological conditions.

    Science.gov (United States)

    Chadha, Navriti; Bahia, Malkeet Singh; Kaur, Maninder; Silakari, Om

    2015-07-01

    Thiazolidine-2,4-dione is an extensively explored heterocyclic nucleus for designing of novel agents implicated for a wide variety of pathophysiological conditions, that is, diabetes, diabetic complications, cancer, arthritis, inflammation, microbial infection, and melanoma, etc. The current paradigm of drug development has shifted to the structure-based drug design, since high-throughput screenings have continued to generate disappointing results. The gap between hit generation and drug establishment can be narrowed down by investigation of ligand interactions with its receptor protein. Therefore, it would always be highly beneficial to gain knowledge of molecular level interactions between specific protein target and developed ligands; since this information can be maneuvered to design new molecules with improved protein fitting. Thus, considering this aspect, we have corroborated the information about molecular (target) level implementations of thiazolidine-2,4-diones (TZD) derivatives having therapeutic implementations such as, but not limited to, anti-diabetic (glitazones), anti-cancer, anti-arthritic, anti-inflammatory, anti-oxidant and anti-microbial, etc. The structure based SAR of TZD derivatives for various protein targets would serve as a benchmark for the alteration of existing ligands to design new ones with better binding interactions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. The Key to Acetate: Metabolic Fluxes of Acetic Acid Bacteria under Cocoa Pulp Fermentation-Simulating Conditions

    Science.gov (United States)

    Adler, Philipp; Frey, Lasse Jannis; Berger, Antje; Bolten, Christoph Josef; Hansen, Carl Erik

    2014-01-01

    Acetic acid bacteria (AAB) play an important role during cocoa fermentation, as their main product, acetate, is a major driver for the development of the desired cocoa flavors. Here, we investigated the specialized metabolism of these bacteria under cocoa pulp fermentation-simulating conditions. A carefully designed combination of parallel 13C isotope labeling experiments allowed the elucidation of intracellular fluxes in the complex environment of cocoa pulp, when lactate and ethanol were included as primary substrates among undefined ingredients. We demonstrate that AAB exhibit a functionally separated metabolism during coconsumption of two-carbon and three-carbon substrates. Acetate is almost exclusively derived from ethanol, while lactate serves for the formation of acetoin and biomass building blocks. Although this is suboptimal for cellular energetics, this allows maximized growth and conversion rates. The functional separation results from a lack of phosphoenolpyruvate carboxykinase and malic enzymes, typically present in bacteria to interconnect metabolism. In fact, gluconeogenesis is driven by pyruvate phosphate dikinase. Consequently, a balanced ratio of lactate and ethanol is important for the optimum performance of AAB. As lactate and ethanol are individually supplied by lactic acid bacteria and yeasts during the initial phase of cocoa fermentation, respectively, this underlines the importance of a well-balanced microbial consortium for a successful fermentation process. Indeed, AAB performed the best and produced the largest amounts of acetate in mixed culture experiments when lactic acid bacteria and yeasts were both present. PMID:24837393

  9. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  10. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...

  11. The impact of the uncertainty in the initial soil moisture condition of irrigated areas on the spatiotemporal characteristics of convective activity in Central Greece

    Science.gov (United States)

    Kotsopoulos, Stylianos; Ioannis, Tegoulias; Ioannis, Pytharoulis; Stergios, Kartsios; Dimitrios, Bampzelis; Theodore, Karacostas

    2015-04-01

    The region of Thessaly is the second largest plain in Greece and plays a vital role in the financial life of the country because of its significant agricultural production. The intensive and extensive cultivation of irrigated crops, in combination with population increase and the alteration of precipitation patterns due to climate change, often leads the region to experience severe drought conditions, especially during the warm period of the year. The aim of the DAPHNE project is to tackle the problem of drought in this area by means of Weather Modification. In the framework of the DAPHNE project, the numerical weather prediction model WRF-ARW 3.5.1 is used to provide operational forecasts and hindcasts for the region of Thessaly. The goal of this study is to investigate the impact of the uncertainty in the initial soil moisture condition of irrigated areas on the spatiotemporal characteristics of convective activity in the region of interest. To this end, six cases under the six most frequent synoptic conditions associated with convective activity in the region of interest are utilized, considering six different soil moisture initialization scenarios. In the first scenario (Control Run), the model is initialized with the surface soil moisture of the ECMWF analysis data, which usually does not take into account the modification of soil moisture due to agricultural activity in the area of interest. In the other five scenarios (Experiments 1-5) the soil moisture in the upper soil layers of the study area is modified from -50% to 50% of field capacity (-50%FC, -25%FC, FC, 25%FC, 50%FC) for the irrigated cropland. Three model domains, covering Europe, the Mediterranean Sea and northern Africa (d01), the wider area of Greece (d02) and central Greece - Thessaly region (d03), are used at horizontal grid spacings of 15 km, 5 km and 1 km, respectively. ECMWF operational analyses at 6-hourly intervals (0.25°x0.25° lat.-long.) are imported as initial and

  12. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models....... This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands...... in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...

  13. Integrating uncertainties for climate change mitigation

    Science.gov (United States)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping global average temperature increase to below 2°C has emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known, but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, uncertainties resulting from our incomplete knowledge about how the climate system precisely reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), which technologies will be available (technological uncertainty and choices), when we choose to start acting globally on climate change (political choices), and how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by

  14. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model (including Lommel-Seelinger, ROLO, McEwen, Minnaert and Akimov) of Bennu with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present the analysis of the error for the photometric corrections. Based on our testing data sets, we find: 1. The model uncertainties are only correct when we use the covariance matrix in the calculation, because the parameters are highly correlated. 2. There is no evidence that any single parameter dominates in any of the models. 3. Both the model error and the data error contribute comparably to the final correction error. 4. We tested the uncertainty module on simulated and real data sets and find that model performance depends on the data coverage and data quality. These tests gave us a better understanding of how the different models behave in different cases. 5. The L-S model is more reliable than the others, perhaps because the simulated data are based on the L-S model; however, the test on real data (SPDIF) also shows a slight advantage for L-S. ROLO is not reliable for calculating Bond albedo. The uncertainty of the McEwen model is large in most cases. Akimov behaves unphysically on the SOPIE 1 data. 6. L-S is therefore the default choice; this conclusion is based mainly on our tests on the SOPIE data and IPDIF.
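
    Point 1 above, the need for the full covariance matrix when parameters are correlated, can be illustrated with a first-order propagation sketch; the sensitivities and covariance values below are invented and are not taken from the OSIRIS-REx analysis.

      # First-order propagation u^2 = g^T C g of correlated parameter uncertainties
      # to a model prediction, with and without the off-diagonal covariance terms.
      import numpy as np

      g = np.array([0.8, -0.5])            # sensitivities of the prediction to two model parameters
      C = np.array([[0.040, 0.035],        # parameter covariance matrix (strong positive correlation)
                    [0.035, 0.040]])

      u_full = float(np.sqrt(g @ C @ g))                     # uses the full covariance matrix
      u_diag = float(np.sqrt(g @ np.diag(np.diag(C)) @ g))   # ignores the correlations
      print(f"with covariance: {u_full:.3f}, diagonal only: {u_diag:.3f}")   # ~0.087 vs ~0.189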

  15. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
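
    For the linear case Ax=b mentioned above, the adjoint idea can be shown in a few lines (a sketch of the standard adjoint sensitivity calculation, not of the reformulation or sampling scheme proposed in the report): for a scalar response J = c^T x, a single adjoint solve A^T lambda = c yields the sensitivities dJ/db_i = lambda_i for all inputs at once.

      # Adjoint sensitivities for A x = b with scalar response J = c^T x,
      # checked against brute-force finite differences.
      import numpy as np

      A = np.array([[4.0, 1.0, 0.0],
                    [1.0, 3.0, 1.0],
                    [0.0, 1.0, 2.0]])
      b = np.array([1.0, 2.0, 3.0])
      c = np.array([1.0, 0.0, 1.0])           # response picks out x1 + x3

      lam = np.linalg.solve(A.T, c)           # one adjoint solve gives every dJ/db_i

      x0 = np.linalg.solve(A, b)
      J0 = c @ x0
      eps = 1e-6
      fd = []
      for i in range(len(b)):
          bp = b.copy()
          bp[i] += eps
          fd.append((c @ np.linalg.solve(A, bp) - J0) / eps)

      print("adjoint sensitivities:  ", np.round(lam, 6))
      print("finite-difference check:", np.round(fd, 6))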

  16. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  17. Short-term selective alleviation of glucotoxicity and lipotoxicity ameliorates the suppressed expression of key β-cell factors under diabetic conditions

    Energy Technology Data Exchange (ETDEWEB)

    Shimo, Naoki [Department of Metabolic Medicine, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita, Osaka, 565-0871 (Japan); Matsuoka, Taka-aki, E-mail: matsuoka@endmet.med.osaka-u.ac.jp [Department of Metabolic Medicine, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita, Osaka, 565-0871 (Japan); Miyatsuka, Takeshi [Department of Metabolic Medicine, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita, Osaka, 565-0871 (Japan); Department of Metabolism and Endocrinology, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunky-ku, Tokyo, 113-8421 (Japan); Takebe, Satomi; Tochino, Yoshihiro; Takahara, Mitsuyoshi [Department of Metabolic Medicine, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita, Osaka, 565-0871 (Japan); Kaneto, Hideaki [Division of Diabetes, Endocrinology and Metabolism, Kawasaki Medical School, 577 Matsushima, Kurashiki-city, Okayama, 701-0192 (Japan); Shimomura, Iichiro [Department of Metabolic Medicine, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita, Osaka, 565-0871 (Japan)

    2015-11-27

    Alleviation of hyperglycaemia and hyperlipidemia improves pancreatic β-cell function in type 2 diabetes. However, the underlying molecular mechanisms are still not well clarified. In this study, we aimed to elucidate how the expression of key β-cell factors is altered by the short-term selective alleviation of glucotoxicity or lipotoxicity. We treated db/db mice for one week with empagliflozin and/or bezafibrate to alleviate glucotoxicity and/or lipotoxicity, respectively. The gene expression levels of Pdx1 and Mafa, and their potential targets, insulin 1, Slc2a2, and Glp1r, were higher in the islets of empagliflozin-treated mice, and levels of insulin 2 were higher in mice treated with both reagents, than in untreated mice. Moreover, compared to the pretreatment levels, Mafa and insulin 1 expression increased in empagliflozin-treated mice, and Slc2a2 increased in combination-treated mice. In addition, empagliflozin treatment enhanced β-cell proliferation as assessed by Ki-67 immunostaining. Our data clearly demonstrated that the one-week selective alleviation of glucotoxicity improved the expression levels of the key β-cell factors critical for β-cell function relative to pretreatment levels, and that the alleviation of lipotoxicity along with glucotoxicity augmented these favorable effects under diabetic conditions. - Highlights: • One-week selective reduction of gluco- and lipo-toxicity in db/db mice was performed. • Selective glucotoxicity reduction increases the expression of key pancreatic β-cell factors. • Selective glucotoxicity reduction improves β-cell factors over pretreatment levels. • Selective glucotoxicity reduction shifts β-cell mass toward an increase. • Lipotoxicity reduction has additive effects on glucotoxicity reduction.

  18. Developing an Agent-Based Simulation System for Post-Earthquake Operations in Uncertainty Conditions: A Proposed Method for Collaboration among Agents

    Directory of Open Access Journals (Sweden)

    Navid Hooshangi

    2018-01-01

    Full Text Available Agent-based modeling is a promising approach for developing simulation tools for natural hazards in different areas, such as urban search and rescue (USAR) operations. The present study aimed to develop a dynamic agent-based simulation model for post-earthquake USAR operations using a geospatial information system and multi-agent systems (GIS and MASs, respectively). We also propose an approach for dynamic task allocation and establishing collaboration among agents based on the contract net protocol (CNP) and the interval-based Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), which considers uncertainty in natural hazards information during agents' decision-making. The decision-making weights were calculated by the analytic hierarchy process (AHP). In order to implement the system, the earthquake environment was simulated and the damage to buildings and the number of injuries were calculated for Tehran's District 3: 23%, 37%, 24% and 16% of buildings were in the slight, moderate, extensive and completely vulnerable classes, respectively. The number of injured persons was calculated to be 17,238. Numerical results in 27 scenarios showed that the proposed method is more accurate than the CNP method in terms of USAR operational time (at least a 13% decrease) and the number of human fatalities (at least a 9% decrease). In an interval uncertainty analysis of the proposed simulated system, the lower and upper bounds of the uncertain responses are evaluated. The overall results showed that considering uncertainty in task allocation can be highly advantageous in the disaster environment. Such systems can be used to manage and prepare for natural hazards.
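
    For readers unfamiliar with TOPSIS, the following crisp sketch ranks candidate rescue tasks for a single team (the paper uses an interval-based TOPSIS combined with CNP; the decision matrix, weights and criteria here are purely hypothetical):

      # Crisp TOPSIS ranking of candidate tasks; the weights could come from AHP.
      import numpy as np

      # rows = candidate tasks, columns = criteria: [injured persons, travel time (min), building damage ratio]
      X = np.array([[12.0, 15.0, 0.7],
                    [30.0, 40.0, 0.9],
                    [ 8.0,  5.0, 0.4]])
      weights = np.array([0.5, 0.3, 0.2])
      benefit = np.array([True, False, True])     # more injured/damage raises priority; shorter travel time is better

      R = X / np.sqrt((X ** 2).sum(axis=0))       # vector normalization
      V = R * weights                             # weighted normalized decision matrix
      ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
      anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
      d_plus  = np.sqrt(((V - ideal) ** 2).sum(axis=1))
      d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
      closeness = d_minus / (d_plus + d_minus)    # higher closeness = better candidate

      print("closeness scores:", np.round(closeness, 3))
      print("best task index: ", int(closeness.argmax()))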

  19. Effect of increasing body condition on key regulators of fat metabolism in subcutaneous adipose tissue depot and circulation of nonlactating dairy cows.

    Science.gov (United States)

    Locher, L; Häussler, S; Laubenthal, L; Singh, S P; Winkler, J; Kinoshita, A; Kenéz, Á; Rehage, J; Huber, K; Sauerwein, H; Dänicke, S

    2015-02-01

    In response to negative energy balance, overconditioned cows mobilize more body fat than thin cows and subsequently are prone to develop metabolic disorders. Changes in adipose tissue (AT) metabolism are barely investigated in overconditioned cows. Therefore, the objective was to investigate the effect of increasing body condition on key regulator proteins of fat metabolism in subcutaneous AT and circulation of dairy cows. Nonlactating, nonpregnant dairy cows (n=8) investigated in the current study served as a model to elucidate the changes in the course of overcondition independent from physiological changes related to gestation, parturition, and lactation. Cows were fed diets with increasing portions of concentrate during the first 6wk of the experiment until 60% were reached, which was maintained for 9wk. Biopsy samples from AT of the subcutaneous tailhead region were collected every 8wk, whereas blood was sampled monthly. Within the experimental period cows had an average BW gain of 243±33.3 kg. Leptin and insulin concentrations were increased until wk 12. Based on serum concentrations of glucose, insulin, and nonesterified fatty acids, the surrogate indices for insulin sensitivity were calculated. High-concentrate feeding led to decreased quantitative insulin sensitivity check index and homeostasis model assessment due to high insulin and glucose concentrations indicating decreased insulin sensitivity. Adiponectin, an adipokine-promoting insulin sensitivity, decreased in subcutaneous AT, but remained unchanged in the circulation. The high-concentrate diet affected key enzymes reflecting AT metabolism such as AMP-activated protein kinase and hormone-sensitive lipase, both represented as the proportion of the phosphorylated protein to total protein, as well as fatty acid synthase. The extent of phosphorylation of AMP-activated protein kinase and the protein expression of fatty acid synthase were inversely regulated throughout the experimental period, whereas

  20. Validation of newly developed and redesigned key indicator methods for assessment of different working conditions with physical workloads based on mixed-methods design: a study protocol.

    Science.gov (United States)

    Klussmann, Andre; Liebers, Falk; Brandstädt, Felix; Schust, Marianne; Serafin, Patrick; Schäfer, Andreas; Gebhardt, Hansjürgen; Hartmann, Bernd; Steinberg, Ulf

    2017-08-21

    The impact of work-related musculoskeletal disorders is considerable. The assessment of work tasks with physical workloads is crucial to estimate the work-related health risks of exposed employees. Three key indicator methods are available for risk assessment regarding manual lifting, holding and carrying of loads; manual pulling and pushing of loads; and manual handling operations. Three further KIMs for risk assessment regarding whole-body forces, awkward body postures and body movement have been developed de novo. In addition, the development of a newly drafted combined method for mixed exposures is planned. All methods will be validated regarding face validity, reliability, convergent validity, criterion validity and further aspects of utility under practical conditions. As part of the joint project MEGAPHYS (multilevel risk assessment of physical workloads), a mixed-methods study is being designed for the validation of KIMs and conducted in companies of different sizes and branches in Germany. Workplaces are documented and analysed by observations, applying KIMs, interviews and assessment of environmental conditions. Furthermore, a survey among the employees at the respective workplaces takes place with standardised questionnaires, interviews and physical examinations. It is intended to include 1200 employees at 120 different workplaces. For analysis of the quality criteria, recommendations of the COSMIN checklist (COnsensus-based Standards for the selection of health Measurement INstruments) will be taken into account. The study was planned and conducted in accordance with the German Medical Professional Code and the Declaration of Helsinki as well as the German Federal Data Protection Act. The design of the study was approved by ethics committees. We intend to publish the validated KIMs in 2018. Results will be published in peer-reviewed journals, presented at international meetings and disseminated to actual users for practical application. © Article

  1. Measurement uncertainty: Friend or foe?

    Science.gov (United States)

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratory) may know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  2. Assessment of key transport parameters in a karst system under different dynamic conditions based on tracer experiments: the Jeita karst system, Lebanon

    Science.gov (United States)

    Doummar, Joanna; Margane, Armin; Geyer, Tobias; Sauter, Martin

    2018-03-01

    Artificial tracer experiments were conducted in the mature karst system of Jeita (Lebanon) under various flow conditions using surface and subsurface tracer injection points, to determine the variation of transport parameters (attenuation of peak concentration, velocity, transit times, dispersivity, and proportion of immobile and mobile regions) along fast and slow flow pathways. Tracer breakthrough curves (TBCs) observed at the karst spring were interpreted using a two-region nonequilibrium approach (2RNEM) to account for the skewness in the TBCs' long tailings. The conduit test results revealed a discharge threshold in the system dynamics, beyond which the transport parameters vary significantly. The polynomial relationship between transport velocity and discharge can be related to the variation of the conduit's cross-sectional area. Longitudinal dispersivity in the conduit system is not a constant value (α = 7-10 m) and decreases linearly with increasing flow rate because of dilution effects. Additionally, the proportion of immobile regions (arising from conduit irregularities) increases with decreasing water level in the conduit system. From tracer tests with injection at the surface, longitudinal dispersivity values are found to be large (8-27 m). The tailing observed in some TBCs is generated in the unsaturated zone before the tracer actually arrives at the major subsurface conduit draining the system. This work allows the estimation and prediction of the key transport parameters in karst aquifers. It shows that these parameters vary with time and flow dynamics, and they reflect the geometry of the flow pathway and the origin of infiltrating (potentially contaminated) recharge.
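
    As a simplified illustration of how transport velocity and longitudinal dispersivity can be extracted from a tracer breakthrough curve (a temporal-moments sketch under a small-dispersion assumption, not the two-region nonequilibrium fitting used in the study; the conduit length and the synthetic curve are invented):

      # Temporal moments of a synthetic breakthrough curve observed at distance L.
      import numpy as np

      L = 5000.0                                    # assumed conduit length, m
      t = np.linspace(1.0, 200.0, 400)              # hours since injection
      c = np.exp(-0.5 * ((t - 60.0) / 4.0) ** 2)    # synthetic, roughly Gaussian tracer concentration

      m0 = np.trapz(c, t)
      t_mean = np.trapz(t * c, t) / m0                      # mean residence time
      t_var  = np.trapz((t - t_mean) ** 2 * c, t) / m0      # temporal variance

      v = L / t_mean                                # mean transport velocity, m/h
      alpha = L * t_var / (2.0 * t_mean ** 2)       # longitudinal dispersivity, m (small-dispersion approximation)
      print(f"v = {v:.1f} m/h, dispersivity = {alpha:.1f} m")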

  3. The Uncertainty Multiplier and Business Cycles

    OpenAIRE

    Saijo, Hikaru

    2013-01-01

    I study a business cycle model where agents learn about the state of the economy by accumulating capital. During recessions, agents invest less, and this generates noisier estimates of macroeconomic conditions and an increase in uncertainty. The endogenous increase in aggregate uncertainty further reduces economic activity, which in turn leads to more uncertainty, and so on. Thus, through changes in uncertainty, learning gives rise to a multiplier effect that amplifies business cycles. I use ...

  4. A pilot study using scripted ventilation conditions to identify key factors affecting indoor pollutant concentration and air exchange rate in a residence.

    Science.gov (United States)

    Johnson, Ted; Myers, Jeffrey; Kelly, Thomas; Wisbith, Anthony; Ollison, Will

    2004-01-01

    A pilot study was conducted using an occupied, single-family test house in Columbus, OH, to determine whether a script-based protocol could be used to obtain data useful in identifying the key factors affecting air-exchange rate (AER) and the relationship between indoor and outdoor concentrations of selected traffic-related air pollutants. The test script called for hourly changes to elements of the test house considered likely to influence air flow and AER, including the position (open or closed) of each window and door and the operation (on/off) of the furnace, air conditioner, and ceiling fans. The script was implemented over a 3-day period (January 30-February 1, 2002) during which technicians collected hourly-average data for AER, indoor, and outdoor air concentrations for six pollutants (benzene, formaldehyde (HCHO), polycyclic aromatic hydrocarbons (PAH), carbon monoxide (CO), nitric oxide (NO), and nitrogen oxides (NO(x))), and selected meteorological variables. Consistent with expectations, AER tended to increase with the number of open exterior windows and doors. The 39 AER values measured during the study when all exterior doors and windows were closed varied from 0.36 to 2.29 h(-1) with a geometric mean (GM) of 0.77 h(-1) and a geometric standard deviation (GSD) of 1.435. The 27 AER values measured when at least one exterior door or window was opened varied from 0.50 to 15.8 h(-1) with a GM of 1.98 h(-1) and a GSD of 1.902. AER was also affected by temperature and wind speed, most noticeably when exterior windows and doors were closed. Results of a series of stepwise linear regression analyses suggest that (1) outdoor pollutant concentration and (2) indoor pollutant concentration during the preceding hour were the "variables of choice" for predicting indoor pollutant concentration in the test house under the conditions of this study. Depending on the pollutant and ventilation conditions, one or more of the following variables produced a small, but
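
    The geometric-mean and geometric-standard-deviation summaries quoted above are straightforward to reproduce for any set of hourly AER values; the sketch below uses made-up numbers, not the study's data.

      # Geometric mean (GM) and geometric standard deviation (GSD) of hourly AER values.
      import math

      aer = [0.40, 0.55, 0.70, 0.80, 0.95, 1.20, 1.60]   # hypothetical AER values, h^-1
      logs = [math.log(a) for a in aer]
      mean_log = sum(logs) / len(logs)
      sd_log = math.sqrt(sum((x - mean_log) ** 2 for x in logs) / (len(logs) - 1))
      gm, gsd = math.exp(mean_log), math.exp(sd_log)
      print(f"GM = {gm:.2f} h^-1, GSD = {gsd:.2f}")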

  5. Uncertainty analysis in safety assessment

    International Nuclear Information System (INIS)

    Lemos, Francisco Luiz de; Sullivan, Terry

    1997-01-01

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology, geochemistry, etc. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions leave safety assessment projections filled with uncertainty. This paper addresses the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author)

  6. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  7. Supporting Qualified Database for Uncertainty Evaluation

    International Nuclear Information System (INIS)

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; Lisovyy, O.; D'Auria, F.

    2013-01-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging, on one side, suitable experimental data and, on the other side, qualified code calculation results. This aspect is mostly connected with the second approach to uncertainty mentioned above, but it can also be used in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the materials and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The EH (Engineering Handbook) of the input nodalization

  8. Supporting qualified database for uncertainty evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; D'Auria, F. [Nuclear Research Group of San Piero A Grado, Univ. of Pisa, Via Livornese 1291, 56122 Pisa (Italy)

    2012-07-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging, on one side, suitable experimental data and, on the other side, qualified code calculation results. This aspect is mostly connected with the second approach to uncertainty mentioned above, but it can also be used in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the materials and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The EH (Engineering

  9. Estimates of Uncertainty around the RBA's Forecasts

    OpenAIRE

    Peter Tulip; Stephanie Wallace

    2012-01-01

    We use past forecast errors to construct confidence intervals and other estimates of uncertainty around the Reserve Bank of Australia's forecasts of key macroeconomic variables. Our estimates suggest that uncertainty about forecasts is high. We find that the RBA's forecasts have substantial explanatory power for the inflation rate but not for GDP growth.
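
    The general recipe, building an empirical interval around a point forecast from the distribution of past forecast errors, can be sketched as follows (illustrative numbers only, not the RBA's errors or code):

      # Empirical confidence interval from historical forecast errors (outcome minus forecast).
      import numpy as np

      past_errors = np.array([-1.1, -0.6, -0.4, -0.2, 0.0, 0.1, 0.3, 0.5, 0.8, 1.2])  # percentage points
      point_forecast = 2.5                                                            # per cent

      lo, hi = np.percentile(past_errors, [5, 95])
      print(f"90% interval: {point_forecast + lo:.2f} to {point_forecast + hi:.2f} per cent")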

  10. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    Science.gov (United States)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  11. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost
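
    One family of low-cost alternatives to cubic-cost factorizations, and not necessarily the exact algorithm of this paper, is stochastic estimation of the diagonal of the inverse using random probe vectors and repeated linear solves; a sketch with invented data:

      # Stochastic (Rademacher-probe) estimator of diag(inv(A)) for a symmetric positive
      # definite matrix; in practice the dense solve below would be replaced by an
      # iterative, matrix-free solver such as conjugate gradients.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      B = rng.standard_normal((n, n))
      A = B @ B.T + n * np.eye(n)                 # well-conditioned SPD "covariance" matrix

      num = np.zeros(n)
      den = np.zeros(n)
      for _ in range(200):                        # number of probe vectors
          v = rng.choice([-1.0, 1.0], size=n)     # Rademacher probe
          x = np.linalg.solve(A, v)               # solve A x = v
          num += v * x
          den += v * v
      diag_est = num / den

      exact = np.diag(np.linalg.inv(A))
      print("max relative error:", float(np.max(np.abs(diag_est - exact) / exact)))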

  12. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurements uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurements errors (accuracy or bias and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty; and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that is significantly different from calibration standards matrix). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurement will be reviewed in this report and recommendations made for improving measurement uncertainties

  13. Impact of inherent meteorology uncertainty on air quality ...

    Science.gov (United States)

    It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is important to understand how uncertainties in these inputs affect the simulated concentrations. Ensembles are one method to explore how uncertainty in meteorology affects air pollution concentrations. Most studies explore this uncertainty by running different meteorological models or the same model with different physics options and in some cases combinations of different meteorological and air quality models. While these have been shown to be useful techniques in some cases, we present a technique that leverages the initial condition perturbations of a weather forecast ensemble, namely, the Short-Range Ensemble Forecast system to drive the four-dimensional data assimilation in the Weather Research and Forecasting (WRF)-Community Multiscale Air Quality (CMAQ) model with a key focus being the response of ozone chemistry and transport. Results confirm that a sizable spread in WRF solutions, including common weather variables of temperature, wind, boundary layer depth, clouds, and radiation, can cause a relatively large range of ozone-mixing ratios. Pollutant transport can be altered by hundreds of kilometers over several days. Ozone-mixing ratios of the ensemble can vary as much as 10–20 ppb

  14. Treatment of uncertainty in low-level waste performance assessment

    International Nuclear Information System (INIS)

    Kozak, M.W.; Olague, N.E.; Gallegos, D.P.; Rao, R.R.

    1991-01-01

    Uncertainties arise from a number of different sources in low-level waste performance assessment. In this paper the types of uncertainty are reviewed, and existing methods for quantifying and reducing each type of uncertainty are discussed. These approaches are examined in the context of the current low-level radioactive waste regulatory performance objectives, which are deterministic. The types of uncertainty discussed in this paper are model uncertainty, uncertainty about future conditions, and parameter uncertainty. The advantages and disadvantages of available methods for addressing uncertainty in low-level waste performance assessment are presented. 25 refs

  15. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  16. Uncertainty Communication. Issues and good practice

    International Nuclear Information System (INIS)

    Kloprogge, P.; Van der Sluijs, J.; Wardekker, A.

    2007-12-01

    In 2003 the Netherlands Environmental Assessment Agency (MNP) published the RIVM/MNP Guidance for Uncertainty Assessment and Communication. The Guidance assists in dealing with uncertainty in environmental assessments. Dealing with uncertainty is essential because assessment results regarding complex environmental issues are of limited value if the uncertainties have not been taken into account adequately. A careful analysis of uncertainties in an environmental assessment is required, but even more important is the effective communication of these uncertainties in the presentation of assessment results. The Guidance yields rich and differentiated insights in uncertainty, but the relevance of this uncertainty information may vary across audiences and uses of assessment results. Therefore, the reporting of uncertainties is one of the six key issues that is addressed in the Guidance. In practice, users of the Guidance felt a need for more practical assistance in the reporting of uncertainty information. This report explores the issue of uncertainty communication in more detail, and contains more detailed guidance on the communication of uncertainty. In order to make this a 'stand alone' document several questions that are mentioned in the detailed Guidance have been repeated here. This document thus has some overlap with the detailed Guidance. Part 1 gives a general introduction to the issue of communicating uncertainty information. It offers guidelines for (fine)tuning the communication to the intended audiences and context of a report, discusses how readers of a report tend to handle uncertainty information, and ends with a list of criteria that uncertainty communication needs to meet to increase its effectiveness. Part 2 helps writers to analyze the context in which communication takes place, and helps to map the audiences, and their information needs. It further helps to reflect upon anticipated uses and possible impacts of the uncertainty information on the

  17. Decision-Making under Criteria Uncertainty

    Science.gov (United States)

    Kureychik, V. M.; Safronenkova, I. B.

    2018-05-01

    Uncertainty is an essential part of any decision-making procedure. The paper deals with the problem of decision-making under criteria uncertainty. In this context, decision-making under uncertainty and the types and conditions of uncertainty were examined. The decision-making problem under uncertainty was formalized. A modification of a mathematical decision-support method under uncertainty that uses ontologies was proposed; a distinctive feature of the developed method is its use of ontologies as base elements. The goal of this work is the development of a decision-making method under criteria uncertainty that uses ontologies in the area of multilayer board design. The method is aimed at improving the technical and economic characteristics of the examined domain.

  18. Impact of Climate Change. Policy Uncertainty in Power Investment

    International Nuclear Information System (INIS)

    Blyth, W.; Yang, M.

    2006-10-01

    Climate change policies are being introduced or actively considered in all IEA member countries, changing the investment conditions and technology choices in the energy sector. Many of these policies are at a formative stage, and policy uncertainty is currently high. The objective of this paper is to quantify the impacts of climate change policy on power investment. We use a Real Options Analysis approach and model uncertain carbon and fuel prices as stochastic variables. The analysis compares the effects of climate policy uncertainty with fuel price uncertainty, showing the relative importance of these sources of risk for different technologies. This paper considers views on the importance of climate policy risk, how it is managed, and how it might affect investment behaviour. The implications for policymakers are analyzed, allowing the key messages to be transferred into policy design decisions. We found that in many cases, the dominant risks facing base-load generation investment decisions will be market risks associated with electricity and fuel prices. However, under certain conditions and for some technologies, climate policy uncertainty can be an important risk factor, creating an incentive to delay investment and raising investment thresholds. This paper concludes that government climate change policies to promote investment in low-carbon technologies should aim to overcome this incentive to delay by sending long-term investment signals backed up by strengthened international policy action to enhance domestic policy credibility
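
    The incentive to delay can be illustrated with a toy real-options calculation (not the model used in the paper): a carbon price following geometric Brownian motion, an irreversible low-carbon investment, and the value of waiting one year before committing. All parameter values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Invented parameters: carbon price today, drift, volatility (policy risk),
    # payoff slope, irreversible investment cost, discount rate.
    C0, mu, sigma = 20.0, 0.03, 0.35
    a, I, r = 8.0, 180.0, 0.05

    def npv(carbon_price):
        """Toy project value as a function of the carbon price."""
        return a * carbon_price - I

    invest_now = max(npv(C0), 0.0)   # commit today at the current price

    # Wait one year, observe the stochastic carbon price, invest only if profitable.
    C1 = C0 * np.exp((mu - 0.5 * sigma**2) + sigma * rng.normal(size=100_000))
    wait = np.exp(-r) * np.maximum(npv(C1), 0.0).mean()

    print(f"invest now: {invest_now:.1f}   wait one year: {wait:.1f}")
    print(f"option value of delay: {wait - invest_now:.1f}")
    ```

    Raising sigma, the stand-in for policy uncertainty, increases the value of waiting relative to investing now, which is the threshold-raising effect the record describes.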

  19. INFERENCIA DIFUSA APLICADA A LA INGENIERÍA CONCURRENTE PARA EL DISEÑO DE PRODUCTOS DE MANUFACTURA EN CONDICIONES DE INCERTIDUMBRE FUZZY INFERENCE APPLIED TO CONCURRENT ENGINEERING FOR MANUFACTURING PRODUCT DESIGN UNDER CONDITIONS OF UNCERTAINTY

    Directory of Open Access Journals (Sweden)

    Martín Darío Arango Serna

    2012-12-01

    Full Text Available In this article a fuzzy inference model is developed for decision making under conditions of uncertainty, applied to the design of products under a concurrent engineering scheme. Customer requirements and the criteria used by the different interdisciplinary teams to evaluate a particular design are represented as fuzzy variables. The model developed here is applied to a garment company.
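
    A minimal sketch of the kind of fuzzy evaluation described, with invented membership functions and a single Mamdani-style rule; it is not the authors' model.

    ```python
    import numpy as np

    def high(x, lo=5.0, hi=10.0):
        """Ramp membership function for the fuzzy set 'high' on a 0-10 rating scale."""
        return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

    # Hypothetical fuzzy ratings from two concurrent-engineering teams.
    customer_satisfaction = 7.2     # marketing / customer requirements team
    manufacturability = 5.5         # production team

    # Rule: IF satisfaction is high AND manufacturability is high
    # THEN the design is acceptable (AND taken as the minimum).
    acceptability = min(high(customer_satisfaction), high(manufacturability))
    print(f"degree to which the design is acceptable: {acceptability:.2f}")
    ```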

  20. Phenomenon of Uncertainty as a Subjective Experience

    Directory of Open Access Journals (Sweden)

    Lifintseva A.A.

    2018-04-01

    Full Text Available The phenomenon of patients' uncertainty in illness is discussed and analyzed in this article. Uncertainty in illness is a condition that accompanies the patient from the moment the first somatic symptoms of the disease appear and can be strengthened or weakened by many psychosocial factors. The level of uncertainty is related to the level of stress, emotional disadaptation, affective states, coping strategies, mechanisms of psychological defense, etc. Uncertainty can perform destructive functions, acting as a trigger for stressful states and setting off negative emotional experiences. A positive function of uncertainty is that it can allow a positive reinterpretation of the patient's disease. In addition, the state of uncertainty allows the patient to activate coping resources, among which social support plays the leading role.

  1. Investment, regulation, and uncertainty

    Science.gov (United States)

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

  2. Uncertainty and sensitivity analysis on probabilistic safety assessment of an experimental facility

    International Nuclear Information System (INIS)

    Burgazzi, L.

    2000-01-01

    The aim of this work is to perform an uncertainty and sensitivity analysis on the probabilistic safety assessment of the International Fusion Materials Irradiation Facility (IFMIF), in order to assess the effect on the final risk values of the uncertainties associated with the generic data used for the initiating events and component reliability and to identify the key quantities contributing to this uncertainty. The analysis is conducted on the expected frequency calculated for the accident sequences, defined through event tree (ET) modeling. This lends further credibility to the ET model quantification, allows frequency distributions to be calculated for the occurrence of events and, consequently, makes it possible to assess whether sequences have been correctly selected from a probabilistic standpoint and, finally, to verify that the safety conditions are fulfilled. Uncertainty and sensitivity analyses are performed using Monte Carlo sampling and an importance-parameter technique, respectively. (author)
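
    A minimal sketch of the Monte Carlo propagation step described above, using invented lognormal distributions for one initiating-event frequency and two branch failure probabilities (not the IFMIF data):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    def lognormal(median, error_factor, size):
        """Lognormal samples parameterized by median and error factor (EF = exp(1.645*sigma))."""
        return rng.lognormal(np.log(median), np.log(error_factor) / 1.645, size)

    ie_freq = lognormal(1e-2, 3, n)     # initiating-event frequency (1/yr), invented
    p_fail_a = lognormal(1e-3, 3, n)    # first mitigation system fails on demand
    p_fail_b = lognormal(5e-2, 3, n)    # second mitigation system fails on demand

    seq_freq = ie_freq * p_fail_a * p_fail_b      # accident-sequence frequency

    print("median   : %.2e /yr" % np.median(seq_freq))
    print("mean     : %.2e /yr" % seq_freq.mean())
    print("95th pct : %.2e /yr" % np.percentile(seq_freq, 95))
    ```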

  3. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  4. Uncertainty and global climate change research

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E. [Oak Ridge National Lab., TN (United States); Weiher, R. [National Oceanic and Atmospheric Administration, Boulder, CO (United States)

    1994-06-01

    The Workshop on Uncertainty and Global Climate Change Research was held March 22–23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in-depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics and decision making. The magnitude and complexity of uncertainty surrounding global climate change has made it quite difficult to answer even the simplest and most important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessments using decision analytic techniques as a foundation are key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified, prioritized, and their uncertainty characterized to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value of information methodologies are best suited for this task. In terms of timeliness and relevance of developing and applying decision analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.
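
    The value-of-information reasoning endorsed by the workshop can be illustrated with a toy expected-value-of-perfect-information (EVPI) calculation; the payoffs and probabilities below are invented.

    ```python
    import numpy as np

    # Toy decision: act now vs. delay, under two equally likely climate outcomes.
    # Rows are actions, columns are states (mild, severe); payoffs are invented.
    payoffs = np.array([
        [-5.0,  20.0],   # act now: costly if mild, valuable if severe
        [ 0.0, -30.0],   # delay:   free if mild, very costly if severe
    ])
    p = np.array([0.5, 0.5])

    ev_without = (payoffs @ p).max()            # pick one action before learning the state
    ev_with = (payoffs.max(axis=0) * p).sum()   # pick the best action in each state

    print("EVPI =", ev_with - ev_without)       # value of resolving the uncertainty
    ```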

  5. Modeling Uncertainty in Climate Change: A Multi-Model Comparison

    Energy Technology Data Exchange (ETDEWEB)

    Gillingham, Kenneth; Nordhaus, William; Anthoff, David; Blanford, Geoffrey J.; Bosetti, Valentina; Christensen, Peter; McJeon, Haewon C.; Reilly, J. M.; Sztorc, Paul

    2015-10-01

    The economics of climate change involves a vast array of uncertainties, complicating both the analysis and development of climate policy. This study presents the results of the first comprehensive study of uncertainty in climate change using multiple integrated assessment models. The study looks at model and parametric uncertainties for population, total factor productivity, and climate sensitivity and estimates the pdfs of key output variables, including CO2 concentrations, temperature, damages, and the social cost of carbon (SCC). One key finding is that parametric uncertainty is more important than uncertainty in model structure. Our resulting pdfs also provide insight on tail events.

  6. Uncertainty analysis in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lemos, Francisco Luiz de [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil); Sullivan, Terry [Brookhaven National Lab., Upton, NY (United States)

    1997-12-31

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology, geochemistry, etc. In addition, the waste disposal facilities are designed to last for a very long period of time. Both of these conditions leave safety assessment projections filled with uncertainty. This paper addresses the treatment of uncertainties in safety assessment modeling due to the variability of data and reviews some current approaches used to deal with this problem. (author) 13 refs.; e-mail: lemos at bnl.gov; sulliva1 at bnl.gov

  7. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    An estimate of the accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and deficiencies in the understanding of phenomena. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty
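
    One common way to build such a pretest estimate is to combine elemental bias and precision contributions in quadrature (root-sum-square); the sketch below uses invented component values and is not taken from the report.

    ```python
    import math

    # Hypothetical elemental uncertainties for one instrument channel (same units).
    bias_terms = [0.4, 0.3]          # systematic: calibration standard, installation
    precision_terms = [0.25, 0.15]   # random: readout scatter, ambient drift

    bias = math.sqrt(sum(b**2 for b in bias_terms))            # combined systematic
    precision = math.sqrt(sum(s**2 for s in precision_terms))  # combined random (1-sigma)
    u95 = math.sqrt(bias**2 + (2.0 * precision)**2)            # ~95% combined uncertainty

    print(f"bias = {bias:.2f}, precision = {precision:.2f}, U95 = {u95:.2f}")
    ```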

  8. Keys to Successful EPIQ Business Demonstrator Implementation

    NARCIS (Netherlands)

    Shoikova, Elena; Denishev, Vladislav

    2009-01-01

    Shoikova, E., & Denishev, V. (2009). Keys to Successful EPIQ Business Demonstrator Implementation. Paper presented at the 'Open workshop of TENCompetence - Rethinking Learning and Employment at a Time of Economic Uncertainty-event'. November, 19, 2009, Manchester, United Kingdom: TENCompetence.

  9. Inflation and Inflation Uncertainty in Turkey

    OpenAIRE

    dogru, bulent

    2014-01-01

    Abstract: In this study, the relationship between inflation and inflation uncertainty is analyzed using Granger causality tests with annual inflation series covering the time period 1923 to 2012 for the Turkish economy. Inflation uncertainty is measured by an Exponential Generalized Autoregressive Conditional Heteroskedasticity (EGARCH) model. Econometric findings suggest that although in the long run Friedman's hypothesis that high inflation increases inflation ...
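
    A minimal sketch of how such an uncertainty proxy is typically obtained, assuming the Python arch package and a synthetic series in place of the 1923-2012 Turkish data:

    ```python
    import numpy as np
    from arch import arch_model

    # Synthetic annual inflation series (percent); the study uses Turkish data
    # for 1923-2012, which are not reproduced here.
    rng = np.random.default_rng(4)
    inflation = 15 + 10 * rng.standard_normal(90)

    # AR(1) mean with an EGARCH(1,1) variance; the fitted conditional volatility
    # is the inflation-uncertainty series fed into the Granger causality tests.
    model = arch_model(inflation, mean="AR", lags=1, vol="EGARCH", p=1, o=1, q=1)
    res = model.fit(disp="off")
    print(res.conditional_volatility[-5:])
    ```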

  10. Davis-Besse uncertainty study

    International Nuclear Information System (INIS)

    Davis, C.B.

    1987-08-01

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results

  11. A state-space modeling approach to estimating canopy conductance and associated uncertainties from sap flux density data

    Science.gov (United States)

    David M. Bell; Eric J. Ward; A. Christopher Oishi; Ram Oren; Paul G. Flikkema; James S. Clark; David Whitehead

    2015-01-01

    Uncertainties in ecophysiological responses to environment, such as the impact of atmospheric and soil moisture conditions on plant water regulation, limit our ability to estimate key inputs for ecosystem models. Advanced statistical frameworks provide coherent methodologies for relating observed data, such as stem sap flux density, to unobserved processes, such as...

  12. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
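
    A minimal sketch of the validation comparison the guide calls for: simulation and experiment are judged to agree when their difference falls within the combined (root-sum-square) uncertainty. The function and numbers below are illustrative, not taken from the guide.

    ```python
    import math

    def agree(sim, u_sim, exp, u_exp, k=2.0):
        """True if |sim - exp| lies within the k-sigma combined uncertainty."""
        return abs(sim - exp) <= k * math.sqrt(u_sim**2 + u_exp**2)

    # Hypothetical peak clad temperature comparison (kelvin).
    print(agree(sim=1030.0, u_sim=25.0, exp=1010.0, u_exp=15.0))   # True
    print(agree(sim=1090.0, u_sim=25.0, exp=1010.0, u_exp=15.0))   # False
    ```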

  13. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  14. Potential effects of organizational uncertainty on safety

    International Nuclear Information System (INIS)

    Durbin, N.E.; Lekberg, A.; Melber, B.D.

    2001-12-01

    When organizations face significant change - reorganization, mergers, acquisitions, downsizing, plant closures or decommissioning - both the organizations and the workers in those organizations experience significant uncertainty about the future. This uncertainty affects the organization and the people working in the organization - adversely affecting morale, reducing concentration on safe operations, and resulting in the loss of key staff. Hence, organizations, particularly those using high risk technologies, which are facing significant change need to consider and plan for the effects of organizational uncertainty on safety - as well as planning for other consequences of change - technical, economic, emotional, and productivity related. This paper reviews some of what is known about the effects of uncertainty on organizations and individuals, discusses the potential consequences of uncertainty on organizational and individual behavior, and presents some of the implications for safety professionals

  15. Potential effects of organizational uncertainty on safety

    Energy Technology Data Exchange (ETDEWEB)

    Durbin, N.E. [MPD Consulting Group, Kirkland, WA (United States); Lekberg, A. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Melber, B.D. [Melber Consulting, Seattle WA (United States)

    2001-12-01

    When organizations face significant change - reorganization, mergers, acquisitions, downsizing, plant closures or decommissioning - both the organizations and the workers in those organizations experience significant uncertainty about the future. This uncertainty affects the organization and the people working in the organization - adversely affecting morale, reducing concentration on safe operations, and resulting in the loss of key staff. Hence, organizations, particularly those using high risk technologies, which are facing significant change need to consider and plan for the effects of organizational uncertainty on safety - as well as planning for other consequences of change - technical, economic, emotional, and productivity related. This paper reviews some of what is known about the effects of uncertainty on organizations and individuals, discusses the potential consequences of uncertainty on organizational and individual behavior, and presents some of the implications for safety professionals.

  16. Background and Qualification of Uncertainty Methods

    International Nuclear Information System (INIS)

    D'Auria, F.; Petruzzi, A.

    2008-01-01

    The evaluation of uncertainty constitutes the necessary supplement to Best Estimate calculations performed to understand accident scenarios in water-cooled nuclear reactors. The need arises from the imperfection of computational tools on the one side and from the interest in using such tools to get a more precise evaluation of safety margins on the other. The paper reviews the salient features of two independent approaches for estimating uncertainties associated with predictions of complex system codes: the propagation of code input errors and the propagation of calculation output errors are the key concepts for identifying the methods of current interest for industrial applications. With the developed methods, uncertainty bands (both upper and lower) can be derived for any desired quantity of the transient of interest. In the second case, the uncertainty method is coupled with the thermal-hydraulic code to obtain a code with the capability of internal assessment of uncertainty, whose features are discussed in more detail.

  17. Towards a different attitude to uncertainty

    Directory of Open Access Journals (Sweden)

    Guy Pe'er

    2014-10-01

    Full Text Available The ecological literature deals with uncertainty primarily from the perspective of how to reduce it to acceptable levels. However, the current rapid and ubiquitous environmental changes, as well as anticipated rates of change, pose novel conditions and complex dynamics due to which many sources of uncertainty are difficult or even impossible to reduce. These include both uncertainty in knowledge (epistemic uncertainty) and societal responses to it. Under these conditions, an increasing number of studies ask how one can deal with uncertainty as it is. Here, we explore the question how to adopt an overall alternative attitude to uncertainty, which accepts or even embraces it. First, we show that seeking to reduce uncertainty may be counterproductive under some circumstances. It may yield overconfidence, ignoring early warning signs, policy- and societal stagnation, or irresponsible behaviour if personal certainty is offered by externalization of environmental costs. We then demonstrate that uncertainty can have positive impacts by driving improvements in knowledge, promoting cautious action, contributing to keeping societies flexible and adaptable, enhancing awareness, support and involvement of the public in nature conservation, and enhancing cooperation and communication. We discuss the risks of employing a certainty paradigm on uncertain knowledge, the potential benefits of adopting an alternative attitude to uncertainty, and the need to implement such an attitude across scales – from adaptive management at the local scale, to the evolving Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) at the global level.

  18. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time

  19. Revaluation of the concept of the human condition and the common heritage of mankind: Keys to the social benefits of space technology

    Science.gov (United States)

    Cocca, Aldo Armando

    Men may do many things, but they must never forget the human condition in any act or relation with a fellow human being. Space Law has vindicated the supreme value of man as a legal subject par excellence. The dignity of the human being is a value that rates above any scientific or technological advance. A benefit, by definition and derivation, is anything contributing to an improvement in a condition. Social benefits pertain only to human beings, who are their sole beneficiaries. Developing countries are young nations that through their international relations may, and indeed must, realize the benefits of space technology. The principle of the "common heritage of Mankind" was created to satisfy the aspirations of all peoples and to meet the needs of both industrialized and developing countries. Only a groundless fear and lack of vision of the future can induce governments to delay its implementation. We must not forget that the concept was transformed into a principle of international positive law by the unanimous decision of the international community, which enshrined it in the Moon Agreement. The social and individual responsibility of the scientist is becoming even more clearly defined, and scientists play an important role in the conduct of nations. Through education, including education in the humanities and a graduation pledge, the scientist has embarked on the road leading to an active presence in society, facing his responsibility. Inter-generational equity contributes to strengthening the concept of the human condition and the legal principle of the common heritage of mankind.

  20. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an always-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.

  1. Uncertainty of dustfall monitoring results

    Directory of Open Access Journals (Sweden)

    Martin A. van Nierop

    2017-06-01

    Full Text Available Fugitive dust has the ability to cause a nuisance and pollute the ambient environment, particularly from human activities including construction and industrial sites and mining operations. As such, dustfall monitoring has occurred for many decades in South Africa; little has been published on the repeatability, uncertainty, accuracy and precision of dustfall monitoring. Repeatability assesses the consistency associated with the results of a particular measurement under the same conditions; the consistency of the laboratory is assessed to determine the uncertainty associated with dustfall monitoring conducted by the laboratory. The aim of this study was to improve the understanding of the uncertainty in dustfall monitoring, thereby improving the confidence in dustfall monitoring. Uncertainty of dustfall monitoring was assessed through a 12-month study of 12 sites that were located on the boundary of the study area. Each site contained a directional dustfall sampler, which was modified by removing the rotating lid, with four buckets (A, B, C and D) installed. Having four buckets on one stand allows each bucket to be exposed to the same conditions for the same period of time; the buckets should therefore have equal amounts of dust deposited in them. The difference in the weight (mg) of the dust recorded from each bucket at each respective site was determined using the American Society for Testing and Materials method D1739 (ASTM D1739). The variability of the dust would provide the confidence level of dustfall monitoring when reporting to clients.
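
    A small sketch of the repeatability calculation implied by the four-bucket design: the spread between buckets A-D at each site (here as a coefficient of variation) is a simple indicator of the laboratory's dustfall uncertainty. The deposition weights are invented.

    ```python
    import numpy as np

    # Hypothetical monthly dust deposition (mg) for buckets A-D at three sites.
    weights = np.array([
        [112.0, 118.0, 109.0, 121.0],   # site 1
        [ 65.0,  71.0,  58.0,  69.0],   # site 2
        [230.0, 214.0, 241.0, 226.0],   # site 3
    ])

    mean = weights.mean(axis=1)
    std = weights.std(axis=1, ddof=1)
    cv = 100 * std / mean                # between-bucket spread (%)

    for site, (m, c) in enumerate(zip(mean, cv), start=1):
        print(f"site {site}: mean = {m:.0f} mg, between-bucket CV = {c:.1f}%")
    ```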

  2. Stability in the country as a key condition for the progressive improvement of the quality of life in the new independent states

    Directory of Open Access Journals (Sweden)

    D G Rotman

    2014-12-01

    Full Text Available Based on sociological data, the article determines the causes of the emergence of social tension in the country and society, estimates the dependence of such situations on the population’s satisfaction with living conditions, and argues the importance of maintaining stability for the sustainable development of the state. The author aims to build a comprehensive rating of ‘social concern’ of the post-Soviet states’ citizens as a sum of the following empirical indicators: economic problems of individuals and their families; chances to ensure decent living conditions (health care, access to education, security, etc.); environmental problems; political problems. The article also reconstructs an ideal-typical model of social transformation of post-communist societies consisting of three main stages: searching, stabilization, sustainable development. Based on the analysis of empirical data, the author states that some transitional societies at the turn of the third millennium (for instance, Russia, Belarus, and Kazakhstan) have reached the stabilization phase. The author proposes to assess the level of social stability with the indicator of social tension obtained through the ‘many-stage data grouping’. The empirical data presented in the article are the results of the projects of the Center for Sociological and Political Studies of the Belarusian State University.

  3. Transcriptional profiling of human breast cancer cells cultured under microgravity conditions revealed the key role of genetic gravity sensors previously detected in Drosophila melanogaster

    Science.gov (United States)

    Valdivia-Silva, Julio E.; Lavan, David; Diego Orihuela-Tacuri, M.; Sanabria, Gabriela

    2016-07-01

    Currently, studies in Drosophila melanogaster have shown emerging evidence that microgravity stimuli can be detected at the genetic level. Analysis of the transcriptome in the pupal stage of the fruit flies under microgravity conditions versus ground controls has suggested the presence of a few candidate genes as "gravity sensors", which have been experimentally validated. Additionally, several studies have shown that microgravity causes inhibitory effects in different types of cancer cells, although the genes involved and responsible for these effects are still unknown. Here, we demonstrate that the genes suggested as the sensors of gravitational waves in Drosophila melanogaster and their human counterparts (orthologous genes) are highly involved in carcinogenesis, proliferation, anti-apoptotic signals, invasiveness, and metastatic potential of breast cancer cell tumors. The transcriptome analyses suggested that the observed inhibitory effect in cancer cells could be due to changes in the genetic expression of these candidates. These results encourage the possibility of new therapeutic targets managed together and not in isolation.

  4. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  5. Key issues

    International Nuclear Information System (INIS)

    Cook, N.G.W.

    1980-01-01

    Successful modeling of the thermo-mechanical and hydrochemical behavior of radioactive waste repositories in hard rock is possible in principle. Because such predictions lie outside the realm of experience, their adequacy depends entirely upon a thorough understanding of three fundamental questions: an understanding of the chemical and physical processes that determine the behavior of rock and all its complexities; accurate and realistic numerical models of the geologic media within which a repository may be built; and sufficient in-situ data covering the entire geologic region affected by, or affecting the behavior of, a repository. At present sufficient is known to be able to identify most of those areas which require further attention. These areas extend all the way from a complete understanding of the chemical and physical processes determining the behavior of rock through to the exploration mapping and testing that must be done during the development of any potential repository. Many of the techniques, laboratory equipment, field instrumentation, and numerical methods needed to accomplish this do not exist at present. Therefore it is necessary to accept that a major investment in scientific research is required to generate this information over the next few years. The spectrum of scientific and engineering activities is wide, extending from laboratory measurements through the development of numerical models to the measurement of data in-situ, but there is every prospect that sufficient can be done to resolve these key issues. However, to do so requires overt recognition of the many gaps which exist in our knowledge and abilities today, and of the need to bridge these gaps and of the significant costs involved in doing so

  6. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  7. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

  8. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  9. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty

  10. The Importance of Caveolin-1 as Key-Regulator of Three-Dimensional Growth in Thyroid Cancer Cells Cultured under Real and Simulated Microgravity Conditions

    Directory of Open Access Journals (Sweden)

    Stefan Riwaldt

    2015-11-01

    Full Text Available We recently demonstrated that the CAV1 gene was down-regulated, when poorly differentiated thyroid FTC-133 cancer cells formed spheroids under simulated microgravity conditions. Here, we present evidence that the caveolin-1 protein is involved in the inhibition of spheroid formation, when confluent monolayers are exposed to microgravity. The evidence is based on proteins detected in cells and their supernatants of the recent spaceflight experiment: “NanoRacks-CellBox-Thyroid Cancer”. The culture supernatant had been collected in a special container adjacent to the flight hardware incubation chamber and stored at low temperature until it was analyzed by Multi-Analyte Profiling (MAP) technology, while the cells remaining in the incubation chamber were fixed by RNAlater and examined by mass spectrometry. The soluble proteins identified by MAP were investigated in regard to their mutual interactions and their influence on proteins, which were associated with the cells secreting the soluble proteins and had been identified in a preceding study. A Pathway Studio v.11 analysis of the soluble and cell-associated proteins together with protein kinase C alpha (PRKCA) suggests that caveolin-1 is involved, when plasminogen enriched in the extracellular space is not activated and the vascular cellular adhesion molecule (VCAM-1)-mediated cell–cell adhesion is simultaneously strengthened and activated PRKCA is recruited in caveolae, while the thyroid cancer cells do not form spheroids.

  11. Factoring uncertainty into restoration modeling of in-situ leach uranium mines

    Science.gov (United States)

    Johnson, Raymond H.; Friedel, Michael J.

    2009-01-01

    Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the groundwater. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration-constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decision makers can use these results to better evaluate environmental risk, treating future metal concentrations as a limited range of possibilities based on a scientific evaluation of uncertainty.
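
    A schematic of the forward Monte Carlo presentation mentioned at the end of the record: sample uncertain inputs, run the model, and report percentile limits on the predicted concentration. The one-line "model" and all distributions below are invented stand-ins for a real reactive-transport simulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 20_000

    # Invented input distributions standing in for reactive-transport parameters.
    kd = rng.lognormal(np.log(2.0), 0.5, n)            # sorption coefficient (L/kg)
    c0 = rng.normal(1.5, 0.3, n).clip(min=0.1)         # post-mining U concentration (mg/L)

    # Trivial stand-in for the reactive-transport model: retarded first-order decline.
    years = 20.0
    c_final = c0 * np.exp(-0.05 * years / (1.0 + kd))

    lo, med, hi = np.percentile(c_final, [5, 50, 95])
    print(f"predicted U after {years:.0f} yr: {med:.2f} mg/L (90% limits {lo:.2f}-{hi:.2f})")
    ```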

  12. Geochemistry of rare earth elements in the Baba Ali magnetite skarn deposit, western Iran – a key to determine conditions of mineralisation

    Directory of Open Access Journals (Sweden)

    Zamanian Hassan

    2016-03-01

    Full Text Available The Baba Ali skarn deposit, situated 39 km to the northwest of Hamadan (Iran), is the result of a syenitic pluton that intruded and metamorphosed the diorite host rock. Rare earth element (REE) values in the quartz syenite and diorite range between 35.4 and 560 ppm. Although the distribution pattern of REEs is more or less flat and smooth, light REEs (LREEs) in general show higher concentrations than heavy REEs (HREEs) in different lithounits. The skarn zone reveals the highest REE-enriched pattern, while the ore zone shows the maximum depletion pattern. A comparison of the concentration variations of LREEs (La–Nd), middle REEs (MREEs; Sm–Ho) and HREEs (Er–Lu) of the ore zone samples to the other zones elucidates two important points for the distribution of REEs: (1) the distribution patterns of LREEs and MREEs show a distinct depletion in the ore zone while representing a great enrichment in the skarn facies neighbouring the ore body border and decreasing towards the altered diorite host rock; (2) HREEs show the same pattern, but in the exoskarn do not reveal any distinct increase as observed for LREEs and MREEs. The ratio of La/Y in the Baba Ali skarn ranges from 0.37 to 2.89. The ore zone has the highest La/Y ratio. In this regard the skarn zones exhibit two distinctive portions: (1) one that has La/Y > 1, being adjacent to the ore body, and (2) another one with La/Y < 1, neighbouring the altered diorite. Accordingly, the Baba Ali profile, from the quartz syenite to the middle part of the exoskarn, demonstrates chiefly alkaline conditions of formation, with a gradual change to acidic towards the altered diorite host rocks. Utilising three parameters, Ce/Ce*, Eu/Eu* and (Pr/Yb)n, in different minerals implies that the hydrothermal fluids responsible for epidote and garnet were mostly of magmatic origin, and for magnetite, actinolite and phlogopite these were of magmatic origin with low REE concentration or meteoric water involved.

  13. Magnetic studies of archaeological obsidian: Variability of eruptive conditions within obsidian flows is key to high-resolution artifact sourcing (Invited)

    Science.gov (United States)

    Feinberg, J. M.; Frahm, E.; Muth, M.

    2013-12-01

    Previous studies have endeavored to use petrophysical traits of obsidian, particularly its magnetic properties, as an alternative to conventional geochemical sourcing, one of the greatest successes in archaeological science. Magnetic approaches, however, have not seen widespread application due to their mixed success. In a time when geochemical analyses can be conducted non-destructively, in the field, and in a minute or two, magnetic measurements of obsidian must offer novel archaeological insights to be worthwhile, not merely act as a less successful version of geochemistry. To this end, we report the findings of a large-scale study of obsidian magnetism, which includes 912 geological obsidian specimens and 97 artifacts measured for six simple magnetic parameters. Based on these results, we propose that, rather than using magnetic properties to source artifacts to a particular obsidian flow (inter-flow sourcing), these properties are best used to differentiate quarrying sites within an individual flow (intra-flow sourcing). The magnetic properties within an individual flow are highly variable, due to the fact that a single flow experiences a wide array of cooling rates, absolute temperatures, viscosities, deformation, and oxidation. These conditions affect the concentrations, compositions, size distributions, shapes, and spatial arrangements of magnetic grains within an obsidian specimen and, thus, its intrinsic magnetic properties. This variability decreases dramatically at spatial scales of individual outcrops, and decreases even further at scales of hand samples. Thus, magnetic data appear to shift the scale of obsidian sourcing from flows to quarries and, in turn, enable new insights into raw-material procurement strategies, group mobility, lithic technology, and the organization of space and production. From a geologic perspective, the magnetic variability of obsidian can be broadly interpreted within the context of the igneous processes that were active during

  14. Large break LOCA uncertainty evaluation and comparison with conservative calculation

    International Nuclear Information System (INIS)

    Glaeser, H.G.

    2004-01-01

    The first formulation of the USA Code of Federal Regulations (CFR) 10CFR50 with applicable sections specific to NPP licensing requirements was released in 1976. Over a decade later, 10CFR 50.46 allowed the use of best-estimate (BE) codes instead of conservative code models, but required that uncertainties be identified and quantified. Guidelines were released that described interpretations developed over the intervening years that are applicable. Other countries established similar conservative procedures and acceptance criteria. Because conservative methods were used to calculate the peak values of key parameters, such as peak clad temperature (PCT), it was always acknowledged that a large margin existed between the 'conservative' calculated value and the 'true' value. Besides the USA, regulations in other countries, Germany for example, allowed the state of science and technology to be applied in licensing, i.e. the growing body of experimental evidence and progress in code development over time could be used. There was no requirement to apply a pure evaluation methodology with licensed assumptions and frozen codes. The thermal-hydraulic system codes increasingly became best-estimate codes based on comprehensive validation. This development was and is possible because the rules and guidelines provide the necessary latitude to consider further development of safety technology. Best estimate codes are allowed to be used in licensing in combination with conservative initial and boundary conditions. However, uncertainty quantification is not required. Since some of the initial and boundary conditions are more conservative compared with those internationally used (e.g. 106% reactor power instead of 102%, a single failure plus a non-availability due to preventive maintenance is assumed, etc.) it is claimed that the uncertainties of code models are covered. Since many utilities apply for power increases, calculation results come closer to some licensing criteria. The situation in German licensing

  15. Uncertainty in social dilemmas

    NARCIS (Netherlands)

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size

  16. Uncertainty and Climate Change

    OpenAIRE

    Berliner, L. Mark

    2003-01-01

    Anthropogenic, or human-induced, climate change is a critical issue in science and in the affairs of humankind. Though the target of substantial research, the conclusions of climate change studies remain subject to numerous uncertainties. This article presents a very brief review of the basic arguments regarding anthropogenic climate change with particular emphasis on uncertainty.

  17. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  18. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss the new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances, and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers

  19. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  20. Sketching Uncertainty into Simulations.

    Science.gov (United States)

    Ribicic, H; Waser, J; Gurbat, R; Sadransky, B; Groller, M E

    2012-12-01

    In a variety of application areas, the use of simulation steering in decision making is limited at best. Research focusing on this problem suggests that most user interfaces are too complex for the end user. Our goal is to let users create and investigate multiple, alternative scenarios without the need for special simulation expertise. To simplify the specification of parameters, we move from a traditional manipulation of numbers to a sketch-based input approach. Users steer both numeric parameters and parameters with a spatial correspondence by sketching a change onto the rendering. Special visualizations provide immediate visual feedback on how the sketches are transformed into boundary conditions of the simulation models. Since uncertainty with respect to many intertwined parameters plays an important role in planning, we also allow the user to intuitively set up complete value ranges, which are then automatically transformed into ensemble simulations. The interface and the underlying system were developed in collaboration with experts in the field of flood management. The real-world data they have provided has allowed us to construct scenarios used to evaluate the system. These were presented to a variety of flood response personnel, and their feedback is discussed in detail in the paper. The interface was found to be intuitive and relevant, although a certain amount of training might be necessary.

  1. Treatment of uncertainties in atmospheric chemical systems: A combined modeling and experimental approach

    Science.gov (United States)

    Pun, Betty Kong-Ling

    1998-12-01

    Uncertainty is endemic in modeling. This thesis is a two-phase program to understand the uncertainties in urban air pollution model predictions and in field data used to validate them. Part I demonstrates how to improve atmospheric models by analyzing the uncertainties in these models and using the results to guide new experimentation endeavors. Part II presents an experiment designed to characterize atmospheric fluctuations, which have significant implications towards the model validation process. A systematic study was undertaken to investigate the effects of uncertainties in the SAPRC mechanism for gas-phase chemistry in polluted atmospheres. The uncertainties of more than 500 parameters were compiled, including reaction rate constants, product coefficients, organic composition, and initial conditions. Uncertainty propagation using the Deterministic Equivalent Modeling Method (DEMM) revealed that the uncertainties in ozone predictions can be up to 45% based on these parametric uncertainties. The key parameters found to dominate the uncertainties of the predictions include photolysis rates of NO2, O3, and formaldehyde; the rate constant for nitric acid formation; and initial amounts of NOx and VOC. Similar uncertainty analysis procedures applied to two other mechanisms used in regional air quality models led to the conclusion that in the presence of parametric uncertainties, the mechanisms cannot be discriminated. Research efforts should focus on reducing parametric uncertainties in photolysis rates, reaction rate constants, and source terms. A new tunable diode laser (TDL) infrared spectrometer was designed and constructed to measure multiple pollutants simultaneously in the same ambient air parcels. The sensitivities of the one hertz measurements were 2 ppb for ozone, 1 ppb for NO, and 0.5 ppb for NO2. Meteorological data were also collected for wind, temperature, and UV intensity. The field data showed clear correlations between ozone, NO, and NO2 in the one
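
    The propagation step described above can be illustrated with a much simpler stand-in. The sketch below is not DEMM and not the SAPRC mechanism; it is a plain Monte Carlo propagation of hypothetical rate-constant and initial-condition uncertainties through a toy ozone proxy, just to show how parametric uncertainty becomes an output uncertainty.

```python
# Illustrative sketch only: plain Monte Carlo propagation of rate-constant and
# initial-condition uncertainty through a toy one-reaction "mechanism".
# All numbers and names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def ozone_proxy(k_phot, k_loss, voc0, nox0, t=3600.0):
    """Toy surrogate for an ozone prediction (not a real chemistry model)."""
    production = k_phot * voc0 * nox0 * t
    return production * np.exp(-k_loss * t)

n = 10_000
# Lognormal uncertainties on rate constants, normal uncertainty on initial amounts
k_phot = rng.lognormal(mean=np.log(1e-4), sigma=0.3, size=n)
k_loss = rng.lognormal(mean=np.log(2e-4), sigma=0.3, size=n)
voc0   = rng.normal(50.0, 10.0, size=n)   # ppb, hypothetical
nox0   = rng.normal(20.0,  5.0, size=n)   # ppb, hypothetical

o3 = ozone_proxy(k_phot, k_loss, voc0, nox0)
mean, std = o3.mean(), o3.std()
print(f"ozone proxy: {mean:.1f} +/- {std:.1f} ({100 * std / mean:.0f}% relative uncertainty)")
```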

  2. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  3. Compliance uncertainty of diameter characteristic in the next-generation geometrical product specifications and verification

    International Nuclear Information System (INIS)

    Lu, W L; Jiang, X; Liu, X J; Xu, Z G

    2008-01-01

    Compliance uncertainty is one of the most important elements in the next-generation geometrical product specifications and verification (GPS). It consists of specification uncertainty, method uncertainty and implementation uncertainty, which are three of the four fundamental uncertainties in the next-generation GPS. This paper analyzes the key factors that influence compliance uncertainty and then proposes a procedure to manage the compliance uncertainty. A general model on evaluation of compliance uncertainty has been devised and a specific formula for diameter characteristic has been derived based on this general model. The case study was conducted and it revealed that the completeness of currently dominant diameter characteristic specification needs to be improved

  4. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
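
    The covariant propagation described above can be sketched with the python uncertainties package named in the abstract. The modified-tanh parameters, their covariance matrix, and the profile shape below are invented for illustration; they are not ONETWO or TGLF inputs.

```python
# Minimal sketch of covariant uncertainty propagation with the python
# "uncertainties" package, in the spirit of the workflow described above.
# The fit parameters, their covariance and the profile shape are hypothetical.
import numpy as np
from uncertainties import correlated_values, umath

# Hypothetical fit result: pedestal (height, width) with its fit covariance.
height, width = correlated_values([1.0, 0.05], [[0.010, -0.0005],
                                                [-0.0005, 0.0001]])

for r in np.linspace(0.90, 1.00, 5):              # normalized radius
    T = height * 0.5 * (1 - umath.tanh((r - 0.97) / width))
    print(f"rho={r:.2f}  T = {T:.3f}")             # nominal +/- propagated std-dev
```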

  5. On Uncertainty and the WTA-WTP Gap

    OpenAIRE

    Douglas D. Davis; Robert J. Reilly

    2012-01-01

    We correct an analysis by Isik (2004) regarding the effects of uncertainty on the WTA-WTP gap. Isik presents as his primary result a proposition that the introduction of uncertainty regarding environmental quality improvements causes WTA to increase and WTP to decrease by identical amounts relative to a certainty condition where WTA=WTP. These conclusions are incorrect. In fact, WTP may equal WTA even with uncertainty, and increases in the uncertainty of environmental quality improvements cau...

  6. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change on hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution-based scaling (DBS) methods were developed and benchmarked against a more simplistic and commonly used delta ... applied at the grid scale. Flux and state hydrological outputs, which integrate responses over time and space, showed more sensitivity to mean spatial biases in precipitation and less so to extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current ...

  7. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    ... and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  8. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    Science.gov (United States)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data

  9. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user oriented, automated uncertainty analysis capability has been built into the Fuel Rod Analysis Program (FRAP) code and has been applied to a pressurized water reactor (PWR) fuel rod undergoing a loss-of-coolant accident (LOCA). The method of uncertainty analysis is the response surface method. The automated version significantly reduced the time required to complete the analysis and, at the same time, greatly increased the problem scope. Results of the analysis showed a significant difference in the total and relative contributions to the uncertainty of the response parameters between steady state and transient conditions
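
    A minimal sketch of the response surface idea named above, under the assumption that the expensive code can be replaced by a cheap polynomial surrogate fitted to a small design of runs. The expensive_code function and all numbers are hypothetical stand-ins, not FRAP.

```python
# Hedged sketch of the response-surface method: run the expensive code at a
# small design of points, fit a quadratic surrogate, then Monte Carlo sample
# the surrogate instead of the code. The "code" here is a made-up function.
import numpy as np

rng = np.random.default_rng(1)

def expensive_code(gap_conductance, rod_power):
    """Stand-in for a code run returning a peak clad temperature (K)."""
    return 1000.0 + 40.0 * rod_power - 15.0 * gap_conductance + 2.0 * rod_power**2

# Small factorial design in normalized units (-1, 0, +1)
levels = [-1.0, 0.0, 1.0]
X = np.array([(g, p) for g in levels for p in levels])
y = np.array([expensive_code(g, p) for g, p in X])

# Quadratic response surface: 1, g, p, g^2, p^2, g*p
def features(g, p):
    return np.column_stack([np.ones_like(g), g, p, g**2, p**2, g * p])

coef, *_ = np.linalg.lstsq(features(X[:, 0], X[:, 1]), y, rcond=None)

# Cheap Monte Carlo on the surrogate
g_mc = rng.normal(0.0, 0.5, 100_000)
p_mc = rng.normal(0.0, 0.5, 100_000)
pct = features(g_mc, p_mc) @ coef
print(f"surrogate PCT: {pct.mean():.0f} K +/- {pct.std():.0f} K")
```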

  10. Uncertainties in projecting climate-change impacts in marine ecosystems

    DEFF Research Database (Denmark)

    Payne, Mark; Barange, Manuel; Cheung, William W. L.

    2016-01-01

    Projections of the impacts of climate change on marine ecosystems are a key prerequisite for the planning of adaptation strategies, yet they are inevitably associated with uncertainty. Identifying, quantifying, and communicating this uncertainty is key to both evaluating the risk associated with a projection and building confidence in its robustness. We review how uncertainties in such projections are handled in marine science. We employ an approach developed in climate modelling by breaking uncertainty down into (i) structural (model) uncertainty, (ii) initialization and internal variability ... and highlight the opportunities and challenges associated with doing a better job. We find that even within a relatively small field such as marine science, there are substantial differences between subdisciplines in the degree of attention given to each type of uncertainty. We find that initialization...

  11. Responding to the Challenge of True Uncertainty

    DEFF Research Database (Denmark)

    Hallin, Carina Antonia; Andersen, Torben Juul

    We construe a conceptual framework for responding effectively to true uncertainty in the business environment. We drill down to the essential micro-foundational capabilities - sensing and seizing of dynamic capabilities - and link them to classical strategic issue management theory with suggestions on how to operationalize these essential capabilities. By definition true uncertainty represents environmental conditions that are hard to foresee, which can catch the unprepared by surprise while presenting opportunities to the conscious organization. We demonstrate that organizations relying...

  12. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    Limperopoulos, G.J.

    1995-01-01

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk, but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two other explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs
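
    The two techniques named in the abstract reduce to short formulas. The sketch below evaluates the CAPM expected return and a Black-Scholes call value with made-up inputs; reading the call value as the value of the option to develop an oil project is the real-options interpretation referred to above.

```python
# Worked sketch of CAPM and Black-Scholes with hypothetical numbers.
from math import log, sqrt, exp
from statistics import NormalDist

def capm_expected_return(risk_free, beta, market_return):
    """CAPM: E[r] = r_f + beta * (E[r_m] - r_f)."""
    return risk_free + beta * (market_return - risk_free)

def black_scholes_call(S, K, r, sigma, T):
    """Black-Scholes value of a European call (project value S, development cost K)."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

print(f"CAPM expected return: {capm_expected_return(0.04, 1.2, 0.10):.1%}")
print(f"Option value of delaying development: {black_scholes_call(100, 90, 0.04, 0.35, 2.0):.1f}")
```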

  13. Life expectancy, adapted technology and cold climate conditions : key issues for wind turbines in Canada; Duree de vie, adaptation technologique et conditions froides : un enjeu majeur pour les eoliennes au Canada

    Energy Technology Data Exchange (ETDEWEB)

    Chaumel, J.L.; Nanta, R. [Quebec Univ., Rimouski, PQ (Canada); Golbeck, P. [Peter Golbeck Consultant, Rimouski, PQ (Canada)

    2007-07-01

    This presentation discussed the service life of wind turbines, particularly those operating in cold climates. A map of Quebec was included to indicate the potential sites for an additional 450 MW of wind energy capacity for northern Quebec, near James Bay. Different types of wind turbines were described in terms of their size and power, including those without transformers. It was noted that a 30 per cent growth in the wind power industry is anticipated annually. However, there is currently a lack of wind turbines. A 2 MW wind turbine costs $3 million and major reinvestment is needed after 10 years of service life due to component wear. It was noted that a gear box lasts less than 15 years and other generator components also require maintenance. The primary reasons for increased risk and cost include equipment failures due to component fatigue, cold weather operation, lack of maintenance and bad design for winter conditions. The components affected by failures include gearboxes, generators, pitch controls, and hydraulics. Since the industry is relatively new, there are no replacement parts available for these components and cranage costs are high. In addition, since Canada's entry into the wind industry is also relatively new, there is a lack of machine testing in Canada as well as a lack of understanding of energy capacity and the effects of cold weather. Overproduction also occurs frequently. tabs., figs.

  14. Uncertainty Quantification Techniques for Sensor Calibration Monitoring in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Ramuhalli, Pradeep; Lin, Guang; Crawford, Susan L.; Konomi, Bledar A.; Coble, Jamie B.; Shumaker, Brent; Hashemian, Hash

    2014-04-30

    This report describes research towards the development of advanced algorithms for online calibration monitoring. The objective of this research is to develop the next generation of online monitoring technologies for sensor calibration interval extension and signal validation in operating and new reactors. These advances are expected to improve the safety and reliability of current and planned nuclear power systems as a result of higher accuracies and increased reliability of sensors used to monitor key parameters. The focus of this report is on documenting the outcomes of the first phase of R&D under this project, which addressed approaches to uncertainty quantification (UQ) in online monitoring that are data-driven, and can therefore adjust estimates of uncertainty as measurement conditions change. Such data-driven approaches to UQ are necessary to address changing plant conditions, for example, as nuclear power plants experience transients, or as next-generation small modular reactors (SMR) operate in load-following conditions.

  15. Uncertainty Quantification Techniques for Sensor Calibration Monitoring in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Ramuhalli, Pradeep; Lin, Guang; Crawford, Susan L.; Konomi, Bledar A.; Braatz, Brett G.; Coble, Jamie B.; Shumaker, Brent; Hashemian, Hash

    2013-09-01

    This report describes the status of ongoing research towards the development of advanced algorithms for online calibration monitoring. The objective of this research is to develop the next generation of online monitoring technologies for sensor calibration interval extension and signal validation in operating and new reactors. These advances are expected to improve the safety and reliability of current and planned nuclear power systems as a result of higher accuracies and increased reliability of sensors used to monitor key parameters. The focus of this report is on documenting the outcomes of the first phase of R&D under this project, which addressed approaches to uncertainty quantification (UQ) in online monitoring that are data-driven, and can therefore adjust estimates of uncertainty as measurement conditions change. Such data-driven approaches to UQ are necessary to address changing plant conditions, for example, as nuclear power plants experience transients, or as next-generation small modular reactors (SMR) operate in load-following conditions.

  16. Uncertainties and climatic change

    International Nuclear Information System (INIS)

    De Gier, A.M.; Opschoor, J.B.; Van de Donk, W.B.H.J.; Hooimeijer, P.; Jepma, J.; Lelieveld, J.; Oerlemans, J.; Petersen, A.

    2008-01-01

    Which processes in the climate system are misunderstood? How are scientists dealing with uncertainty about climate change? What will be done with the conclusions of the recently published synthesis report of the IPCC? These and other questions were answered during the meeting 'Uncertainties and climate change' that was held on Monday 26 November 2007 at the KNAW in Amsterdam. This report is a compilation of all the presentations and provides some conclusions resulting from the discussions during this meeting.

  17. Mechanics and uncertainty

    CERN Document Server

    Lemaire, Maurice

    2014-01-01

    Science is a quest for certainty, but lack of certainty is the driving force behind all of its endeavors. This book, specifically, examines the uncertainty of technological and industrial science. Uncertainty and Mechanics studies the concepts of mechanical design in an uncertain setting and explains engineering techniques for inventing cost-effective products. Though it references practical applications, this is a book about ideas and potential advances in mechanical science.

  18. Uncertainty: lotteries and risk

    OpenAIRE

    Ávalos, Eloy

    2011-01-01

    In this paper we develop the theory of uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of a lottery to formulate the axioms of the individual's preferences, and their representation through the von Neumann-Morgenstern utility function. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and, finally, measures of risk aversion with monetary lotteries.

  19. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy integrated flux at the beginning of the cryostat in the no-gap-geometry, compared to an uncertainty of only 5% in the gap-geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  20. Inclusion of time uncertainty in calibration of ionizing radiations

    International Nuclear Information System (INIS)

    Jordao, B.O.; Quaresma, D.S.; Carvalho, R.J.; Peixoto, J.G.P.

    2014-01-01

    In terms of metrology, two key factors for the reliability of the calibration process are what we call traceability and uncertainty. Traceability provides confidence in measurements, while uncertainty provides assurance of the quality of what is being measured. Based on the above, this article suggests the inclusion of the time uncertainty in the calibration of radiological instruments, thus increasing the reliability and traceability of the system. (author)

  1. Justification for recommended uncertainties

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

    The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors that were assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: Underestimation of the correlations existing between the results of different measurements; The presence of unrecognized systematic uncertainties in the experimental data can lead to biases in the evaluated data as well as to underestimations of the resulting uncertainties; Uncertainties for correlated data cannot be characterized only by percentage uncertainties or variances. Covariances between the evaluated value at 0.2 MeV and other points obtained in model (RAC R matrix and PADE2 analytical expansion) and non-model (GMA) fits of the 6 Li(n,t) TEST1 data and the correlation coefficients are presented, and covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix) as obtained in EDA and RAC R matrix fits of the data available for reactions that pass through the formation of the 7 Li system are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: Percentage uncertainties of the evaluated cross section for the 6 Li(n,t) reaction and for the 235 U(n,f) reaction; estimation given by CSEWG experts; GMA result with full GMA database, including experimental data for the 6 Li(n,t), 6 Li(n,n) and 6 Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; EDA and RAC R matrix results, respectively. Uncertainties of absolute and 252 Cf fission spectrum averaged cross section measurements, and deviations between measured and evaluated values for 235 U(n,f) cross-sections in the neutron energy range 1
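
    The bookkeeping behind these diagrams can be shown in a few lines: converting a covariance matrix of evaluated values into percentage uncertainties and correlation coefficients. The cross sections and the 3x3 covariance matrix below are invented, not GMA or R-matrix output.

```python
# Covariance matrix -> percentage uncertainties and correlation coefficients.
# All numbers are hypothetical.
import numpy as np

sigma = np.array([0.85, 0.60, 0.42])          # evaluated cross sections (barn), hypothetical
cov = np.array([[4.0e-4, 2.5e-4, 1.0e-4],
                [2.5e-4, 3.0e-4, 1.5e-4],
                [1.0e-4, 1.5e-4, 2.0e-4]])    # covariance matrix (barn^2), hypothetical

std = np.sqrt(np.diag(cov))
percent_unc = 100.0 * std / sigma             # percentage uncertainties
corr = cov / np.outer(std, std)               # correlation coefficients

print("percent uncertainties:", np.round(percent_unc, 2))
print("correlation matrix:\n", np.round(corr, 3))
```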

  2. Identification of the Uncertainties for the Calibration of the Partial Safety Factors for Load in Tidal Turbines

    Directory of Open Access Journals (Sweden)

    Gaizka Zarraonandia Simeón

    2016-03-01

    Tidal energy is nowadays one of the fastest-growing types of marine renewable energy. In particular, Horizontal Axis Tidal Turbines (HATTs) are the most advanced designs and the most appropriate for standardization. This paper presents a review of current design criteria focusing on the identification of the uncertainties that technology developers need to address during the design process. Key environmental parameters like turbine inflow conditions or predictions of extreme values are still grey areas due to the lack of site measurements and the uncertainty in metocean model predictions. A comparison of turbulence intensity characterization using different tools and at different points in time shows the uncertainty in the prediction of this parameter. Numerical models of HATTs are still quite uncertain, often dependent on the experience of the people running them. In the reliability-based calibration of partial safety factors, the uncertainties need to be reflected in the limit state formulation. This paper analyses the different types of uncertainties present in the limit state equation. These uncertainties are assessed in terms of stochastic variables in the limit state equation. In some cases, advantage can be taken of the experience of the offshore wind and oil and gas industries. Tidal turbines have a mixture of the uncertainties present in both industries with regard to partial safety factor calibration.
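
    A hedged sketch of the reliability idea behind partial safety factor calibration: treat load and resistance as stochastic variables in a limit state function g = R - L and estimate the failure probability by Monte Carlo. The distributions and numbers below are illustrative assumptions, not values from any tidal-turbine standard.

```python
# Monte Carlo evaluation of a simple limit state g = R - L with hypothetical
# distributions for resistance R and extreme load L.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Resistance: lognormal model uncertainty around a characteristic blade strength
R = rng.lognormal(mean=np.log(10.0), sigma=0.10, size=n)   # MN*m, hypothetical
# Load: Gumbel-type extreme bending moment driven by uncertain inflow turbulence
L = rng.gumbel(loc=6.0, scale=0.8, size=n)                 # MN*m, hypothetical

g = R - L                       # limit state function
pf = np.mean(g < 0.0)           # probability of failure
print(f"estimated failure probability: {pf:.2e}")
```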

  3. Uncertainties and severe-accident management

    International Nuclear Information System (INIS)

    Kastenberg, W.E.

    1991-01-01

    Severe-accident management can be defined as the use of existing and/or alternative resources, systems, and actions to prevent or mitigate a core-melt accident. Together with risk management (e.g., changes in plant operation and/or addition of equipment) and emergency planning (off-site actions), accident management provides an extension of the defense-in-depth safety philosophy for severe accidents. A significant number of probabilistic safety assessments have been completed, which yield the principal plant vulnerabilities, and can be categorized as (a) dominant sequences with respect to core-melt frequency, (b) dominant sequences with respect to various risk measures, (c) dominant threats that challenge safety functions, and (d) dominant threats with respect to failure of safety systems. Severe-accident management strategies can be generically classified as (a) use of alternative resources, (b) use of alternative equipment, and (c) use of alternative actions. For each sequence/threat and each combination of strategy, there may be several options available to the operator. Each strategy/option involves phenomenological and operational considerations regarding uncertainty. These include (a) uncertainty in key phenomena, (b) uncertainty in operator behavior, (c) uncertainty in system availability and behavior, and (d) uncertainty in information availability (i.e., instrumentation). This paper focuses on phenomenological uncertainties associated with severe-accident management strategies

  4. Uncertainty in hydraulic tests in fractured rock

    International Nuclear Information System (INIS)

    Ji, Sung-Hoon; Koh, Yong-Kwon

    2014-01-01

    Interpretation of hydraulic tests in fractured rock involves uncertainty because the hydraulic properties of a fractured rock differ from those of a porous medium. In this study, we reviewed several interesting phenomena which show uncertainty in a hydraulic test in a fractured rock and discussed their origins and how they should be considered during site characterisation. Our results show that the hydraulic parameters of a fractured rock estimated from a hydraulic test are associated with uncertainty due to the changed aperture and the non-linear groundwater flow during the test. Although the magnitude of these two uncertainties is site-dependent, the results suggest that a hydraulic test should be conducted with little disturbance of the natural groundwater flow in order to account for their uncertainty. Other effects reported from laboratory and numerical experiments, such as the trapping zone effect (Boutt, 2006) and the slip condition effect (Lee, 2014), can also introduce uncertainty into a hydraulic test and should be evaluated in a field test. It is necessary to consider how to evaluate the uncertainty in the hydraulic properties during site characterisation and how to apply it to the safety assessment of a subsurface repository. (authors)

  5. Dealing with exploration uncertainties

    International Nuclear Information System (INIS)

    Capen, E.

    1992-01-01

    Exploration for oil and gas should fulfill the most adventurous in their quest for excitement and surprise. This paper tries to cover that tall order. The authors will touch on the magnitude of the uncertainty (which is far greater than in most other businesses), the effects of not knowing target sizes very well, how to build uncertainty into analyses naturally, how to tie reserves and chance estimates to economics, and how to look at the portfolio effect of an exploration program. With no apologies, the authors will be using a different language for some readers - the language of uncertainty, which means probability and statistics. These tools allow one to combine largely subjective exploration information with the more analytical data from the engineering and economic side

  6. Parameter Uncertainty on AGCM-simulated Tropical Cyclones

    Science.gov (United States)

    He, F.

    2015-12-01

    This work studies the parameter uncertainty in tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case, illustrated with the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent the convection, turbulence, precipitation and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance for TC simulations and identifies the key parameters for six different TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP) and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin-Hypercube Sampling (LHS) method. The comparison between the OAT ensemble run and the LHS ensemble run shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated tropical cyclone characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effects. Last, we find that the intensity uncertainty caused by physical parameters is comparable in magnitude to the uncertainty caused by model structure (e.g. grid) and initial conditions (e.g. sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.
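
    The Latin-Hypercube Sampling step mentioned above can be sketched directly: each parameter range is cut into n equal-probability strata, one value is drawn per stratum, and the strata are combined in random order across parameters. The eight ranges below are placeholders, not the actual CAM/ZM parameter values.

```python
# Minimal Latin-Hypercube Sampling (LHS) with hypothetical parameter ranges.
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """bounds: list of (low, high) tuples, one per parameter."""
    d = len(bounds)
    samples = np.empty((n_samples, d))
    for j, (low, high) in enumerate(bounds):
        # one uniform draw inside each of n equal-width strata, in random order
        strata = (rng.permutation(n_samples) + rng.random(n_samples)) / n_samples
        samples[:, j] = low + strata * (high - low)
    return samples

rng = np.random.default_rng(7)
bounds = [(0.5, 2.0)] * 8                  # hypothetical scaling ranges for 8 parameters
ensemble = latin_hypercube(50, bounds, rng)
print(ensemble.shape)                      # (50, 8) -> 50 perturbed-physics members
```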

  7. Application of intelligence based uncertainty analysis for HLW disposal

    International Nuclear Information System (INIS)

    Kato, Kazuyuki

    2003-01-01

    Safety assessment for geological disposal of high level radioactive waste inevitably involves factors that cannot be specified in a deterministic manner. These are namely: (1) 'variability' that arises from stochastic nature of the processes and features considered, e.g., distribution of canister corrosion times and spatial heterogeneity of a host geological formation; (2) 'ignorance' due to incomplete or imprecise knowledge of the processes and conditions expected in the future, e.g., uncertainty in the estimation of solubilities and sorption coefficients for important nuclides. In many cases, a decision in assessment, e.g., selection among model options or determination of a parameter value, is subjected to both variability and ignorance in a combined form. It is clearly important to evaluate both influences of variability and ignorance on the result of a safety assessment in a consistent manner. We developed a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques respectively. The methodology has been applied to safety assessment of geological disposal of high level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews to the experts while variability was formulated by means of probability density functions (pdfs) based on available data set. The exercise demonstrated applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on expert's opinion and in providing information on dependence of assessment result on the level of conservatism. In addition, it was also shown that sensitivity analysis could identify key parameters in reducing uncertainties associated with the overall assessment. The above information can be used to support the judgment process and guide the process of disposal system development in optimization of protection against

  8. Uncertainty in artificial intelligence

    CERN Document Server

    Levitt, TS; Lemmer, JF; Shachter, RD

    1990-01-01

    Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i

  9. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  10. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    Science.gov (United States)

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  11. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called 'conservative' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the 'reasonable assurance' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  12. Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City

    Science.gov (United States)

    Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo

    2014-05-01

    Socio-economic changes as well as climatic changes have been the main drivers of uncertainty in environmental risk assessment, and in particular in flood risk assessment. The level of future uncertainty that researchers face when dealing with problems in a future perspective with a focus on climate change is known as deep uncertainty (also known as Knightian uncertainty), since nobody has experienced those changes before and our knowledge is limited to the extent that we have no notion of probabilities; therefore, consolidated risk management approaches have limited potential. Deep uncertainty refers to circumstances in which analysts and experts do not know, or the parties to decision making cannot agree on: (i) the appropriate models describing the interaction among system variables, (ii) the probability distributions to represent uncertainty about key parameters in the models, and (iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy-makers by providing them not with a single, optimal solution to the problem at hand, such as crisp estimates of the costs of damages from the natural hazards considered, but instead with ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to substitute robustness for optimality as a decision criterion. Under conditions of deep uncertainty, decision-makers do not have statistical and mathematical bases to identify optimal solutions; instead, they should prefer to implement 'robust' decisions that perform relatively well over all conceivable outcomes of the unknown future scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics that usually can be derived from observed historical data and therefore turn to non-statistical measures such as scenario analysis. We construct several plausible scenarios, each scenario being a full description of what may happen
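
    One common way to operationalize robustness over scenarios is minimax regret, sketched below. The option names and the damage-cost table are entirely hypothetical and are not taken from the Dhaka City case.

```python
# Minimax-regret choice over plausible scenarios (all numbers hypothetical).
import numpy as np

options = ["do nothing", "raise embankments", "improve drainage"]
scenarios = ["low change", "moderate change", "severe change"]

# cost[i, j]: total expected damage + investment for option i under scenario j
cost = np.array([[ 10.0,  60.0, 200.0],
                 [ 40.0,  55.0,  90.0],
                 [ 30.0,  50.0, 120.0]])

regret = cost - cost.min(axis=0)        # regret relative to the best option per scenario
worst_regret = regret.max(axis=1)       # worst case over scenarios, per option
robust_choice = options[int(np.argmin(worst_regret))]

for name, w in zip(options, worst_regret):
    print(f"{name:18s} worst-case regret = {w:5.1f}")
print("minimax-regret (robust) choice:", robust_choice)
```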

  13. Uncertainty analysis for Ulysses safety evaluation report

    International Nuclear Information System (INIS)

    Frank, M.V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermal Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty of the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with calculated launch and deployment accident scenarios is low

  14. Uncertainty Relations and Possible Experience

    Directory of Open Access Journals (Sweden)

    Gregg Jaeger

    2016-06-01

    The uncertainty principle can be understood as a condition of joint indeterminacy of classes of properties in quantum theory. The mathematical expressions most closely associated with this principle have been the uncertainty relations, various inequalities exemplified by the well-known expression regarding position and momentum introduced by Heisenberg. Here, recent work involving a new sort of “logical” indeterminacy principle and associated relations introduced by Pitowsky, expressible directly in terms of probabilities of outcomes of measurements of sharp quantum observables, is reviewed and its quantum nature is discussed. These novel relations are derivable from Boolean “conditions of possible experience” of the quantum realm and have been considered both as fundamentally logical and as fundamentally geometrical. This work focuses on the relationship of indeterminacy to the propositions regarding the values of discrete, sharp observables of quantum systems. Here, reasons for favoring each of these two positions are considered. Finally, with an eye toward future research related to indeterminacy relations, further novel approaches grounded in category theory and intended to capture and reconceptualize the complementarity characteristics of quantum propositions are discussed in relation to the former.
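
    For reference, the position-momentum inequality referred to above, together with Robertson's general form for two sharp observables A and B, reads:

```latex
% Kennard's position-momentum inequality and Robertson's general form.
\[
  \sigma_x\,\sigma_p \;\ge\; \frac{\hbar}{2},
  \qquad
  \sigma_A\,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [A,B]\rangle\bigr| .
\]
```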

  15. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  16. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.

    2014-01-01

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  17. Inverse problems and uncertainty quantification

    KAUST Repository

    Litvinenko, Alexander

    2013-12-18

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.
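
    The linear Bayesian update compared in these abstracts can be illustrated, under simplifying assumptions, by a Kalman-type shift of prior samples. This generic sketch is not the authors' functional/spectral implementation; it only shows the structure of a linear update built from ensemble covariances.

```python
# Hedged sketch of a linear Bayesian update for a scalar parameter.
import numpy as np

rng = np.random.default_rng(3)
n = 5_000

theta = rng.normal(2.0, 1.0, size=n)                      # prior samples
y_pred = theta**2 + rng.normal(0.0, 0.5, size=n)          # predicted measurements (toy forward model)
y_obs = 5.0                                               # actual measurement

# Linear update: theta_post = theta + K * (y_obs - y_pred)
K = np.cov(theta, y_pred)[0, 1] / np.var(y_pred)
theta_post = theta + K * (y_obs - y_pred)

print(f"prior     {theta.mean():.2f} +/- {theta.std():.2f}")
print(f"posterior {theta_post.mean():.2f} +/- {theta_post.std():.2f}")
```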

  18. Scientific Uncertainties in Climate Change Detection and Attribution Studies

    Science.gov (United States)

    Santer, B. D.

    2017-12-01

    It has been claimed that the treatment and discussion of key uncertainties in climate science is "confined to hushed sidebar conversations at scientific conferences". This claim is demonstrably incorrect. Climate change detection and attribution studies routinely consider key uncertainties in observational climate data, as well as uncertainties in model-based estimates of natural variability and the "fingerprints" in response to different external forcings. The goal is to determine whether such uncertainties preclude robust identification of a human-caused climate change fingerprint. It is also routine to investigate the impact of applying different fingerprint identification strategies, and to assess how detection and attribution results are impacted by differences in the ability of current models to capture important aspects of present-day climate. The exploration of the uncertainties mentioned above will be illustrated using examples from detection and attribution studies with atmospheric temperature and moisture.

  19. "The more you know, the more you realise it is really challenging to do": Tensions and uncertainties in person-centred support for people with long-term conditions.

    Science.gov (United States)

    Entwistle, Vikki A; Cribb, Alan; Watt, Ian S; Skea, Zoë C; Owens, John; Morgan, Heather M; Christmas, Simon

    2018-03-30

    To identify and examine tensions and uncertainties in person-centred approaches to self-management support - approaches that take patients seriously as moral agents and orient support to enable them to live (and die) well on their own terms. Interviews with 26 UK clinicians about working with people with diabetes or Parkinson's disease, conducted within a broader interdisciplinary project on self-management support. The analysis reported here was informed by philosophical reasoning and discussions with stakeholders. Person-centred approaches require clinicians to balance tensions between the many things that can matter in life, and their own and each patient's perspectives on these. Clinicians must ensure that their supportive efforts do not inadvertently disempower people. When attending to someone's particular circumstances and perspectives, they sometimes face intractable uncertainties, including about what is most important to the person and what, realistically, the person can or could do and achieve. The kinds of professional judgement that person-centred working necessitates are not always acknowledged and supported. Practical and ethical tensions are inherent in person-centred support and need to be better understood and addressed. Professional development and service improvement initiatives should recognise these tensions and uncertainties and support clinicians to navigate them well. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  20. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  1. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  2. Risks, uncertainty, vagueness

    International Nuclear Information System (INIS)

    Haefele, W.; Renn, O.; Erdmann, G.

    1990-01-01

    The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability, especially in the context of political decision-making. (DG)

  3. Expanding Uncertainty Principle to Certainty-Uncertainty Principles with Neutrosophy and Quad-stage Method

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-03-01

    The most famous contribution of Heisenberg is the uncertainty principle, but the original uncertainty principle is improper. Considering all the possible situations (including the case that people can create laws) and applying Neutrosophy and the Quad-stage Method, this paper presents "certainty-uncertainty principles" with a general form and a variable dimension fractal form. According to the classification of Neutrosophy, the "certainty-uncertainty principles" can be divided into three principles under different conditions: the "certainty principle", namely that a particle's position and momentum can be known simultaneously; the "uncertainty principle", namely that a particle's position and momentum cannot be known simultaneously; and the neutral (fuzzy) "indeterminacy principle", namely that whether or not a particle's position and momentum can be known simultaneously is undetermined. The special cases of the "certainty-uncertainty principles" include the original uncertainty principle and the Ozawa inequality. In addition, in accordance with the original uncertainty principle, discussing a high-speed particle's speed and track with Newtonian mechanics is unreasonable; but according to the "certainty-uncertainty principles", Newtonian mechanics can be used to discuss the problem of the gravitational deflection of a photon orbit around the Sun (it gives the same deflection angle as general relativity). Finally, because in physics principles, laws and the like that disregard the principle (law) of conservation of energy may be invalid, the "certainty-uncertainty principles" should be restricted (or constrained) by the principle (law) of conservation of energy, and thus satisfy it.
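
    For reference, the two special cases named above are usually written, with measurement error epsilon(A), disturbance eta(B) and standard deviations sigma(A), sigma(B), as the Heisenberg-type error-disturbance relation (left) and the Ozawa inequality (right):

```latex
% Heisenberg-type error-disturbance relation and the Ozawa inequality.
\[
  \epsilon(A)\,\eta(B) \;\ge\; \tfrac{1}{2}\bigl|\langle [A,B]\rangle\bigr|,
  \qquad
  \epsilon(A)\,\eta(B) + \epsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B)
  \;\ge\; \tfrac{1}{2}\bigl|\langle [A,B]\rangle\bigr| .
\]
```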

  4. Uncertainty in hydrological signatures for gauged and ungauged catchments

    Science.gov (United States)

    Westerberg, Ida K.; Wagener, Thorsten; Coxon, Gemma; McMillan, Hilary K.; Castellarin, Attilio; Montanari, Alberto; Freer, Jim

    2016-03-01

    Reliable information about hydrological behavior is needed for water-resource management and scientific investigations. Hydrological signatures quantify catchment behavior as index values, and can be predicted for ungauged catchments using a regionalization procedure. The prediction reliability is affected by data uncertainties for the gauged catchments used in prediction and by uncertainties in the regionalization procedure. We quantified signature uncertainty stemming from discharge data uncertainty for 43 UK catchments and propagated these uncertainties in signature regionalization, while accounting for regionalization uncertainty with a weighted-pooling-group approach. Discharge uncertainty was estimated using Monte Carlo sampling of multiple feasible rating curves. For each sampled rating curve, a discharge time series was calculated and used in deriving the gauged signature uncertainty distribution. We found that the gauged uncertainty varied with signature type, local measurement conditions and catchment behavior, with the highest uncertainties (median relative uncertainty ±30-40% across all catchments) for signatures measuring high- and low-flow magnitude and dynamics. Our regionalization method allowed assessing the role and relative magnitudes of the gauged and regionalized uncertainty sources in shaping the signature uncertainty distributions predicted for catchments treated as ungauged. We found that (1) if the gauged uncertainties were neglected there was a clear risk of overconditioning the regionalization inference, e.g., by attributing catchment differences resulting from gauged uncertainty to differences in catchment behavior, and (2) uncertainty in the regionalization results was lower for signatures measuring flow distribution (e.g., mean flow) than flow dynamics (e.g., autocorrelation), and for average flows (and then high flows) compared to low flows.
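
    To make the rating-curve step concrete, the following Python sketch mimics the type of Monte Carlo propagation described above: feasible rating curves are sampled, each converts a stage record to a discharge series, and a signature (here, mean flow) is computed per sample to build its uncertainty distribution. The power-law rating form and all parameter ranges are illustrative assumptions, not values from the study.

      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical stage record (m) and an assumed power-law rating Q = a * (h - h0)**b
      stage = rng.uniform(0.5, 2.5, size=365)

      def sample_rating_parameters(n):
          """Sample feasible rating-curve parameters (illustrative ranges only)."""
          a = rng.normal(5.0, 0.5, n)      # scale coefficient
          b = rng.normal(1.6, 0.1, n)      # exponent
          h0 = rng.normal(0.2, 0.05, n)    # cease-to-flow stage
          return a, b, h0

      n_curves = 1000
      a, b, h0 = sample_rating_parameters(n_curves)

      signature = np.empty(n_curves)
      for i in range(n_curves):
          discharge = a[i] * np.clip(stage - h0[i], 0.0, None) ** b[i]
          signature[i] = discharge.mean()          # signature: mean flow

      lo, med, hi = np.percentile(signature, [2.5, 50, 97.5])
      print(f"mean-flow signature: {med:.2f} ({lo:.2f}-{hi:.2f}) m3/s")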

  5. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  6. Model uncertainty in financial markets : Long run risk and parameter uncertainty

    NARCIS (Netherlands)

    de Roode, F.A.

    2014-01-01

Uncertainty surrounding key parameters of financial markets, such as the inflation and equity risk premium, constitutes a major risk for institutional investors with long investment horizons. Hedging the investors’ inflation exposure can be challenging due to the lack of domestic inflation-linked

  7. A retrospective dosimetry method and its uncertainty analysis

    International Nuclear Information System (INIS)

    Zhang, L.; Jia, D.; Dai, G.

    2000-01-01

The main aim of a radiation epidemiological study is to assess the risk to a population exposed to ionizing radiation. The actual assessment may be very difficult because dose information about the population is often indirect and incomplete. It is therefore very important to find a way of estimating reasonable and reliable doses for the population by a retrospective method from limited information. In order to provide reasonable dose information for the cohort study of Chinese medical diagnostic X-ray workers, a retrospective dosimetry method was established. In China, a cohort study of more than 27,000 medical diagnostic X-ray workers, with 25,000 controls, has been carried out for about fifteen years in order to assess the risk to an occupationally exposed population. Obviously, a key to the success of the study is to obtain reliable and reasonable dose estimates by the dose reconstruction method. Before 1985 there was a lack of directly measured personal dose information; however, other indirect information can be obtained, for example information about workloads from hospital documents, information about the operational conditions of workers of different statuses from a survey of occupational history, and the exposure levels under various working conditions from simulation methods. The information for estimating organ dose can also be obtained from simulation experiments with a phantom. Based on the information mentioned above, a mathematical model and a computerized system for dose reconstruction of this occupational population were designed and developed. Uncertainty analysis is very important for dose reconstruction. The sources of uncertainty in our study come from two areas: one is the dose reconstruction model, the other is the survey of occupational history. The main results of the uncertainty analysis are presented. In order to control the uncertainty of the

  8. Development of mechanistic sorption model and treatment of uncertainties for Ni sorption on montmorillonite/bentonite

    International Nuclear Information System (INIS)

    Ochs, Michael; Ganter, Charlotte; Tachi, Yukio; Suyama, Tadahiro; Yui, Mikazu

    2011-02-01

Sorption and diffusion of radionuclides in buffer materials (bentonite) are the key processes in the safe geological disposal of radioactive waste, because migration of radionuclides in this barrier is expected to be diffusion-controlled and retarded by sorption processes. It is therefore necessary to understand the detailed/coupled processes of sorption and diffusion in compacted bentonite and develop mechanistic/predictive models, so that reliable parameters can be set under a variety of geochemical conditions relevant to performance assessment (PA). For this purpose, JAEA has developed the integrated sorption and diffusion (ISD) model/database in montmorillonite/bentonite systems. The main goal of the mechanistic model/database development is to provide a tool for a consistent explanation, prediction, and uncertainty assessment of Kd as well as diffusion parameters needed for the quantification of radionuclide transport. The present report focuses on developing the thermodynamic sorption model (TSM) and on the quantification and handling of model uncertainties in applications, illustrated by the example of Ni sorption on montmorillonite/bentonite. This includes 1) a summary of the present state of the art of thermodynamic sorption modeling, 2) a discussion of the selection of surface species and model design appropriate for the present purpose, 3) possible sources and representations of TSM uncertainties, and 4) details of modeling, testing and uncertainty evaluation for Ni sorption. Two fundamentally different approaches are presented and compared for representing TSM uncertainties: 1) TSM parameter uncertainties calculated by FITEQL optimization routines and a statistical procedure, 2) overall error estimated by direct comparison of modeled and experimental Kd values. The overall error in Kd is viewed as the best representation of model uncertainty in ISD model/database development. (author)
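
    The report's second approach compares modeled and experimental Kd values directly. A common way to express such an overall error is a root-mean-square error in log10(Kd), as in the minimal Python sketch below; all Kd values are made up for illustration and are not data from the report.

      import numpy as np

      # Illustrative (made-up) Kd values in m3/kg; not data from the report
      kd_experimental = np.array([0.8, 1.5, 3.2, 7.0, 12.0])
      kd_modeled      = np.array([1.0, 1.2, 4.0, 6.0, 15.0])

      # Overall error expressed on a log scale, since Kd spans orders of magnitude
      log_residuals = np.log10(kd_modeled) - np.log10(kd_experimental)
      rmse_log = np.sqrt(np.mean(log_residuals ** 2))

      print(f"overall error: ~{rmse_log:.2f} log10 units "
            f"(factor of ~{10 ** rmse_log:.1f} in Kd)")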

  9. Uncertainty in biodiversity science, policy and management: a conceptual overview

    Directory of Open Access Journals (Sweden)

    Yrjö Haila

    2014-10-01

Full Text Available The protection of biodiversity is a complex societal, political and ultimately practical imperative of current global society. The imperative builds upon scientific knowledge of human dependence on the life-support systems of the Earth. This paper aims at introducing the main types of uncertainty inherent in biodiversity science, policy and management, as an introduction to a companion paper summarizing practical experiences of scientists and scholars (Haila et al. 2014). Uncertainty is a cluster concept: the actual nature of uncertainty is inherently context-bound. We use semantic space as a conceptual device to identify key dimensions of uncertainty in the context of biodiversity protection; these relate to [i] data; [ii] proxies; [iii] concepts; [iv] policy and management; and [v] normative goals. Semantic space offers an analytic perspective for drawing critical distinctions between types of uncertainty, identifying fruitful resonances that help to cope with the uncertainties, and building up collaboration between different specialists to support mutual social learning.

  10. GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES

    Directory of Open Access Journals (Sweden)

    Sérgio D. Sousa

    2015-03-01

Full Text Available In this work, the performance measurement process is studied to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified, analysing the activities undertaken in the following three stages of the performance measurement process: design and implementation, data collection and record, and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty of the performance measurement process is used to calculate an uncertainty index to evaluate the level of uncertainty of a given PM (or key performance indicator). An application example is presented. The quantification of PM uncertainty could contribute to better representing the risk associated with a given decision and also to improving the PM, increasing its precision and reliability.
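
    The abstract does not state the index formula, so the Python sketch below is only an illustration of one common graph-theoretic aggregation (stage uncertainty scores on the diagonal of a matrix, their interdependencies off-diagonal, aggregated via the matrix permanent); the paper's actual formulation may differ, and all numbers are made up.

      from itertools import permutations

      import numpy as np

      def permanent(m: np.ndarray) -> float:
          """Permanent of a small square matrix (brute force; fine for 3x3)."""
          n = m.shape[0]
          return sum(np.prod([m[i, p[i]] for i in range(n)])
                     for p in permutations(range(n)))

      # Illustrative scores in [0, 1] for the three stages of the measurement process:
      # design/implementation, data collection/record, determination/analysis.
      stage_uncertainty = np.array([0.3, 0.5, 0.2])

      # Assumed pairwise interdependencies between stages (symmetric, illustrative).
      interdependence = np.array([
          [0.0, 0.2, 0.1],
          [0.2, 0.0, 0.4],
          [0.1, 0.4, 0.0],
      ])

      matrix = interdependence + np.diag(stage_uncertainty)
      print(f"uncertainty index (illustrative): {permanent(matrix):.3f}")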

  11. Uncertainty in dispersion forecasts using meteorological ensembles

    International Nuclear Information System (INIS)

    Chin, H N; Leach, M J

    1999-01-01

    The usefulness of dispersion forecasts depends on proper interpretation of results. Understanding the uncertainty in model predictions and the range of possible outcomes is critical for determining the optimal course of action in response to terrorist attacks. One of the objectives for the Modeling and Prediction initiative is creating tools for emergency planning for special events such as the upcoming the Olympics. Meteorological forecasts hours to days in advance are used to estimate the dispersion at the time of the event. However, there is uncertainty in any meteorological forecast, arising from both errors in the data (both initial conditions and boundary conditions) and from errors in the model. We use ensemble forecasts to estimate the uncertainty in the forecasts and the range of possible outcomes

  12. Climate policy uncertainty and investment risk

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-06-21

Our climate is changing. This is certain. Less certain, however, is the timing and magnitude of climate change, and the cost of transition to a low-carbon world. Therefore, many policies and programmes are still at a formative stage, and policy uncertainty is very high. This book identifies how climate change policy uncertainty may affect investment behaviour in the power sector. For power companies, where capital stock is intensive and long-lived, those risks rank among the biggest and can create an incentive to delay investment. Our analysis shows that the risk premiums of climate change uncertainty can add 40% to the construction costs of a plant for power investors, and a 10% price surcharge for electricity end-users. This publication explains what can be done in policy design to reduce these costs. Incorporating the results of quantitative analysis, this publication also shows the sensitivity of different power sector investment decisions to different risks. It compares the effects of climate policy uncertainty with energy market uncertainty, showing the relative importance of these sources of risk for different technologies in different market types. Drawing on extensive consultation with power companies and financial investors, it also assesses the implications for policy makers, allowing the key messages to be transferred into policy designs. This book is a useful tool for governments to improve climate policy mechanisms and create more certainty for power investors.

  13. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
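
    As a concrete illustration of the two frameworks contrasted above (not code from the paper), the Python sketch below applies Bayes' rule to a hypothetical test and measures the residual diagnostic uncertainty as the binary entropy of the post-test probability; all test characteristics are assumed values.

      import numpy as np

      def binary_entropy(p: float) -> float:
          """Diagnostic uncertainty in bits for a disease probability p."""
          if p in (0.0, 1.0):
              return 0.0
          return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

      def post_test_probability(pretest: float, sensitivity: float,
                                specificity: float, positive: bool) -> float:
          """Bayes' rule for a dichotomous test result."""
          if positive:
              true_pos = sensitivity * pretest
              false_pos = (1 - specificity) * (1 - pretest)
              return true_pos / (true_pos + false_pos)
          false_neg = (1 - sensitivity) * pretest
          true_neg = specificity * (1 - pretest)
          return false_neg / (false_neg + true_neg)

      # Hypothetical numbers, for illustration only
      pretest, sens, spec = 0.20, 0.90, 0.85
      for positive in (True, False):
          post = post_test_probability(pretest, sens, spec, positive)
          print(f"result={'positive' if positive else 'negative'}: "
                f"p(disease)={post:.2f}, uncertainty "
                f"{binary_entropy(pretest):.2f} -> {binary_entropy(post):.2f} bits")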

  14. Uncertainty and sampling issues in tank characterization

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Pulsipher, B.A.; Kashporenko, D.M.

    1997-06-01

A defensible characterization strategy must recognize that uncertainties are inherent in any measurement or estimate of interest and must employ statistical methods for quantifying and managing those uncertainties. Estimates of risk and therefore key decisions must incorporate knowledge about uncertainty. This report focuses on statistical methods that should be employed to ensure confident decision making and appropriate management of uncertainty. Sampling is a major source of uncertainty that deserves special consideration in the tank characterization strategy. The question of whether sampling will ever provide the reliable information needed to resolve safety issues is explored. The issue of sample representativeness must be resolved before sample information is reliable. Representativeness is a relative term but can be defined in terms of bias and precision. Currently, precision can be quantified and managed through an effective sampling and statistical analysis program. Quantifying bias is more difficult and is not being addressed under the current sampling strategies. Bias could be bounded by (1) employing new sampling methods that can obtain samples from other areas in the tanks, (2) putting in new risers on some worst case tanks and comparing the results from existing risers with new risers, or (3) sampling tanks through risers under which no disturbance or activity has previously occurred. With some bound on bias and estimates of precision, various sampling strategies could be determined and shown to be either cost-effective or infeasible.

  15. Classification and moral evaluation of uncertainties in engineering modeling.

    Science.gov (United States)

    Murphy, Colleen; Gardoni, Paolo; Harris, Charles E

    2011-09-01

    Engineers must deal with risks and uncertainties as a part of their professional work and, in particular, uncertainties are inherent to engineering models. Models play a central role in engineering. Models often represent an abstract and idealized version of the mathematical properties of a target. Using models, engineers can investigate and acquire understanding of how an object or phenomenon will perform under specified conditions. This paper defines the different stages of the modeling process in engineering, classifies the various sources of uncertainty that arise in each stage, and discusses the categories into which these uncertainties fall. The paper then considers the way uncertainty and modeling are approached in science and the criteria for evaluating scientific hypotheses, in order to highlight the very different criteria appropriate for the development of models and the treatment of the inherent uncertainties in engineering. Finally, the paper puts forward nine guidelines for the treatment of uncertainty in engineering modeling.

  16. Uncertainty in adaptive capacity

    International Nuclear Information System (INIS)

    Neil Adger, W.; Vincent, K.

    2005-01-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, and literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. (authors)

  17. Uncertainties about climate

    International Nuclear Information System (INIS)

    Laval, Katia; Laval, Guy

    2013-01-01

Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by opponents of the global warming theory to call into question the estimates of its future consequences. Is it legitimate to predict the future using past climate data (well documented back to 100,000 years BP) or the climates of other planets, taking into account the imprecision of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model such a huge and interwoven system, for which any exact description has become impossible? Why do water and precipitation play such an important role in local and global forecasts, and how should they be treated? This book, written by two physicists, answers these delicate questions in simple terms, so that anyone can form their own opinion about global warming and the need to act rapidly.

  18. Understanding Climate Uncertainty with an Ocean Focus

    Science.gov (United States)

    Tokmakian, R. T.

    2009-12-01

Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions are the state of the weather or climate (and similar quantities) at the beginning of the simulation, and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and methods to quantify the reliability of a simulation. If quantitative uncertainty questions of the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007]. The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in

  19. The uncertainty principle

    International Nuclear Information System (INIS)

    Martens, Hans.

    1991-01-01

The subject of this thesis is the uncertainty principle (UP). The UP is one of the most characteristic points of difference between quantum and classical mechanics. The starting point of this thesis is the work of Niels Bohr; besides the discussion, the work is also analyzed. For the discussion of the different aspects of the UP, the formalism of Davies and Ludwig is used instead of the more commonly used formalism of Neumann and Dirac. (author). 214 refs.; 23 figs

  20. Uncertainty in artificial intelligence

    CERN Document Server

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty. A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language und

  1. Decision Making Under Uncertainty

    Science.gov (United States)

    2010-11-01

    A sound approach to rational decision making requires a decision maker to establish decision objectives, identify alternatives, and evaluate those...often violate the axioms of rationality when making decisions under uncertainty. The systematic description of such observations may lead to the...which leads to “anchoring” on the initial value. The fact that individuals have been shown to deviate from rationality when making decisions

  2. Economic uncertainty principle?

    OpenAIRE

    Alexander Harin

    2006-01-01

The economic principle of (hidden) uncertainty is presented. New probability formulas are offered. Examples of solutions of three types of fundamental problems are reviewed.

  3. Citizen Candidates Under Uncertainty

    OpenAIRE

    Eguia, Jon X.

    2005-01-01

    In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we te...

  4. Managing the uncertainties of the streamflow data produced by the French national hydrological services

    Science.gov (United States)

    Puechberty, Rachel; Bechon, Pierre-Marie; Le Coz, Jérôme; Renard, Benjamin

    2015-04-01

The French national hydrological services (NHS) manage the production of streamflow time series throughout the national territory. The hydrological data are made available to end-users through different web applications and the national hydrological archive (Banque Hydro). Providing end-users with qualitative and quantitative information on the uncertainty of the hydrological data is key to allowing them to draw relevant conclusions and make appropriate decisions. Due to technical and organisational issues that are specific to the field of hydrometry, quantifying the uncertainty of hydrological measurements is still challenging and not yet standardized. The French NHS have made progress on building a consistent strategy to assess the uncertainty of their streamflow data. The strategy consists of addressing the uncertainties produced and propagated at each step of the data production with uncertainty analysis tools that are compatible with each other and compliant with international uncertainty guidance and standards. Beyond the necessary research and methodological developments, operational software tools and procedures are absolutely necessary for the data management and uncertainty analysis done by field hydrologists. A first challenge is to assess, and if possible reduce, the uncertainty of streamgauging data, i.e. direct stage-discharge measurements. Interlaboratory experiments proved to be a very efficient way to empirically measure the uncertainty of a given streamgauging technique in given measurement conditions. The Q+ method (Le Coz et al., 2012) was developed to improve the uncertainty propagation method proposed in the ISO748 standard for velocity-area gaugings. Both empirical and computed (with Q+) uncertainty values can now be assigned in BAREME, which is the software used by the French NHS for managing streamgauging measurements. A second pivotal step is to quantify the uncertainty related to stage-discharge rating curves and their application to water level
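
    As a rough illustration of the kind of propagation involved in a velocity-area gauging (a simplified sketch in Python, not the ISO 748 or Q+ formulation), independent relative uncertainty components can be combined in quadrature; the component list and values below are assumptions, not figures from the abstract or the standard.

      import math

      # Illustrative relative standard uncertainties (in %) for one gauging.
      components_pct = {
          "width measurement": 1.0,
          "depth measurement": 1.5,
          "point-velocity measurement": 2.5,
          "limited number of verticals": 3.0,
          "limited exposure time": 2.0,
      }

      combined_pct = math.sqrt(sum(u ** 2 for u in components_pct.values()))
      expanded_pct = 2.0 * combined_pct   # coverage factor k=2, ~95% level

      print(f"combined standard uncertainty: {combined_pct:.1f} %")
      print(f"expanded uncertainty (k=2):    {expanded_pct:.1f} %")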

  5. A review of uncertainty research in impact assessment

    International Nuclear Information System (INIS)

    Leung, Wanda; Noble, Bram; Gunn, Jill; Jaeger, Jochen A.G.

    2015-01-01

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We

  6. A review of uncertainty research in impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Leung, Wanda, E-mail: wanda.leung@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Noble, Bram, E-mail: b.noble@usask.ca [Department of Geography and Planning, School of Environment and Sustainability, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Gunn, Jill, E-mail: jill.gunn@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Jaeger, Jochen A.G., E-mail: jochen.jaeger@concordia.ca [Department of Geography, Planning and Environment, Concordia University, 1455 de Maisonneuve W., Suite 1255, Montreal, Quebec H3G 1M8 (Canada); Loyola Sustainability Research Centre, Concordia University, 7141 Sherbrooke W., AD-502, Montreal, Quebec H4B 1R6 (Canada)

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We

  7. Calibration Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.

  8. Participation under Uncertainty

    International Nuclear Information System (INIS)

    Boudourides, Moses A.

    2003-01-01

    This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate for policy studies are various theories of social complexity. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an increasing uncertainty. But there are different categories and models of uncertainties implying a variety of configurations of policy processes. A particular role in all of them is played by expertise whose democratization is an often-claimed imperative nowadays. Moreover, we discuss how different participatory arrangements are shaped into instruments of policy-making and framing regulatory processes. As participation necessitates and triggers deliberation, we proceed to examine the role and the barriers of deliberativeness. Finally, we conclude by referring to some critical views about the ultimate assumptions of recent European policy frameworks and the conceptions of civic participation and politicization that they invoke

  9. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

The origin of the uncertainties affecting Performance Assessments, as well as their propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, due to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through Uncertainty Analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
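
    The comparison of output statistics mentioned above can be illustrated with a minimal Python sketch: a Monte Carlo sample of annual doses is summarized by its arithmetic mean, geometric (logarithmic) mean, median and 90th percentile. The lognormal distribution and all numbers are illustrative assumptions, not results from the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      # Illustrative Monte Carlo sample of annual doses (lognormal is an assumption,
      # chosen only because the inputs span orders of magnitude).
      annual_dose = rng.lognormal(mean=np.log(0.05), sigma=1.2, size=10_000)  # mSv/a

      summary = {
          "arithmetic mean": annual_dose.mean(),
          "geometric mean": np.exp(np.log(annual_dose).mean()),
          "median": np.median(annual_dose),
          "90th percentile": np.percentile(annual_dose, 90),
      }
      for name, value in summary.items():
          print(f"{name:>16}: {value:.3f} mSv/a")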

  10. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well-known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
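
    A minimal Python sketch of the derivative-based idea (not the report's actual formulation): with sensitivities df/dxi available, a first-order estimate of the output variance follows directly and can be compared against brute-force Monte Carlo. The test function below is a generic stand-in, not the report's borehole model.

      import numpy as np

      rng = np.random.default_rng(0)

      def model(x):
          """Stand-in nonlinear model y = f(x1, x2, x3); not the borehole model."""
          x1, x2, x3 = x
          return x1 ** 2 * np.sqrt(x2) / (1.0 + x3)

      mean = np.array([2.0, 4.0, 0.5])
      std = np.array([0.1, 0.4, 0.05])

      # Derivative-based (first-order) propagation: var(y) ~ sum (df/dxi)^2 var(xi)
      eps = 1e-6
      grad = np.array([
          (model(mean + eps * np.eye(3)[i]) - model(mean - eps * np.eye(3)[i])) / (2 * eps)
          for i in range(3)
      ])
      var_first_order = np.sum((grad * std) ** 2)

      # Brute-force Monte Carlo for comparison
      samples = rng.normal(mean, std, size=(50_000, 3))
      var_mc = model(samples.T).var()

      print(f"first-order std: {np.sqrt(var_first_order):.4f}")
      print(f"Monte Carlo std: {np.sqrt(var_mc):.4f}")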

  11. Uncertainty during breast diagnostic evaluation: state of the science.

    Science.gov (United States)

    Montgomery, Mariann

    2010-01-01

    To present the state of the science on uncertainty in relationship to the experiences of women undergoing diagnostic evaluation for suspected breast cancer. Published articles from Medline, CINAHL, PubMED, and PsycINFO from 1983-2008 using the following key words: breast biopsy, mammography, uncertainty, reframing, inner strength, and disruption. Fifty research studies were examined with all reporting the presence of anxiety persisting throughout the diagnostic evaluation until certitude is achieved through the establishment of a definitive diagnosis. Indirect determinants of uncertainty for women undergoing breast diagnostic evaluation include measures of anxiety, depression, social support, emotional responses, defense mechanisms, and the psychological impact of events. Understanding and influencing the uncertainty experience have been suggested to be key in relieving psychosocial distress and positively influencing future screening behaviors. Several studies examine correlational relationships among anxiety, selection of coping methods, and demographic factors that influence uncertainty. A gap exists in the literature with regard to the relationship of inner strength and uncertainty. Nurses can be invaluable in assisting women in coping with the uncertainty experience by providing positive communication and support. Nursing interventions should be designed and tested for their effects on uncertainty experienced by women undergoing a breast diagnostic evaluation.

  12. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

After recalling the theoretical principles and the practical difficulties of uncertainty propagation methodologies, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example, which he then generalised, treating variability with probability theory and lack of knowledge with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably and illegitimately precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability from lack of knowledge increases as the problem becomes more and more complex in terms of the number of parameters or time steps, and that it is necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory.
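
    A toy Python sketch of the separation argued for above (illustrative only, not the author's methodology): variability is propagated by Monte Carlo sampling, while lack of knowledge is propagated as an interval (a single alpha-cut of a fuzzy number), so the result is an interval-valued statistic rather than a single precise one. The model and all numbers are assumptions.

      import numpy as np

      rng = np.random.default_rng(7)

      def model(a, b):
          """Toy model output; 'a' is aleatory (variability), 'b' is epistemic."""
          return a * b + 2.0

      # Variability: known distribution, sampled by Monte Carlo
      a_samples = rng.normal(10.0, 1.0, size=20_000)

      # Lack of knowledge: only an interval is defensible (one alpha-cut of a fuzzy number)
      b_low, b_high = 0.8, 1.3

      p95_low = np.percentile(model(a_samples, b_low), 95)
      p95_high = np.percentile(model(a_samples, b_high), 95)

      print(f"95th percentile lies in [{p95_low:.2f}, {p95_high:.2f}] "
            "depending on the epistemic parameter")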

  13. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  14. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  15. Electroencephalographic Evidence of Abnormal Anticipatory Uncertainty Processing in Gambling Disorder Patients.

    Science.gov (United States)

    Megías, Alberto; Navas, Juan F; Perandrés-Gómez, Ana; Maldonado, Antonio; Catena, Andrés; Perales, José C

    2018-06-01

    Putting money at stake produces anticipatory uncertainty, a process that has been linked to key features of gambling. Here we examined how learning and individual differences modulate the stimulus preceding negativity (SPN, an electroencephalographic signature of perceived uncertainty of valued outcomes) in gambling disorder patients (GDPs) and healthy controls (HCs), during a non-gambling contingency learning task. Twenty-four GDPs and 26 HCs performed a causal learning task under conditions of high and medium uncertainty (HU, MU; null and positive cue-outcome contingency, respectively). Participants were asked to predict the outcome trial-by-trial, and to regularly judge the strength of the cue-outcome contingency. A pre-outcome SPN was extracted from simultaneous electroencephalographic recordings for each participant, uncertainty level, and task block. The two groups similarly learnt to predict the occurrence of the outcome in the presence/absence of the cue. In HCs, SPN amplitude decreased as the outcome became predictable in the MU condition, a decrement that was absent in the HU condition, where the outcome remained unpredictable during the task. Most importantly, GDPs' SPN remained high and insensitive to task type and block. In GDPs, the SPN amplitude was linked to gambling preferences. When both groups were considered together, SPN amplitude was also related to impulsivity. GDPs thus showed an abnormal electrophysiological response to outcome uncertainty, not attributable to faulty contingency learning. Differences with controls were larger in frequent players of passive games, and smaller in players of more active games. Potential psychological mechanisms underlying this set of effects are discussed.

  16. Capacity and Entry Deterrence under Demand Uncertainty

    DEFF Research Database (Denmark)

    Poddar, Sougata

I consider a two-period model with an incumbent firm and a potential entrant, each of whom produces a homogeneous good. There is demand uncertainty: demand can be high or low, and it is realized in the second period. The question I ask is how, by choosing capacity at an earlier period of actual production... of output and, more importantly, not knowing which state of demand is going to be realized, and knowing that there is a potential entrant, the incumbent firm can influence the outcome of the game by changing its initial condition. To that end, I study how the impact of the distribution of uncertainty deeply

  17. Field-theoretical space-uncertainty description

    International Nuclear Information System (INIS)

    Papp, E.; Micu, C.A.

    1980-01-01

An approach has been given to define both the nonzero minimum value of the space-uncertainty evaluation and the upper rest-mass bound of the particles involved. In this respect, the space-uncertainties which emerge both from the regularised quantum field theory and from high-energy behaviour are analysed. Under such conditions the particles involved are effectively non-point ones. It can then be concluded that the de Broglie dualism between waves and non-point particles is actually involved, now in more general terms

  18. Majorization uncertainty relations for mixed quantum states

    Science.gov (United States)

    Puchała, Zbigniew; Rudnicki, Łukasz; Krawiec, Aleksandra; Życzkowski, Karol

    2018-04-01

    Majorization uncertainty relations are generalized for an arbitrary mixed quantum state ρ of a finite size N. In particular, a lower bound for the sum of two entropies characterizing the probability distributions corresponding to measurements with respect to two arbitrary orthogonal bases is derived in terms of the spectrum of ρ and the entries of a unitary matrix U relating both bases. The results obtained can also be formulated for two measurements performed on a single subsystem of a bipartite system described by a pure state, and consequently expressed as an uncertainty relation for the sum of conditional entropies.
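
    As background for the entropic comparison mentioned above (standard results quoted from the literature, not the paper's new majorization bounds), for measurements in two orthonormal bases related by a unitary U, the Maassen-Uffink relation and its commonly cited mixed-state strengthening read

      H(p) + H(q) \;\ge\; -2\log_2 c + S(\rho), \qquad c = \max_{i,j}\,\lvert U_{ij}\rvert,

    where H is the Shannon entropy of the outcome distributions p and q, and S(\rho) is the von Neumann entropy of the state (zero for pure states, recovering the Maassen-Uffink bound). The precise constants and conventions should be checked against the paper itself.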

  19. Benchmarking observational uncertainties for hydrology (Invited)

    Science.gov (United States)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, to understand its information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), error characteristics measured (e.g. standard error, confidence bounds) and error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale, flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular when doing comparative/regionalisation and multi-objective analysis. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data-users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has

  20. Reducing the uncertainty in the fidelity of seismic imaging results

    Science.gov (United States)

    Zhou, H. W.; Zou, Z.

    2017-12-01

    A key aspect in geoscientific inversion is quantifying the quality of the results. In seismic imaging, we must quantify the uncertainty of every imaging result based on field data, because data noise and methodology limitations may produce artifacts. Detection of artifacts is therefore an important aspect in uncertainty quantification in geoscientific inversion. Quantifying the uncertainty of seismic imaging solutions means assessing their fidelity, which defines the truthfulness of the imaged targets in terms of their resolution, position error and artifact. Key challenges to achieving the fidelity of seismic imaging include: (1) Difficulty to tell signal from artifact and noise; (2) Limitations in signal-to-noise ratio and seismic illumination; and (3) The multi-scale nature of the data space and model space. Most seismic imaging studies of the Earth's crust and mantle have employed inversion or modeling approaches. Though they are in opposite directions of mapping between the data space and model space, both inversion and modeling seek the best model to minimize the misfit in the data space, which unfortunately is not the output space. The fact that the selection and uncertainty of the output model are not judged in the output space has exacerbated the nonuniqueness problem for inversion and modeling. In contrast, the practice in exploration seismology has long established a two-fold approach of seismic imaging: Using velocity modeling building to establish the long-wavelength reference velocity models, and using seismic migration to map the short-wavelength reflectivity structures. Most interestingly, seismic migration maps the data into an output space called imaging space, where the output reflection images of the subsurface are formed based on an imaging condition. A good example is the reverse time migration, which seeks the reflectivity image as the best fit in the image space between the extrapolation of time-reversed waveform data and the prediction

  1. Image restoration, uncertainty, and information.

    Science.gov (United States)

    Yu, F T

    1969-01-01

Some physical interpretations of image restoration are discussed. From the theory of information, the unrealizability of an inverse filter can be explained by the degradation of information due to distortion of the recorded image. Image restoration is a time and space problem, which can be recognized from the theory of relativity (the problem of image restoration is related to Heisenberg's uncertainty principle in quantum mechanics). A detailed discussion of the relationship between information and energy is given. Two general results may be stated: (1) the restoration of the image from the distorted signal is possible only if it satisfies the detectability condition; however, the restored image can, at best, only approach the maximum allowable time criterion. (2) The restoration of an image by superimposing the distorted signal (due to smearing) is a physically unrealizable method; however, this restoration procedure may be achieved by the expenditure of an infinite amount of energy.

  2. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    NARCIS (Netherlands)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if

  3. Approximating uncertainty of annual runoff and reservoir yield using stochastic replicates of global climate model data

    Science.gov (United States)

    Peel, M. C.; Srikanthan, R.; McMahon, T. A.; Karoly, D. J.

    2015-04-01

    Two key sources of uncertainty in projections of future runoff for climate change impact assessments are uncertainty between global climate models (GCMs) and within a GCM. Within-GCM uncertainty is the variability in GCM output that occurs when running a scenario multiple times but each run has slightly different, but equally plausible, initial conditions. The limited number of runs available for each GCM and scenario combination within the Coupled Model Intercomparison Project phase 3 (CMIP3) and phase 5 (CMIP5) data sets, limits the assessment of within-GCM uncertainty. In this second of two companion papers, the primary aim is to present a proof-of-concept approximation of within-GCM uncertainty for monthly precipitation and temperature projections and to assess the impact of within-GCM uncertainty on modelled runoff for climate change impact assessments. A secondary aim is to assess the impact of between-GCM uncertainty on modelled runoff. Here we approximate within-GCM uncertainty by developing non-stationary stochastic replicates of GCM monthly precipitation and temperature data. These replicates are input to an off-line hydrologic model to assess the impact of within-GCM uncertainty on projected annual runoff and reservoir yield. We adopt stochastic replicates of available GCM runs to approximate within-GCM uncertainty because large ensembles, hundreds of runs, for a given GCM and scenario are unavailable, other than the Climateprediction.net data set for the Hadley Centre GCM. To date within-GCM uncertainty has received little attention in the hydrologic climate change impact literature and this analysis provides an approximation of the uncertainty in projected runoff, and reservoir yield, due to within- and between-GCM uncertainty of precipitation and temperature projections. In the companion paper, McMahon et al. (2015) sought to reduce between-GCM uncertainty by removing poorly performing GCMs, resulting in a selection of five better performing GCMs from

  4. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty.

    Science.gov (United States)

    Kobayashi, Kenji; Hsu, Ming

    2017-07-19

    Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. Copyright © 2017 the authors 0270-6474/17/376972-11$15.00/0.
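
    As a toy illustration of the distinction drawn above (not the authors' model or data), the Python sketch below contrasts a Beta-Bernoulli learner that updates its belief when uncertainty is reducible with an agent that leaves its belief untouched when the outcome probability is known to be fixed, however surprising the observations are.

      import numpy as np

      rng = np.random.default_rng(3)

      def beta_mean(a, b):
          return a / (a + b)

      observations = rng.binomial(1, 0.7, size=50)

      # Reducible uncertainty: the outcome probability is unknown, so each
      # observation updates a Beta(a, b) belief.
      a, b = 1.0, 1.0
      for x in observations:
          a, b = a + x, b + (1 - x)

      # Irreducible uncertainty: the outcome is known to be a fair coin flip, so a
      # (quasi-optimal) agent ignores expectancy violations and keeps its belief.
      known_p = 0.5

      print(f"reducible:   posterior mean = {beta_mean(a, b):.2f}")
      print(f"irreducible: belief stays at {known_p:.2f}")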

  5. Investment and uncertainty

    DEFF Research Database (Denmark)

    Greasley, David; Madsen, Jakob B.

    2006-01-01

    A severe collapse of fixed capital formation distinguished the onset of the Great Depression from other investment downturns between the world wars. Using a model estimated for the years 1890-2000, we show that the expected profitability of capital measured by Tobin's q, and the uncertainty...... surrounding expected profits indicated by share price volatility, were the chief influences on investment levels, and that heightened share price volatility played the dominant role in the crucial investment collapse in 1930. Investment did not simply follow the downward course of income at the onset...

  6. Optimization under Uncertainty

    KAUST Repository

    Lopez, Rafael H.

    2016-01-06

    The goal of this poster is to present the main approaches to optimization of engineering systems in the presence of uncertainties. We begin by giving an insight about robust optimization. Next, we detail how to deal with probabilistic constraints in optimization, the so-called reliability-based design. Subsequently, we present the risk optimization approach, which includes the expected costs of failure in the objective function. After the basic description of each approach is given, the projects developed by CORE are presented. Finally, the main current topic of research of CORE is described.
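
    A minimal sketch of the risk optimization approach mentioned above is given below (Python), assuming an invented one-dimensional design problem: the objective adds a construction cost proportional to the design variable to the expected cost of failure, with the failure probability estimated by Monte Carlo from assumed load and capacity distributions. None of the numbers come from CORE's projects.

        import numpy as np

        rng = np.random.default_rng(1)

        def risk_objective(d, n_mc=100_000):
            # Design cost plus expected failure cost, with the failure probability from Monte Carlo.
            load = rng.normal(10.0, 2.0, n_mc)           # uncertain load
            capacity = d * rng.normal(1.0, 0.1, n_mc)    # uncertain capacity scales with design d
            pf = np.mean(capacity < load)                # probability of failure
            return 5.0 * d + 1000.0 * pf                 # construction cost + expected failure cost

        designs = np.linspace(8, 20, 25)
        best = min(designs, key=risk_objective)
        print(f"risk-optimal design variable: {best:.2f}")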

  7. Optimizing production under uncertainty

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses po-tential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept...... of state-contingent production functions and a definition of inputs including both sort of input, activity and alloca-tion technology. It also analyses production decisions where production is combined with trading in state-contingent claims such as insurance contracts. The final part discusses...

  8. Commonplaces and social uncertainty

    DEFF Research Database (Denmark)

    Lassen, Inger

    2008-01-01

    This article explores the concept of uncertainty in four focus group discussions about genetically modified food. In the discussions, members of the general public interact with food biotechnology scientists while negotiating their attitudes towards genetic engineering. Their discussions offer...... an example of risk discourse in which the use of commonplaces seems to be a central feature (Myers 2004: 81). My analyses support earlier findings that commonplaces serve important interactional purposes (Barton 1999) and that they are used for mitigating disagreement, for closing topics and for facilitating...

  9. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus

  10. Mathematical Analysis of Uncertainty

    Directory of Open Access Journals (Sweden)

    Angel GARRIDO

    2016-01-01

    Full Text Available Classical Logic showed its insufficiencies for solving AI problems early on. The introduction of Fuzzy Logic aims at this problem. There has been research in the conventional Rough direction alone or in the Fuzzy direction alone, and more recently, attempts to combine both into Fuzzy Rough Sets or Rough Fuzzy Sets. We analyse some new and powerful tools in the study of Uncertainty, such as Probabilistic Graphical Models, Chain Graphs, Bayesian Networks, and Markov Networks, integrating our knowledge of graphs and probability.

  11. Resolving structural uncertainty in natural resources management using POMDP approaches

    Science.gov (United States)

    Williams, B.K.

    2011-01-01

    In recent years there has been a growing focus on the uncertainties of natural resources management, and the importance of accounting for uncertainty in assessing management effectiveness. This paper focuses on uncertainty in resource management in terms of discrete-state Markov decision processes (MDP) under structural uncertainty and partial observability. It describes the treatment of structural uncertainty with approaches developed for partially observable resource systems. In particular, I show how value iteration for partially observable MDPs (POMDP) can be extended to structurally uncertain MDPs. A key difference between these process classes is that structurally uncertain MDPs require the tracking of system state as well as a probability structure for the structure uncertainty, whereas POMDPs require only a probability structure for the observation uncertainty. The added complexity of the optimization problem under structural uncertainty is compensated by reduced dimensionality in the search for an optimal strategy. A solution algorithm for structurally uncertain processes is outlined for a simple example in conservation biology. By building on the conceptual framework developed for POMDPs, natural resource analysts and decision makers who confront structural uncertainties in natural resources can take advantage of the rapid growth in POMDP methods and approaches, and thereby produce better conservation strategies over a larger class of resource problems. © 2011.
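
    The belief-tracking step that distinguishes structurally uncertain MDPs can be illustrated compactly. The sketch below (Python) is a hypothetical two-model, two-state example, not the paper's conservation case study: the system state is observed directly, and a Bayes update shifts the probability placed on each candidate transition structure after every observed transition; an adaptive policy would then weight model-specific values by this belief.

        import numpy as np

        # Two candidate (structurally uncertain) transition models for a 2-state resource system.
        # Indexing: T[model, action, state, next_state]
        T = np.array([
            [[[0.8, 0.2], [0.3, 0.7]], [[0.6, 0.4], [0.1, 0.9]]],   # model A
            [[[0.5, 0.5], [0.5, 0.5]], [[0.9, 0.1], [0.2, 0.8]]],   # model B
        ])

        def update_model_belief(belief, action, state, next_state):
            # Bayes update of the weight on each structural model after an observed transition.
            likelihood = T[:, action, state, next_state]
            post = belief * likelihood
            return post / post.sum()

        belief = np.array([0.5, 0.5])
        belief = update_model_belief(belief, action=1, state=0, next_state=0)
        print(belief)   # weight shifts toward the model that better explains the observed transition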

  12. Facing uncertainty in ecosystem services-based resource management.

    Science.gov (United States)

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Uncertainty models applied to the substation planning

    Energy Technology Data Exchange (ETDEWEB)

    Fontoura Filho, Roberto N [ELETROBRAS, Rio de Janeiro, RJ (Brazil); Aires, Joao Carlos O; Tortelly, Debora L.S. [Light Servicos de Eletricidade S.A., Rio de Janeiro, RJ (Brazil)

    1994-12-31

    Selecting reinforcements for a power system expansion becomes a difficult task in an environment of uncertainty. These uncertainties can be classified according to their sources as endogenous or exogenous. Endogenous uncertainty is associated with the elements of the generation, transmission and distribution systems. Exogenous uncertainty is associated with external aspects such as financial resources, the time needed to build the installations, equipment prices and the load level. Load uncertainty is extremely sensitive to the behaviour of economic conditions. Although uncertainty cannot be removed completely, the endogenous part can be conveniently treated and the exogenous part can be compensated for. This paper describes an uncertainty treatment methodology and a practical application to a group of substations belonging to LIGHT company, the Rio de Janeiro electric utility. Equipment performance uncertainty is treated by adopting a probabilistic approach. The uncertainty associated with load growth is considered by using scenario analysis and choice criteria based on decision theory. The Savage method and the fuzzy set method are used to select the best medium-term reinforcement plan. (author) 7 refs., 4 figs., 6 tabs.
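
    The Savage (minimax regret) criterion mentioned in the abstract can be sketched directly on a cost matrix. The numbers below are invented for illustration; rows are candidate reinforcement plans and columns are load-growth scenarios. The fuzzy set method would instead attach membership functions to the scenarios, which is not shown here.

        import numpy as np

        # Hypothetical present-value costs: rows = reinforcement plans, columns = load scenarios.
        cost = np.array([
            [120.0, 150.0, 210.0],   # plan 1
            [140.0, 145.0, 180.0],   # plan 2
            [170.0, 160.0, 165.0],   # plan 3
        ])

        regret = cost - cost.min(axis=0)             # regret against the best plan in each scenario
        savage_choice = regret.max(axis=1).argmin()  # plan with the smallest worst-case regret
        print(f"Savage (minimax regret) choice: plan {savage_choice + 1}")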

  14. Uncertainty analysis of energy consumption in dwellings

    Energy Technology Data Exchange (ETDEWEB)

    Pettersen, Trine Dyrstad

    1997-12-31

    This thesis presents a comprehensive study of an energy estimation model that can be used to examine the uncertainty of predicted energy consumption in a dwelling. The variation and uncertainty of input parameters due to the outdoor climate, the building construction and the inhabitants are studied as a basis for further energy evaluations. The variations in energy consumption that occur among nominally similar dwellings are also investigated in order to verify the simulated energy consumption. The main topics are (1) a study of expected variations and uncertainties in both input parameters used in energy consumption calculations and the energy consumption in the dwelling, (2) the development and evaluation of a simplified energy calculation model that considers uncertainties due to the input parameters, (3) an evaluation of the influence of the uncertain parameters on the total variation so that the most important parameters can be identified, and (4) the recommendation of a simplified procedure for treating uncertainties or possible deviations from average conditions. 90 refs., 182 figs., 73 tabs.

  15. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    Science.gov (United States)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of transition from laminar to turbulent flow and the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading which causes them to deform. Uncertainty associated with deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model which accounts for dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the

  16. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    Science.gov (United States)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in
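
    A condensed sketch of the response-surface-plus-Monte-Carlo workflow is shown below (Python). The quadratic response surface, its coefficients and the input distributions are invented stand-ins, not the fitted mixture model or uncertainty estimates from the study; the point is only that dispersed inputs pushed through a fitted surface yield a dispersed regression-rate prediction.

        import numpy as np

        rng = np.random.default_rng(2)

        def regression_rate(x1, x2, x3):
            # Hypothetical quadratic response surface with single- and two-factor interaction terms.
            return (0.8 + 0.30 * x1 + 0.15 * x2 - 0.10 * x3
                    + 0.05 * x1 * x2 - 0.04 * x2 * x3
                    - 0.02 * x1 ** 2 + 0.01 * x3 ** 2)

        n = 200_000
        x1 = rng.normal(0.5, 0.05, n)    # mixture component 1 (coded units) with uncertainty
        x2 = rng.normal(0.3, 0.03, n)    # mixture component 2
        x3 = rng.normal(0.2, 0.02, n)    # operational condition (coded units)

        r = regression_rate(x1, x2, x3)
        print(f"mean = {r.mean():.3f}, 95% interval = "
              f"[{np.percentile(r, 2.5):.3f}, {np.percentile(r, 97.5):.3f}]")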

  17. Probabilistic Mass Growth Uncertainties

    Science.gov (United States)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  18. Metatarsalgia located by synovitis and uncertainty of the articulation metatarsus-phalanges of the II toe

    International Nuclear Information System (INIS)

    Gerstner G, Juan Bernardo

    2002-01-01

    Synovitis and uncertainty of the metatarsophalangeal (MP) joint of the second toe are the most frequent causes of metatarsalgia localized in this joint of the foot, a condition frequently misdiagnosed and poorly managed by the general orthopedist. The natural history ranges from stages as early as synovitis without alteration of the peri-articular structures, through frank uncertainty of the joint, to angular deformities and complete luxation of the MP joint. A meticulous and directed history, a precise physical examination and classification of the diagnosis are the keys to successful management of this pathology. Surgical correction of this condition should always be combined with correction of associated deformities such as hallux valgus and claw toes.

  19. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the

  20. Embracing uncertainty in applied ecology.

    Science.gov (United States)

    Milner-Gulland, E J; Shea, K

    2017-12-01

    Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications. Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.

  1. Oil price uncertainty in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States); Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  2. Intrinsic position uncertainty impairs overt search performance.

    Science.gov (United States)

    Semizer, Yelda; Michel, Melchi M

    2017-08-01

    Uncertainty regarding the position of the search target is a fundamental component of visual search. However, due to perceptual limitations of the human visual system, this uncertainty can arise from intrinsic, as well as extrinsic, sources. The current study sought to characterize the role of intrinsic position uncertainty (IPU) in overt visual search and to determine whether it significantly limits human search performance. After completing a preliminary detection experiment to characterize sensitivity as a function of visual field position, observers completed a search task that required localizing a Gabor target within a field of synthetic luminance noise. The search experiment included two clutter conditions designed to modulate the effect of IPU across search displays of varying set size. In the Cluttered condition, the display was tiled uniformly with feature clutter to maximize the effects of IPU. In the Uncluttered condition, the clutter at irrelevant locations was removed to attenuate the effects of IPU. Finally, we derived an IPU-constrained ideal searcher model, limited by the IPU measured in human observers. Ideal searchers were simulated based on the detection sensitivity and fixation sequences measured for individual human observers. The IPU-constrained ideal searcher predicted performance trends similar to those exhibited by the human observers. In the Uncluttered condition, performance decreased steeply as a function of increasing set size. However, in the Cluttered condition, the effect of IPU dominated and performance was approximately constant as a function of set size. Our findings suggest that IPU substantially limits overt search performance, especially in crowded displays.

  3. The cerebellum and decision making under uncertainty.

    Science.gov (United States)

    Blackwood, Nigel; Ffytche, Dominic; Simmons, Andrew; Bentall, Richard; Murray, Robin; Howard, Robert

    2004-06-01

    This study aimed to identify the neural basis of probabilistic reasoning, a type of inductive inference that aids decision making under conditions of uncertainty. Eight normal subjects performed two separate two-alternative-choice tasks (the balls in a bottle and personality survey tasks) while undergoing functional magnetic resonance imaging (fMRI). The experimental conditions within each task were chosen so that they differed only in their requirement to make a decision under conditions of uncertainty (probabilistic reasoning and frequency determination required) or under conditions of certainty (frequency determination required). The same visual stimuli and motor responses were used in the experimental conditions. We provide evidence that the neo-cerebellum, in conjunction with the premotor cortex, inferior parietal lobule and medial occipital cortex, mediates the probabilistic inferences that guide decision making under uncertainty. We hypothesise that the neo-cerebellum constructs internal working models of uncertain events in the external world, and that such probabilistic models subserve the predictive capacity central to induction. Copyright 2004 Elsevier B.V.

  4. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
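
    For orientation, the sketch below (Python) shows the basic margin-to-uncertainty ratio that QMU-type analyses report, using an interval representation of epistemic uncertainty in the performance quantity. The threshold and interval are invented numbers, and real QMU analyses separate aleatory and epistemic contributions rather than using a single interval as done here.

        # Minimal QMU-style confidence ratio with an interval-valued (epistemic) estimate.
        threshold = 100.0                    # requirement: the performance quantity must stay below this
        estimate_interval = (70.0, 85.0)     # epistemic interval for the performance quantity

        margin = threshold - estimate_interval[1]                          # worst-case distance to threshold
        uncertainty = (estimate_interval[1] - estimate_interval[0]) / 2.0  # half-width as the uncertainty measure
        print(f"M = {margin:.1f}, U = {uncertainty:.1f}, M/U = {margin / uncertainty:.2f}")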

  5. State-independent uncertainty relations and entanglement detection

    Science.gov (United States)

    Qian, Chen; Li, Jun-Li; Qiao, Cong-Feng

    2018-04-01

    The uncertainty relation is one of the key ingredients of quantum theory. Despite the great efforts devoted to this subject, most of the variance-based uncertainty relations are state-dependent and suffering from the triviality problem of zero lower bounds. Here we develop a method to get uncertainty relations with state-independent lower bounds. The method works by exploring the eigenvalues of a Hermitian matrix composed by Bloch vectors of incompatible observables and is applicable for both pure and mixed states and for arbitrary number of N-dimensional observables. The uncertainty relation for the incompatible observables can be explained by geometric relations related to the parallel postulate and the inequalities in Horn's conjecture on Hermitian matrix sum. Practical entanglement criteria are also presented based on the derived uncertainty relations.

  6. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, C. J.

    2017-12-01

    demonstrating metrologically sound methodologies addressing this problem for four key historical CDRs. FIDUCEO methods of uncertainty analysis (which also tend to lead to improved FCDRs and CDRs) could support coherent treatment of uncertainty across FCDRs to CDRs and higher level products for a wide range of essential climate variables.

  7. Knowledge management system for risk mitigation in supply chain uncertainty: case from automotive battery supply chain

    Science.gov (United States)

    Marie, I. A.; Sugiarto, D.; Surjasa, D.; Witonohadi, A.

    2018-01-01

    The automotive battery supply chain includes the battery manufacturer, sulphuric acid suppliers, polypropylene suppliers, lead suppliers, transportation service providers, warehouses, retailers and even customers. Because of the increasingly dynamic condition of the environment, supply chain actors are required to improve their ability to overcome various sources of uncertainty in that environment. This paper aims to describe the process of designing a knowledge management system for risk mitigation under supply chain uncertainty. The design methodology began with the identification of the knowledge needed to solve the problems associated with uncertainty and an analysis of system requirements. The design of the knowledge management system is described in the form of a data flow diagram. The results of the study indicate that the key knowledge areas that need to be managed are the knowledge required to maintain process stability in the sulphuric acid process and the knowledge required to overcome waste in the battery manufacturing process. The system is expected to serve as a medium for the acquisition, dissemination and storage of knowledge associated with uncertainty in the battery supply chain and to increase supply chain performance.

  8. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    Science.gov (United States)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies difference in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics

  9. The Evolving Transmission of Uncertainty Shocks in the United Kingdom

    Directory of Open Access Journals (Sweden)

    Haroon Mumtaz

    2016-03-01

    Full Text Available This paper investigates if the impact of uncertainty shocks on the U.K. economy has changed over time. To this end, we propose an extended time-varying VAR model that simultaneously allows the estimation of a measure of uncertainty and its time-varying impact on key macroeconomic and financial variables. We find that the impact of uncertainty shocks on these variables has declined over time. The timing of the change coincides with the introduction of inflation targeting in the U.K.

  10. Neural Correlates of Intolerance of Uncertainty in Clinical Disorders.

    Science.gov (United States)

    Wever, Mirjam; Smeets, Paul; Sternheim, Lot

    2015-01-01

    Intolerance of uncertainty is a key contributor to anxiety-related disorders. Recent studies highlight its importance in other clinical disorders. The link between its clinical presentation and the underlying neural correlates remains unclear. This review summarizes the emerging literature on the neural correlates of intolerance of uncertainty. In conclusion, studies focusing on the neural correlates of this construct are sparse, and findings are inconsistent across disorders. Future research should identify neural correlates of intolerance of uncertainty in more detail. This may unravel the neurobiology of a wide variety of clinical disorders and pave the way for novel therapeutic targets.

  11. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous and different interpretations are used in the literature. Recently a renewed interest has appeared to reinterpret and reformulate the precise meaning of Heisenberg's principle and to find adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs

  12. Uncertainty as Certainty

    Science.gov (United States)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  13. Orientation and uncertainties

    International Nuclear Information System (INIS)

    Peters, H.P.; Hennen, L.

    1990-01-01

    The authors report on the results of three representative surveys that made a closer inquiry into perceptions and valuations of information and information sources concerning Chernobyl. It turns out that the information sources are generally considered to be not very trustworthy. This was generally attributable to the interpretation of the events being tied to attitudes in the atomic energy issue. The greatest credit was given to television broadcasting. The authors summarize their discourse as follows: There is good reason to interpret the widespread uncertainty after Chernobyl as proof of the fact that large parts of the population are prepared and willing to assume a critical stance towards information and prefer to draw their information from various sources representing different positions. (orig.) [de

  14. DOD ELAP Lab Uncertainties

    Science.gov (United States)

    2012-03-01

    Accreditation and certification references from the presentation: ISO/IEC 17025 for testing and calibration laboratories (the DoD QSM 4.2 standard is built on ISO/IEC 17025:2005), ISO/IEC 17020 for inspection bodies, ISO Guide 34 for reference material producers, and ISO/IEC 17021 for certification bodies that certify management systems to ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US automotive), etc., together with related training and accreditation programs (NLLAP, NEFAP). Each has uncertainty ...

  15. Traceability and Measurement Uncertainty

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    The project partnership (composed of 7 partners in 5 countries, thus covering a real European spread in high-tech production technology) aims to develop and implement an advanced e-learning system that integrates contributions from quite different disciplines into a user-centred approach that strictly....... Machine tool testing 9. The role of manufacturing metrology for QM 10. Inspection planning 11. Quality management of measurements incl. Documentation 12. Advanced manufacturing measurement technology The present report (which represents the section 2 - Traceability and Measurement Uncertainty – of the e-learning......This report is made as a part of the project ‘Metro-E-Learn: European e-Learning in Manufacturing Metrology’, an EU project under the program SOCRATES MINERVA (ODL and ICT in Education), Contract No: 101434-CP-1-2002-1-DE-MINERVA, coordinated by Friedrich-Alexander-University Erlangen...

  16. Decision making under uncertainty

    International Nuclear Information System (INIS)

    Cyert, R.M.

    1989-01-01

    This paper reports on ways of improving the reliability of products and systems in this country if we are to survive as a first-rate industrial power. The use of statistical techniques have, since the 1920s, been viewed as one of the methods for testing quality and estimating the level of quality in a universe of output. Statistical quality control is not relevant, generally, to improving systems in an industry like yours, but certainly the use of probability concepts is of significance. In addition, when it is recognized that part of the problem involves making decisions under uncertainty, it becomes clear that techniques such as sequential decision making and Bayesian analysis become major methodological approaches that must be utilized

  17. Sustainability and uncertainty

    DEFF Research Database (Denmark)

    Jensen, Karsten Klint

    2007-01-01

    The widely used concept of sustainability is seldom precisely defined, and its clarification involves making up one's mind about a range of difficult questions. One line of research (bottom-up) takes sustaining a system over time as its starting point and then infers prescriptions from...... this requirement. Another line (top-down) takes an economic interpretation of the Brundtland Commission's suggestion that the present generation's need-satisfaction should not compromise the need-satisfaction of future generations as its starting point. It then measures sustainability at the level of society...... a clarified ethical goal, disagreements can arise. At present we do not know what substitutions will be possible in the future. This uncertainty clearly affects the prescriptions that follow from the measure of sustainability. Consequently, decisions about how to make future agriculture sustainable...

  18. Uncertainty Quantification in Alchemical Free Energy Methods.

    Science.gov (United States)

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-05-02

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation-an ensemble of independent MD simulations-which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of time of the molecular dynamics simulations performed.
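
    The ensemble-based error estimate advocated above reduces, in its simplest form, to summarizing the spread of independent replica results. The sketch below (Python) bootstraps a confidence interval from a set of hypothetical binding free energies; the values are invented, and the real protocol operates on full ensembles of MD simulations per prediction rather than on a single list of numbers.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical binding free energies (kcal/mol) from independent MD replicas of one calculation.
        replicas = np.array([-7.9, -8.4, -8.1, -7.6, -8.3, -8.0, -8.6, -7.8, -8.2, -8.1])

        # Bootstrap the uncertainty of the ensemble mean.
        boot = np.array([rng.choice(replicas, size=replicas.size, replace=True).mean()
                         for _ in range(10_000)])
        print(f"ensemble mean = {replicas.mean():.2f} kcal/mol, "
              f"95% CI = [{np.percentile(boot, 2.5):.2f}, {np.percentile(boot, 97.5):.2f}]")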

  19. Medical Humanities: The Rx for Uncertainty?

    Science.gov (United States)

    Ofri, Danielle

    2017-12-01

    While medical students often fear the avalanche of knowledge they are required to learn during training, it is learning to translate that knowledge into wisdom that is the greatest challenge of becoming a doctor. Part of that challenge is learning to tolerate ambiguity and uncertainty, a difficult feat for doctors who are taught to question anything that is not evidence based or peer reviewed. The medical humanities specialize in this ambiguity and uncertainty, which are hallmarks of actual clinical practice but rarely addressed in medical education. The humanities also force reflection and contemplation-skills that are crucial to thoughtful decision making and to personal wellness. Beyond that, the humanities add a dose of joy and beauty to a training process that is notoriously frugal in these departments. Well integrated, the humanities can be the key to transforming medical knowledge into clinical wisdom.

  20. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in the current research into the etiology of biomedical problems and potential treatment strategies.  Computational modeling allows to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However these biomedical problems are inherently complex with a myriad of influencing factors, which strongly complicates the model building and validation process.  This book wants to address four main issues related to the building and validation of computational models of biomedical processes: Modeling establishment under uncertainty Model selection and parameter fitting Sensitivity analysis and model adaptation Model predictions under uncertainty In each of the abovementioned areas, the book discusses a number of key-techniques by means of a general theoretical description followed by one or more practical examples.  This book is intended for graduate stude...

  1. Quantum key management

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Richard John; Thrasher, James Thomas; Nordholt, Jane Elizabeth

    2016-11-29

    Innovations for quantum key management harness quantum communications to form a cryptography system within a public key infrastructure framework. In example implementations, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a Merkle signature scheme (using Winternitz one-time digital signatures or other one-time digital signatures, and Merkle hash trees) to constitute a cryptography system. More generally, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a hash-based signature scheme. This provides a secure way to identify, authenticate, verify, and exchange secret cryptographic keys. Features of the quantum key management innovations further include secure enrollment of users with a registration authority, as well as credential checking and revocation with a certificate authority, where the registration authority and/or certificate authority can be part of the same system as a trusted authority for quantum key distribution.
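
    One building block named in this record, the Merkle hash tree over one-time signature public keys, is easy to sketch. The code below (Python, standard library only) computes a Merkle root over placeholder byte strings standing in for Winternitz one-time public keys; it is not the described quantum key management system itself, only the hash-tree construction it names.

        import hashlib

        def h(data: bytes) -> bytes:
            return hashlib.sha256(data).digest()

        def merkle_root(leaves):
            # Root of a binary Merkle tree over the hashes of the leaves.
            level = [h(leaf) for leaf in leaves]
            while len(level) > 1:
                if len(level) % 2:                  # duplicate the last node on odd-sized levels
                    level.append(level[-1])
                level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            return level[0]

        # Placeholders for Winternitz one-time public keys (real keys come from the OTS key generation).
        ots_public_keys = [f"ots-public-key-{i}".encode() for i in range(8)]
        print(merkle_root(ots_public_keys).hex())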

  2. Uncertainty relations and semi-groups in B-algebras

    International Nuclear Information System (INIS)

    Papaloucas, L.C.

    1980-07-01

    Starting from a B-algebra which satisfies the conditions of a structure theorem, we directly obtain a Lie algebra whose Lie ring automatically satisfies the Heisenberg uncertainty relations. (author)

  3. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  4. Group key management

    Energy Technology Data Exchange (ETDEWEB)

    Dunigan, T.; Cao, C.

    1997-08-01

    This report describes an architecture and implementation for doing group key management over a data communications network. The architecture describes a protocol for establishing a shared encryption key among an authenticated and authorized collection of network entities. Group access requires one or more authorization certificates. The implementation includes a simple public key and certificate infrastructure. Multicast is used for some of the key management messages. An application programming interface multiplexes key management and user application messages. An implementation using the new IP security protocols is postulated. The architecture is compared with other group key management proposals, and the performance and the limitations of the implementation are described.
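
    A toy version of the central operation, distributing a fresh group key wrapped under each authorized member's key and rekeying after revocation, is sketched below. It uses symmetric key wrapping from the third-party Python 'cryptography' package as a stand-in; the architecture in the report additionally involves authorization certificates, a public key infrastructure and multicast delivery, none of which are modelled here.

        from cryptography.fernet import Fernet   # symmetric key wrapping as a stand-in for the real scheme

        # Pairwise keys shared between the key server and each authorized member.
        members = {name: Fernet.generate_key() for name in ("alice", "bob", "carol")}

        def distribute_group_key(member_keys):
            # Generate a fresh group key and wrap it for every currently authorized member.
            group_key = Fernet.generate_key()
            wrapped = {name: Fernet(k).encrypt(group_key) for name, k in member_keys.items()}
            return group_key, wrapped

        group_key, wrapped = distribute_group_key(members)
        assert Fernet(members["bob"]).decrypt(wrapped["bob"]) == group_key

        # Revocation: drop the member, then redistribute a new group key to the remaining set.
        members.pop("carol")
        group_key, wrapped = distribute_group_key(members)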

  5. Modular Connector Keying Concept

    Science.gov (United States)

    Ishman, Scott; Dukes, Scott; Warnica, Gary; Conrad, Guy; Senigla, Steven

    2013-01-01

    For panel-mount-type connectors, keying is usually "built-in" to the connector body, necessitating different part numbers for each key arrangement. This is costly for jobs that require small quantities. This invention was driven to provide a cost savings and to reduce documentation of individual parts. The keys are removable and configurable in up to 16 combinations. Since the key parts are separate from the connector body, a common design can be used for the plug, receptacle, and key parts. The keying can then be set at the next higher assembly.

  6. Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce significant conservatism. A key feature of this BE evaluation is that the licensee must quantify the uncertainty of the calculations. It is therefore very important how the uncertainty distributions are determined before conducting the uncertainty evaluation. Uncertainties include those of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or taken from the reference documents of the computer code. In this respect, more mathematical methods are needed to determine the uncertainty ranges reasonably. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to obtain the calculated responses and their derivatives. Different data sets with two influential uncertainty parameters for the FEBA tests are chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered the user effect in CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses, such as the cladding temperature or pressure drop, lie within the calculated uncertainty bounds. A confirmation step is performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of the physical models in the MARS-KS thermal-hydraulic code during reflood were quantified by the CIRCÉ method using the FEBA experimental tests, instead of by expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed

  7. Essays on model uncertainty in financial models

    NARCIS (Netherlands)

    Li, Jing

    2018-01-01

    This dissertation studies model uncertainty, particularly in financial models. It consists of two empirical chapters and one theoretical chapter. The first empirical chapter (Chapter 2) classifies model uncertainty into parameter uncertainty and misspecification uncertainty. It investigates the

  8. A new uncertainty importance measure

    International Nuclear Information System (INIS)

    Borgonovo, E.

    2007-01-01

    Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can be defined also in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures
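
    The moment-independent indicator described above can be approximated by comparing the unconditional output density with densities conditional on fixing one input, averaged over that input's range. The sketch below (Python) does this with histograms for an invented two-input toy model; it is a crude numerical approximation of such a delta-type measure, not the paper's estimator or its risk assessment application.

        import numpy as np

        rng = np.random.default_rng(4)

        def model(x):
            return x[:, 0] + 2.0 * x[:, 1] ** 2          # toy risk-model output

        n, bins = 200_000, 60
        x = rng.normal(size=(n, 2))
        y = model(x)
        f_y, edges = np.histogram(y, bins=bins, density=True)
        width = np.diff(edges)

        def delta(i, n_cond=20):
            # Expected L1 shift of the output density when X_i is fixed, averaged over X_i values.
            shifts = []
            for xi in np.quantile(x[:, i], np.linspace(0.05, 0.95, n_cond)):
                xc = x.copy()
                xc[:, i] = xi
                f_cond, _ = np.histogram(model(xc), bins=edges, density=True)
                shifts.append(0.5 * np.sum(np.abs(f_y - f_cond) * width))
            return np.mean(shifts)

        print(f"delta_1 = {delta(0):.2f}, delta_2 = {delta(1):.2f}")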

  9. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes...... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main...... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  10. Additivity of entropic uncertainty relations

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2018-03-01

    Full Text Available We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
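
    For a concrete feel of the quantities involved, the sketch below (Python) checks the Maassen and Uffink bound, H(A) + H(B) >= -log2 max_{i,j} |<a_i|b_j>|^2, for a single qubit measured in the computational and Hadamard bases; the state is arbitrary and the bound evaluates to one bit for these two bases. This is only the standard two-observable inequality that the paper generalizes, not the paper's additivity result.

        import numpy as np

        def shannon(p):
            p = p[p > 0]
            return -(p * np.log2(p)).sum()

        # Two incompatible qubit measurements: computational (Z) and Hadamard (X) bases (columns = eigenvectors).
        Z = np.eye(2)
        X = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

        psi = np.array([np.cos(0.3), np.sin(0.3)])        # an arbitrary pure state
        pZ = np.abs(Z.conj().T @ psi) ** 2
        pX = np.abs(X.conj().T @ psi) ** 2

        c = np.max(np.abs(Z.conj().T @ X)) ** 2           # largest squared overlap between the two bases
        print(shannon(pZ) + shannon(pX), ">=", -np.log2(c))   # Maassen-Uffink bound (1 bit here)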

  11. Technical note: Design flood under hydrological uncertainty

    Science.gov (United States)

    Botto, Anna; Ganora, Daniele; Claps, Pierluigi; Laio, Francesco

    2017-07-01

    Planning and verification of hydraulic infrastructures require a design estimate of hydrologic variables, usually provided by frequency analysis, and neglecting hydrologic uncertainty. However, when hydrologic uncertainty is accounted for, the design flood value for a specific return period is no longer a unique value, but is represented by a distribution of values. As a consequence, the design flood is no longer univocally defined, making the design process undetermined. The Uncertainty Compliant Design Flood Estimation (UNCODE) procedure is a novel approach that, starting from a range of possible design flood estimates obtained in uncertain conditions, converges to a single design value. This is obtained through a cost-benefit criterion with additional constraints that is numerically solved in a simulation framework. This paper contributes to promoting a practical use of the UNCODE procedure without resorting to numerical computation. A modified procedure is proposed by using a correction coefficient that modifies the standard (i.e., uncertainty-free) design value on the basis of sample length and return period only. The procedure is robust and parsimonious, as it does not require additional parameters with respect to the traditional uncertainty-free analysis. Simple equations to compute the correction term are provided for a number of probability distributions commonly used to represent the flood frequency curve. The UNCODE procedure, when coupled with this simple correction factor, provides a robust way to manage the hydrologic uncertainty and to go beyond the use of traditional safety factors. With all the other parameters being equal, an increase in the sample length reduces the correction factor, and thus the construction costs, while still keeping the same safety level.
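
    The workflow implied above, estimate an uncertainty-free design flood from a fitted distribution and then inflate it with a correction factor that depends only on sample length and return period, can be sketched as follows (Python with scipy). The Gumbel sample and, in particular, the form and coefficients of the correction factor are invented placeholders; the paper provides the actual equations for several flood-frequency distributions.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        annual_max = stats.gumbel_r.rvs(loc=300, scale=80, size=40, random_state=rng)  # hypothetical record

        T = 100                                               # design return period (years)
        loc, scale = stats.gumbel_r.fit(annual_max)
        q_std = stats.gumbel_r.ppf(1 - 1.0 / T, loc, scale)   # standard, uncertainty-free design flood

        # Hypothetical correction factor depending only on sample length n and return period T,
        # standing in for the paper's fitted coefficients.
        n = annual_max.size
        correction = 1.0 + 0.8 * np.sqrt(np.log(T)) / np.sqrt(n)
        print(f"standard estimate: {q_std:.0f} m3/s, "
              f"uncertainty-compliant estimate: {q_std * correction:.0f} m3/s")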

  12. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    International Nuclear Information System (INIS)

    Wagner, Ryan; Raman, Arvind; Moon, Robert; Pratt, Jon; Shaw, Gordon

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7–20 GPa. A key result is that multiple replicates of force–distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials.
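
    The propagation step behind such a framework can be sketched as combining relative standard uncertainties from the main calibration sources, either by root-sum-square or by Monte Carlo. The component values below are placeholders chosen for illustration, not the ones reported for the cellulose nanocrystal measurements, and the real analysis also has to handle the systematic, non-Gaussian character of some sources.

        import numpy as np

        rng = np.random.default_rng(6)

        # Illustrative relative standard uncertainties for an AFM-based modulus measurement.
        components = {
            "photodiode sensitivity": 0.10,
            "cantilever stiffness":   0.05,
            "Z-piezo calibration":    0.02,
        }

        nominal_modulus = 8.1                                   # GPa (nominal value from the abstract)
        n = 100_000
        factor = np.ones(n)
        for rel_u in components.values():                       # propagate multiplicative error sources
            factor *= rng.normal(1.0, rel_u, n)

        samples = nominal_modulus * factor
        rss = np.sqrt(sum(u ** 2 for u in components.values()))
        print(f"combined relative uncertainty (RSS): {rss:.3f}")
        print(f"Monte Carlo 95% interval: [{np.percentile(samples, 2.5):.1f}, "
              f"{np.percentile(samples, 97.5):.1f}] GPa")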

  13. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  14. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib; Galassi, R. Malpica; Valorani, M.

    2016-01-01

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  15. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  16. The Uncertainty of Measurement Results

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae are given, and the theoretical concepts are explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the underlying principles. Appendix I provides a practical, elaborated example of measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)
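
    A minimal sketch of the combination step described above, assuming independent relative components that are combined in quadrature and expanded with a coverage factor k = 2; the component names and values are illustrative, not taken from the cited guidance.

      import math

      components = {                  # relative standard uncertainties (as fractions)
          "repeatability": 0.021,
          "reproducibility": 0.034,
          "calibration_standard": 0.010,
          "sample_processing": 0.025,
      }

      u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
      U_expanded = 2.0 * u_combined   # k = 2, roughly 95% coverage for a normal distribution

      result = 1.84                   # e.g. measured active-ingredient content, g/kg
      print(f"result = {result} g/kg +/- {U_expanded * result:.2f} g/kg (k = 2)")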

  17. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137 Cs and 131 I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition

  18. Communicating weather forecast uncertainty: Do individual differences matter?

    Science.gov (United States)

    Grounds, Margaret A; Joslyn, Susan L

    2018-03-01

    Research suggests that people make better weather-related decisions when they are given numeric probabilities for critical outcomes (Joslyn & Leclerc, 2012, 2013). However, it is unclear whether all users can take advantage of probabilistic forecasts to the same extent. The research reported here assessed key cognitive and demographic factors to determine their relationship to the use of probabilistic forecasts to improve decision quality. In two studies, participants decided between spending resources to prevent icy conditions on roadways or risk a larger penalty when freezing temperatures occurred. Several forecast formats were tested, including a control condition with the night-time low temperature alone and experimental conditions that also included the probability of freezing and advice based on expected value. All but those with extremely low numeracy scores made better decisions with probabilistic forecasts. Importantly, no groups made worse decisions when probabilities were included. Moreover, numeracy was the best predictor of decision quality, regardless of forecast format, suggesting that the advantage may extend beyond understanding the forecast to general decision strategy issues. This research adds to a growing body of evidence that numerical uncertainty estimates may be an effective way to communicate weather danger to general public end users. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
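
    The expected-value advice mentioned above reduces to a simple threshold rule; a toy version with invented costs and probabilities is sketched below.

      def advise(p_freeze, treatment_cost=1000.0, ice_penalty=6000.0):
          """Treat the roads when the expected loss from ice exceeds the treatment cost."""
          return "treat roads" if p_freeze * ice_penalty > treatment_cost else "do not treat"

      for p in (0.05, 0.20, 0.60):
          print(f"P(freeze) = {p:.2f} -> {advise(p)}")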

  19. Effects of input uncertainty on cross-scale crop modeling

    Science.gov (United States)

    Waha, Katharina; Huth, Neil; Carberry, Peter

    2014-05-01

    The quality of data on climate, soils and agricultural management in the tropics is in general low, or data are scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and the choice of model parameters, is a key factor for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time-series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input

  20. Biometry, the safe key

    Directory of Open Access Journals (Sweden)

    María Fraile-Hurtado

    2010-12-01

    Full Text Available Biometry is the next step in authentication, so why do we not take this step forward in our communication security systems? Keys are the main weakness in cryptography: what if we were our own key?

  1. Financial Key Ratios

    OpenAIRE

    Tănase Alin-Eliodor

    2014-01-01

    This article focuses on techniques for computing financial key ratios from trial balance data. Activity, liquidity, solvency and profitability key ratios are presented, together with a three-step computing methodology based on the trial balance.

  2. Public Key Cryptography.

    Science.gov (United States)

    Tapson, Frank

    1996-01-01

    Describes public key cryptography, also known as RSA, which is a system using two keys, one used to put a message into cipher and another used to decipher the message. Presents examples using small prime numbers. (MKR)
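
    In the spirit of the small-prime examples described above, a textbook RSA round trip might look like the sketch below; real deployments use primes hundreds of digits long and padding schemes, which are omitted here.

      p, q = 11, 13
      n = p * q                      # public modulus
      phi = (p - 1) * (q - 1)        # Euler's totient of n
      e = 7                          # public exponent, chosen coprime with phi
      d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

      message = 9
      cipher = pow(message, e, n)    # encipher with the public key (e, n)
      plain = pow(cipher, d, n)      # decipher with the private key (d, n)
      print(cipher, plain)           # plain == 9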

  3. Key Management Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Provides a secure environment to research and develop advanced electronic key management and networked key distribution technologies for the Navy and DoD....

  4. Public Key Infrastructure Study

    National Research Council Canada - National Science Library

    Berkovits, Shimshon

    1994-01-01

    The National Institute of Standards and Technology (NIST) has tasked The MITRE Corporation to study the alternatives for automated management of public keys and of the associated public key certificates for the Federal Government...

  5. Investment choice under uncertainty: A review essay

    Directory of Open Access Journals (Sweden)

    Trifunović Dejan

    2005-01-01

    Full Text Available An investment opportunity whose return is perfectly predictable hardly exists at all. Instead, the investor makes decisions under conditions of uncertainty. The theory of expected utility is the main analytical tool for describing choice under uncertainty. Critics of the theory contend that individuals have bounded rationality and that the theory of expected utility is not correct. When agents are faced with risky decisions they behave differently, conditional on their attitude towards risk: they can be risk loving, risk averse or risk neutral. In order to make an investment decision it is necessary to compare the probability distribution functions of returns. Investment decision making is much simpler if one uses expected values and variances instead of full probability distribution functions.
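
    The closing point above can be illustrated with a toy comparison of two risky assets by expected value and variance rather than by their full return distributions; the outcome/probability pairs are invented.

      def mean_var(outcomes):
          """Expected value and variance of a discrete return distribution."""
          m = sum(p * x for x, p in outcomes)
          v = sum(p * (x - m) ** 2 for x, p in outcomes)
          return m, v

      asset_a = [(0.05, 0.5), (0.15, 0.5)]       # (return, probability)
      asset_b = [(-0.10, 0.3), (0.25, 0.7)]

      for name, asset in (("A", asset_a), ("B", asset_b)):
          m, v = mean_var(asset)
          print(f"asset {name}: expected return = {m:.3f}, variance = {v:.4f}")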

  6. Uncertainty quantification in resonance absorption

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2012-01-01

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances in 232 Th as given by . Simulation, quadrature and polynomial chaos methods are used and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are pdfs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.
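
    A generic sampling sketch of this approach is given below: beta-distributed resonance widths are propagated through a placeholder escape-probability function and the spread of the output is reported. The functional form and parameter ranges are assumptions for illustration only, not the model used in the paper.

      import math, random, statistics

      random.seed(0)

      def escape_probability(gamma_n, gamma_g):
          """Placeholder monotone dependence on the neutron and radiation widths."""
          resonance_integral = 50.0 * gamma_n / (gamma_n + gamma_g)
          return math.exp(-resonance_integral / 500.0)

      samples = []
      for _ in range(10000):
          gamma_n = 0.001 + 0.002 * random.betavariate(2.0, 2.0)   # eV, assumed range
          gamma_g = 0.020 + 0.010 * random.betavariate(2.0, 2.0)   # eV, assumed range
          samples.append(escape_probability(gamma_n, gamma_g))

      print(f"mean p = {statistics.mean(samples):.4f}, std = {statistics.stdev(samples):.5f}")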

  7. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
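
    The sketch below is a deliberately simplified, double-loop illustration of mixing epistemic (distribution-parameter) and aleatory (random-variable) uncertainty in a limit-state reliability estimate; it is not the paper's auxiliary-variable single-loop formulation, and the limit state and parameter ranges are invented.

      import random, statistics

      random.seed(42)

      def limit_state(load, capacity):
          return capacity - load                       # failure when g < 0

      failure_probs = []
      for _ in range(200):                             # epistemic loop: uncertain distribution parameters
          mean_load = random.uniform(95.0, 105.0)
          std_load = random.uniform(8.0, 12.0)
          failures = 0
          n_aleatory = 2000
          for _ in range(n_aleatory):                  # aleatory loop: random variables themselves
              load = random.gauss(mean_load, std_load)
              capacity = random.gauss(140.0, 10.0)
              failures += limit_state(load, capacity) < 0
          failure_probs.append(failures / n_aleatory)

      failure_probs.sort()
      print(f"P(failure): median = {statistics.median(failure_probs):.4f}, "
            f"95th percentile = {failure_probs[int(0.95 * len(failure_probs))]:.4f}")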

  8. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if the standard is adequate for its intended use or can one calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined or determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in their Guide to the Expression of Measurement Uncertainty. Details of the simplified methods and examples of their use are included in the paper
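
    A sketch of the absolute/relative grouping idea, assuming a prepared standard whose value is V = mass x purity / volume: relative components of the multiplicative factors combine in quadrature, and any absolute component is added in quadrature at the end. The numbers are illustrative only, not from the paper.

      import math

      mass, u_mass = 0.5000, 0.0002          # g
      purity, u_purity = 0.998, 0.001        # fraction
      volume, u_volume = 1.0000, 0.0005      # L
      u_blank_abs = 0.0003                   # g/L, assumed absolute component (e.g. blank correction)

      value = mass * purity / volume
      rel = math.sqrt((u_mass / mass) ** 2 + (u_purity / purity) ** 2 + (u_volume / volume) ** 2)
      u_total = math.sqrt((rel * value) ** 2 + u_blank_abs ** 2)
      print(f"standard value = {value:.4f} g/L +/- {u_total:.4f} g/L (1 sigma)")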

  9. Calculation of uncertainties; Calculo de incertidumbres

    Energy Technology Data Exchange (ETDEWEB)

    Diaz-Asencio, Misael [Centro de Estudios Ambientales de Cienfuegos (Cuba)

    2012-07-01

    One of the most important aspects in relation to quality assurance in any analytical activity is the estimation of measurement uncertainty. There is general agreement that 'the expression of the result of a measurement is not complete without specifying its associated uncertainty'. An analytical process is the mechanism for obtaining methodological information (measurand) of a material system (population). This implies the need for the definition of the problem, the choice of methods for sampling and measurement and proper execution of these activities for obtaining information. The result of a measurement is only an approximation or estimate of the value of the measurand, which is complete only when accompanied by an estimate of the uncertainty of the analytical process. According to the 'Vocabulary of Basic and General Terms in Metrology', 'measurement uncertainty' is the parameter associated with the result of a measurement that characterizes the dispersion of the values that could reasonably be attributed to the measurand (or magnitude). This parameter could be a standard deviation or a confidence interval. The uncertainty evaluation requires a detailed look at all possible sources, but not disproportionately so; a good estimate of the uncertainty can be made by concentrating efforts on the largest contributions. The key steps of the process of determining the uncertainty in the measurements are: the specification of the measurand; identification of the sources of uncertainty; quantification of the individual components of uncertainty; calculation of the combined standard uncertainty; and reporting of the uncertainty.

  10. Climate Certainties and Uncertainties

    International Nuclear Information System (INIS)

    Morel, Pierre

    2012-01-01

    In issue 380 of Futuribles in December 2011, Antonin Pottier analysed in detail the workings of what is today termed 'climate scepticism' - namely the propensity of certain individuals to contest the reality of climate change on the basis of pseudo-scientific arguments. He emphasized particularly that what fuels the debate on climate change is, largely, the degree of uncertainty inherent in the consequences to be anticipated from observation of the facts, not the description of the facts itself. In his view, the main aim of climate sceptics is to block the political measures for combating climate change. However, since they do not admit to this political posture, they choose instead to deny the scientific reality. This month, Futuribles complements this socio-psychological analysis of climate-sceptical discourse with an - in this case, wholly scientific - analysis of what we know (or do not know) about climate change on our planet. Pierre Morel gives a detailed account of the state of our knowledge in the climate field and what we are able to predict in the medium/long-term. After reminding us of the influence of atmospheric meteorological processes on the climate, he specifies the extent of global warming observed since 1850 and the main origin of that warming, as revealed by the current state of knowledge: the increase in the concentration of greenhouse gases. He then describes the changes in meteorological regimes (showing also the limits of climate simulation models), the modifications of hydrological regimes, and also the prospects for rises in sea levels. He also specifies the mechanisms that may potentially amplify all these phenomena and the climate disasters that might ensue. Lastly, he shows what are the scientific data that cannot be disregarded, the consequences of which are now inescapable (melting of the ice-caps, rises in sea level etc.), the only remaining uncertainty in this connection being the date at which these things will happen. 'In this

  11. Uncertainties in the simulation of groundwater recharge at different scales

    Directory of Open Access Journals (Sweden)

    H. Bogena

    2005-01-01

    Full Text Available Digital spatial data always imply some kind of uncertainty. The source of this uncertainty can be found in their compilation as well as in the conceptual design that causes a more or less exact abstraction of the real world, depending on the scale under consideration. Within the framework of hydrological modelling, in which numerous data sets from diverse sources of uneven quality are combined, the various uncertainties are accumulated. In this study, the GROWA model is taken as an example to examine the effects of different types of uncertainties on the calculated groundwater recharge. Distributed input errors are determined for the parameters 'slope' and 'aspect' using a Monte Carlo approach. Land cover classification uncertainties are analysed by using the conditional probabilities of a remote sensing classification procedure. The uncertainties of data ensembles at different scales and study areas are discussed. The present uncertainty analysis showed that the Gaussian error propagation method is a useful technique for analysing the influence of input data on the simulated groundwater recharge. The uncertainties involved in the land use classification procedure and the digital elevation model can be significant in some parts of the study area. However, for the specific model used in this study it was shown that the precipitation uncertainties have the greatest impact on the total groundwater recharge error.

  12. Assessing Groundwater Model Uncertainty for the Central Nevada Test Area

    International Nuclear Information System (INIS)

    Pohll, Greg; Pohlmann, Karl; Hassan, Ahmed; Chapman, Jenny; Mihevc, Todd

    2002-01-01

    The purpose of this study is to quantify the flow and transport model uncertainty for the Central Nevada Test Area (CNTA). Six parameters were identified as uncertain, including the specified head boundary conditions used in the flow model, the spatial distribution of the underlying welded tuff unit, effective porosity, sorption coefficients, matrix diffusion coefficient, and the geochemical release function which describes nuclear glass dissolution. The parameter uncertainty was described by assigning prior statistical distributions for each of these parameters. Standard Monte Carlo techniques were used to sample from the parameter distributions to determine the full prediction uncertainty. Additional analysis is performed to determine the most cost-beneficial characterization activities. The maximum radius of the tritium and strontium-90 contaminant boundary was used as the output metric for evaluation of prediction uncertainty. The results indicate that combining all of the uncertainty in the parameters listed above propagates to a prediction uncertainty in the maximum radius of the contaminant boundary of 234 to 308 m and 234 to 302 m, for tritium and strontium-90, respectively. Although the uncertainty in the input parameters is large, the prediction uncertainty in the contaminant boundary is relatively small. The relatively small prediction uncertainty is primarily due to the small transport velocities such that large changes in the uncertain input parameters causes small changes in the contaminant boundary. This suggests that the model is suitable in terms of predictive capability for the contaminant boundary delineation

  13. Corporate liquidity and dividend policy under uncertainty

    OpenAIRE

    Koussis, Nicos; Martzoukos, Spiros H.; Trigeorgis, Lenos

    2016-01-01

    We examine optimal liquidity (retained earnings) and dividend choice incorporating debt financing with risk of default and bankruptcy costs as well as growth options under revenue uncertainty. We revisit the conditions for dividend policy irrelevancy and the broader role of retained earnings and dividends. Retained earnings have a net positive impact on firm value in the presence of growth options, high external financing costs and low default risk. High levels of retained earnings enhance de...

  14. Framework of Uncertainty in Medical Decision Making

    DEFF Research Database (Denmark)

    Austin, L; Brodersen, John; Reventlow, Susanne

    Historically, medical decisions have primarily involved diagnosis and treatment of symptomatic patients. Increasingly, medical decisions concern uncertain future health states in asymptomatic people. We construct a taxonomy of five medical decision situations that encompasses these wider possibilities. For each, we identify potential sources of uncertainty that should be considered when assessing the degree of belief that a person has, or will have, a condition. Decision trees illustrate the normative structure of each situation. The five decision situations involve: 1) assessing

  15. Adapt or Perish: A Review of Planning Approaches for Adaptation under Deep Uncertainty

    Directory of Open Access Journals (Sweden)

    Jan H. Kwakkel

    2013-03-01

    Full Text Available There is increasing interest in long-term plans that can adapt to changing situations under conditions of deep uncertainty. We argue that a sustainable plan should not only achieve economic, environmental, and social objectives, but should be robust and able to be adapted over time to (unforeseen) future conditions. Large numbers of papers dealing with robustness and adaptive plans have begun to appear, but the literature is fragmented. The papers appear in disparate journals, and deal with a wide variety of policy domains. This paper (1) describes and compares a family of related conceptual approaches to designing a sustainable plan, and (2) describes several computational tools supporting these approaches. The conceptual approaches all have their roots in an approach to long-term planning called Assumption-Based Planning. Guiding principles for the design of a sustainable adaptive plan are: explore a wide variety of relevant uncertainties, connect short-term targets to long-term goals over time, commit to short-term actions while keeping options open, and continuously monitor the world and take actions if necessary. A key computational tool across the conceptual approaches is a fast, simple (policy analysis) model that is used to make large numbers of runs, in order to explore the full range of uncertainties and to identify situations in which the plan would fail.

  16. Reward uncertainty enhances incentive salience attribution as sign-tracking

    Science.gov (United States)

    Anselme, Patrick; Robinson, Mike J. F.; Berridge, Kent C.

    2014-01-01

    Conditioned stimuli (CSs) come to act as motivational magnets following repeated association with unconditioned stimuli (UCSs) such as sucrose rewards. By traditional views, the more reliably predictive a Pavlovian CS-UCS association, the more the CS becomes attractive. However, in some cases, less predictability might equal more motivation. Here we examined the effect of introducing uncertainty in CS-UCS association on CS strength as an attractive motivation magnet. In the present study, Experiment 1 assessed the effects of Pavlovian predictability versus uncertainty about reward probability and/or reward magnitude on the acquisition and expression of sign-tracking (ST) and goal-tracking (GT) responses in an autoshaping procedure. Results suggested that uncertainty produced strongest incentive salience expressed as sign-tracking. Experiment 2 examined whether a within-individual temporal shift from certainty to uncertainty conditions could produce a stronger CS motivational magnet when uncertainty began, and found that sign-tracking still increased after the shift. Overall, our results support earlier reports that ST responses become more pronounced in the presence of uncertainty regarding CS-UCS associations, especially when uncertainty combines both probability and magnitude. These results suggest that Pavlovian uncertainty, although diluting predictability, is still able to enhance the incentive motivational power of particular CSs. PMID:23078951

  17. Assessing concentration uncertainty estimates from passive microwave sea ice products

    Science.gov (United States)

    Meier, W.; Brucker, L.; Miller, J. A.

    2017-12-01

    Sea ice concentration is an essential climate variable and passive microwave derived estimates of concentration are one of the longest satellite-derived climate records. However, until recently uncertainty estimates were not provided. Numerous validation studies provided insight into general error characteristics, but the studies have found that concentration error varied greatly depending on sea ice conditions. Thus, an uncertainty estimate from each observation is desired, particularly for initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.

  18. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    Science.gov (United States)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on 2 small catchments in the Swiss Plateau with a lumped conceptual rainfall runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty but in hydrology we used formal Bayesian uncertainty assessment method with 2 different likelihood functions. One was a time-series error model that was able to deal with the complicated statistical properties of hydrological model residuals. The second was a likelihood function for the flow quantiles directly. Due to the better data coverage and smaller hydrological complexity in one of our test catchments we had better performance from the hydrological model and thus could observe that the relative importance of different uncertainty sources varied between sites, boundary conditions and flow indicators. The uncertainty of future climate was important, but not dominant. The deficiencies of the hydrological model were on the same scale, especially for the sites and flow components where model performance for the past observations was further from optimal (Nash-Sutcliffe index = 0.5 - 0.7). The overall uncertainty of predictions was well beyond the expected change signal even for the best performing site and flow indicator.

  19. Uncertainty Evaluation for SMART Synthesized Power Distribution

    International Nuclear Information System (INIS)

    Cho, J. Y.; Song, J. S.; Lee, C. C.; Park, S. Y.; Kim, K. Y.; Lee, K. H.

    2010-07-01

    This report performs the uncertainty analysis for the SMART synthesis power distribution generated by the SSUN (SMART core SUpporting system coupled with the Nuclear design code) code. SSUN runs coupled with the MASTER neutronics code and generates the core 3-D synthesis power distribution by using DPCM3D. The MASTER code provides the DPCM3D constants to the SSUN code for the current core state. The uncertainties evaluated in this report are in the form of 95%/95% probability/confidence one-sided tolerance limits and can be used in conjunction with Technical Specification limits on these quantities to establish appropriate LCO (Limiting Conditions of Operation) and LSSS (Limiting Safety System Settings) limits. This report is applicable to SMART nuclear reactors using fixed rhodium detector systems. The unknown true power distribution should be given for the uncertainty evaluation of the synthesis power distribution. This report produces virtual distributions for the true power distribution by imposing the CASMO-3/MASTER uncertainty on the MASTER power distribution. Detector signals are generated from these virtual distributions and the DPCM3D constants are from the MASTER power distribution. The SSUN code synthesizes the core 3-D power distribution by using these detector signals and the DPCM3D constants. The following summarizes the uncertainty evaluation procedure for the synthesis power distribution. (1) Generation of 3-D power distribution by MASTER -> Determination of the DPCM3D constants. (2) Generation of virtual power distribution (assumed to be the true power distribution) -> Generation of detector signals. (3) Generation of synthesis power distribution. (4) Uncertainty evaluation for the synthesis power distribution. The Chi-Square normality test rejects the hypothesis of a normal distribution for the synthesis power error distribution. Therefore, the Kruskal-Wallis test and non-parametric statistics are used for data pooling and the tolerance limits. The
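
    One common non-parametric route to a one-sided 95%/95% tolerance limit is the order-statistic (Wilks-type) argument sketched below; whether this matches the report's exact treatment is not stated, and the error sample here is synthetic.

      import random

      random.seed(7)
      errors = [random.gauss(0.0, 0.02) for _ in range(59)]   # synthetic relative power errors

      n = len(errors)
      confidence = 1.0 - 0.95 ** n     # probability that the sample maximum exceeds the 95th percentile
      limit = max(errors)
      print(f"n = {n}, confidence = {confidence:.3f}, one-sided 95/95 limit = {limit:.4f}")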

  20. Beyond optimality: Multistakeholder robustness tradeoffs for regional water portfolio planning under deep uncertainty

    Science.gov (United States)

    Herman, Jonathan D.; Zeff, Harrison B.; Reed, Patrick M.; Characklis, Gregory W.

    2014-10-01

    While optimality is a foundational mathematical concept in water resources planning and management, "optimal" solutions may be vulnerable to failure if deeply uncertain future conditions deviate from those assumed during optimization. These vulnerabilities may produce severely asymmetric impacts across a region, making it vital to evaluate the robustness of management strategies as well as their impacts for regional stakeholders. In this study, we contribute a multistakeholder many-objective robust decision making (MORDM) framework that blends many-objective search and uncertainty analysis tools to discover key tradeoffs between water supply alternatives and their robustness to deep uncertainties (e.g., population pressures, climate change, and financial risks). The proposed framework is demonstrated for four interconnected water utilities representing major stakeholders in the "Research Triangle" region of North Carolina, U.S. The utilities supply well over one million customers and have the ability to collectively manage drought via transfer agreements and shared infrastructure. We show that water portfolios for this region that compose optimal tradeoffs (i.e., Pareto-approximate solutions) under expected future conditions may suffer significantly degraded performance with only modest changes in deeply uncertain hydrologic and economic factors. We then use the Patient Rule Induction Method (PRIM) to identify which uncertain factors drive the individual and collective vulnerabilities for the four cooperating utilities. Our framework identifies key stakeholder dependencies and robustness tradeoffs associated with cooperative regional planning, which are critical to understanding the tensions between individual versus regional water supply goals. Cooperative demand management was found to be the key factor controlling the robustness of regional water supply planning, dominating other hydroclimatic and economic uncertainties through the 2025 planning horizon. Results

  1. Uncertainty vs. Information (Invited)

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact results in essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.

  2. Pandemic influenza: certain uncertainties

    Science.gov (United States)

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    SUMMARY For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  3. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad-hoc manner; however, if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality, then knowledge of this uncertainty enables optimal exploitation of the observations in further processing or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user-contributed weather data where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards including SWE Common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation and usage within existing standards and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.

  4. The Key Lake project

    International Nuclear Information System (INIS)

    1991-01-01

    Key Lake is located in the Athabasca sand stone basin, 640 kilometers north of Saskatoon, Saskatchewan, Canada. The three sources of ore at Key Lake contain 70 100 tonnes of uranium. Features of the Key Lake Project were described under the key headings: work force, mining, mill process, tailings storage, permanent camp, environmental features, worker health and safety, and economic benefits. Appendices covering the historical background, construction projects, comparisons of western world mines, mining statistics, Northern Saskatchewan surface lease, and Key Lake development and regulatory agencies were included

  5. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    Science.gov (United States)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase-uncertainty, the magnitude of which is emphasized in the high-gradient aerodynamic regions, is proposed in this paper. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
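
    A toy version of dispersing data within asymmetric bounds, with an extra location ("phase") uncertainty applied near a sharp gradient, is sketched below; the split half-normal draw, the placeholder lift curve and all numbers are assumptions, not the paper's formulation.

      import random

      random.seed(3)

      def disperse(nominal, u_lower, u_upper):
          """Draw asymmetrically: lower and upper half-normals with different widths."""
          mag = abs(random.gauss(0.0, 1.0))
          return nominal - mag * u_lower if random.random() < 0.5 else nominal + mag * u_upper

      def cl_vs_alpha(alpha, alpha_stall):
          """Placeholder lift curve with an abrupt stall break."""
          return 0.1 * alpha if alpha < alpha_stall else 0.1 * alpha_stall - 0.05 * (alpha - alpha_stall)

      alpha_stall_nominal = 14.0
      for _ in range(3):
          alpha_stall = disperse(alpha_stall_nominal, 1.5, 0.5)   # phase uncertainty in stall onset
          cl = disperse(cl_vs_alpha(15.0, alpha_stall), 0.02, 0.05)
          print(f"stall at {alpha_stall:5.2f} deg -> CL(15 deg) = {cl:.3f}")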

  6. Uncertainty Assessments in Fast Neutron Activation Analysis

    International Nuclear Information System (INIS)

    W. D. James; R. Zeisler

    2000-01-01

    Fast neutron activation analysis (FNAA) carried out with the use of small accelerator-based neutron generators is routinely used for major/minor element determinations in industry, mineral and petroleum exploration, and to some extent in research. While the method shares many of the operational procedures and therefore errors inherent to conventional thermal neutron activation analysis, its unique implementation gives rise to additional specific concerns that can result in errors or increased uncertainties of measured quantities. The authors were involved in a recent effort to evaluate irreversible incorporation of oxygen into a standard reference material (SRM) by direct measurement of oxygen by FNAA. That project required determination of oxygen in bottles of the SRM stored in varying environmental conditions and a comparison of the results. We recognized the need to accurately describe the total uncertainty of the measurements in order to properly characterize any differences in the resulting average concentrations. It is our intent here to discuss the breadth of parameters that can contribute to the random and nonrandom errors of the method and to provide estimates of the magnitude of the uncertainty introduced. In addition, we will discuss the steps taken in this recent FNAA project to control quality, assess the uncertainty of the measurements, and evaluate results based on the statistical reproducibility

  7. LOCKS AND KEYS SERVICE

    CERN Multimedia

    Locks and Keys Service

    2002-01-01

    The Locks and Keys service (ST/FM) will move from building 55 to building 570 from 2nd August to 9th August 2002 inclusive. During this period the service will be closed; in case of extreme urgency only, please call 164550. Starting from Monday, 12th August, the Locks and Keys Service will continue to handle activities related to office keys (keys and locks) and will provide keys for furniture. The service is open from 8h30 to 12h00 and from 13h00 to 17h30. We remind you that your divisional correspondents can help you with these procedures. We thank you for your understanding and remain at your service to help you resolve all matters related to keys for offices and furniture. Locks and Keys Service - ST Division - FM Group

  8. A commentary on model uncertainty

    International Nuclear Information System (INIS)

    Apostolakis, G.

    1994-01-01

    A framework is proposed for the identification of model and parameter uncertainties in risk assessment models. Two cases are distinguished; in the first case, a set of mutually exclusive and exhaustive hypotheses (models) can be formulated, while, in the second, only one reference model is available. The relevance of this formulation to decision making and the communication of uncertainties is discussed

  9. Mama Software Features: Uncertainty Testing

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  10. Designing for Uncertainty: Three Approaches

    Science.gov (United States)

    Bennett, Scott

    2007-01-01

    Higher education wishes to get long life and good returns on its investment in learning spaces. Doing this has become difficult because rapid changes in information technology have created fundamental uncertainties about the future in which capital investments must deliver value. Three approaches to designing for this uncertainty are described…

  11. Robustness of ancestral sequence reconstruction to phylogenetic uncertainty.

    Science.gov (United States)

    Hanson-Smith, Victor; Kolaczkowski, Bryan; Thornton, Joseph W

    2010-09-01

    Ancestral sequence reconstruction (ASR) is widely used to formulate and test hypotheses about the sequences, functions, and structures of ancient genes. Ancestral sequences are usually inferred from an alignment of extant sequences using a maximum likelihood (ML) phylogenetic algorithm, which calculates the most likely ancestral sequence assuming a probabilistic model of sequence evolution and a specific phylogeny--typically the tree with the ML. The true phylogeny is seldom known with certainty, however. ML methods ignore this uncertainty, whereas Bayesian methods incorporate it by integrating the likelihood of each ancestral state over a distribution of possible trees. It is not known whether Bayesian approaches to phylogenetic uncertainty improve the accuracy of inferred ancestral sequences. Here, we use simulation-based experiments under both simplified and empirically derived conditions to compare the accuracy of ASR carried out using ML and Bayesian approaches. We show that incorporating phylogenetic uncertainty by integrating over topologies very rarely changes the inferred ancestral state and does not improve the accuracy of the reconstructed ancestral sequence. Ancestral state reconstructions are robust to uncertainty about the underlying tree because the conditions that produce phylogenetic uncertainty also make the ancestral state identical across plausible trees; conversely, the conditions under which different phylogenies yield different inferred ancestral states produce little or no ambiguity about the true phylogeny. Our results suggest that ML can produce accurate ASRs, even in the face of phylogenetic uncertainty. Using Bayesian integration to incorporate this uncertainty is neither necessary nor beneficial.

  12. Key energy technologies for Europe

    International Nuclear Information System (INIS)

    Holst Joergensen, Birte

    2005-09-01

    The report is part of the work undertaken by the High-Level Expert Group to prepare a report on emerging science and technology trends and the implications for EU and Member State research policies. The outline of the report is: 1) In the introductory section, energy technologies are defined and for analytical reasons further narrowed down; 2) The description of the socio-economic challenges facing Europe in the energy field is based on the analysis made by the International Energy Agency going back to 1970 and with forecasts to 2030. Both the world situation and the European situation are described. This section also contains an overview of the main EU policy responses to energy. Both EU energy R and D as well as Member State energy R and D resources are described in view of international efforts; 3) The description of the science and technology base is made for selected energy technologies, including energy efficiency, biomass, hydrogen, and fuel cells, photovoltaics, clean fossil fuel technologies and CO 2 capture and storage, nuclear fission and fusion. When possible, a SWOT is made for each technology and finally summarised; 4) The forward look highlights some of the key problems and uncertainties related to the future energy situation. Examples of recent energy foresights are given, including national energy foresights in Sweden and the UK as well as links to a number of regional and national foresights and roadmaps; 5) Appendix 1 contains a short description of key international organisations dealing with energy technologies and energy research. (ln)

  13. On the Application of Science Systems Engineering and Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    Science.gov (United States)

    Schlegel, Nicole-Jeanne; Boening, Carmen; Larour, Eric; Limonadi, Daniel; Schodlok, Michael; Seroussi, Helene; Watkins, Michael

    2017-04-01

    Research and development activities at the Jet Propulsion Laboratory (JPL) currently support the creation of a framework to formally evaluate the observational needs within earth system science. One of the pilot projects of this effort aims to quantify uncertainties in global mean sea level rise projections, due to contributions from the continental ice sheets. Here, we take advantage of established uncertainty quantification tools embedded within the JPL-University of California at Irvine Ice Sheet System Model (ISSM). We conduct sensitivity and Monte-Carlo style sampling experiments on forward simulations of the Greenland and Antarctic ice sheets. By varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges, we assess the impact of the different parameter ranges on century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  14. Uncertainty relations for approximation and estimation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jaeha, E-mail: jlee@post.kek.jp [Department of Physics, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Tsutsui, Izumi, E-mail: izumi.tsutsui@kek.jp [Department of Physics, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Theory Center, Institute of Particle and Nuclear Studies, High Energy Accelerator Research Organization (KEK), 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan)

    2016-05-27

    We present a versatile inequality of uncertainty relations which are useful when one approximates an observable and/or estimates a physical parameter based on the measurement of another observable. It is shown that the optimal choice for proxy functions used for the approximation is given by Aharonov's weak value, which also determines the classical Fisher information in parameter estimation, turning our inequality into the genuine Cramér–Rao inequality. Since the standard form of the uncertainty relation arises as a special case of our inequality, and since the parameter estimation is available as well, our inequality can treat both the position–momentum and the time–energy relations in one framework albeit handled differently. - Highlights: • Several inequalities interpreted as uncertainty relations for approximation/estimation are derived from a single ‘versatile inequality’. • The ‘versatile inequality’ sets a limit on the approximation of an observable and/or the estimation of a parameter by another observable. • The ‘versatile inequality’ turns into an elaboration of the Robertson–Kennard (Schrödinger) inequality and the Cramér–Rao inequality. • Both the position–momentum and the time–energy relation are treated in one framework. • In every case, Aharonov's weak value arises as a key geometrical ingredient, deciding the optimal choice for the proxy functions.

  15. Uncertainty relations for approximation and estimation

    International Nuclear Information System (INIS)

    Lee, Jaeha; Tsutsui, Izumi

    2016-01-01

    We present a versatile inequality of uncertainty relations which are useful when one approximates an observable and/or estimates a physical parameter based on the measurement of another observable. It is shown that the optimal choice for proxy functions used for the approximation is given by Aharonov's weak value, which also determines the classical Fisher information in parameter estimation, turning our inequality into the genuine Cramér–Rao inequality. Since the standard form of the uncertainty relation arises as a special case of our inequality, and since the parameter estimation is available as well, our inequality can treat both the position–momentum and the time–energy relations in one framework albeit handled differently. - Highlights: • Several inequalities interpreted as uncertainty relations for approximation/estimation are derived from a single ‘versatile inequality’. • The ‘versatile inequality’ sets a limit on the approximation of an observable and/or the estimation of a parameter by another observable. • The ‘versatile inequality’ turns into an elaboration of the Robertson–Kennard (Schrödinger) inequality and the Cramér–Rao inequality. • Both the position–momentum and the time–energy relation are treated in one framework. • In every case, Aharonov's weak value arises as a key geometrical ingredient, deciding the optimal choice for the proxy functions.

  16. Quantum dense key distribution

    International Nuclear Information System (INIS)

    Degiovanni, I.P.; Ruo Berchera, I.; Castelletto, S.; Rastello, M.L.; Bovino, F.A.; Colla, A.M.; Castagnoli, G.

    2004-01-01

    This paper proposes a protocol for quantum dense key distribution. The protocol combines the benefits of quantum dense coding and quantum key distribution and is able to generate shared secret keys four times more efficiently than the Bennett-Brassard 1984 protocol. We prove the security of this scheme against individual eavesdropping attacks, and we present preliminary experimental results showing its feasibility

  17. Quantifying phenomenological importance in best-estimate plus uncertainty analyses

    International Nuclear Information System (INIS)

    Martin, Robert P.

    2009-01-01

    This paper describes a general methodology for quantifying the importance of specific phenomenological elements to analysis measures evaluated from non-parametric best-estimate plus uncertainty evaluation methodologies. The principal objective of an importance analysis is to reveal those uncertainty contributors having the greatest influence on key analysis measures. This characterization supports the credibility of the uncertainty analysis, the applicability of the analytical tools, and even the generic evaluation methodology through the validation of the engineering judgments that guided the evaluation methodology development. A demonstration of the importance analysis is provided using data from a sample problem considered in the development of AREVA's Realistic LBLOCA methodology. The results are presented against the original large-break LOCA Phenomena Identification and Ranking Table developed by the Technical Program Group responsible for authoring the Code Scaling, Applicability and Uncertainty methodology. (author)

  18. Uncertainty Evaluation of Best Estimate Calculation Results

    International Nuclear Information System (INIS)

    Glaeser, H.

    2006-01-01

    Efforts are underway in Germany to perform analyses using best-estimate computer codes and to include uncertainty evaluation in licensing. The German Reactor Safety Commission (RSK) recently issued a recommendation to perform uncertainty analysis in loss-of-coolant accident (LOCA) safety analyses. A more general requirement is included in a draft revision of the German Nuclear Regulation, an activity of the German Ministry of Environment and Reactor Safety (BMU). According to the recommendation of the German RSK, the following deterministic requirements still have to be applied when performing LOCA safety analyses in licensing: most unfavourable single failure; unavailability due to preventive maintenance; break location; break size and break type; double-ended break, 100 percent through 200 percent; large, medium and small break; loss of off-site power; core power (at accident initiation, the most unfavourable conditions and values that may occur under normal operation have to be assumed, taking into account the set-points of integral power and power density control; measurement and calibration errors can be considered statistically); and time of fuel cycle. Analysis using best-estimate codes with evaluation of uncertainties is the only way to quantify conservatisms with regard to code models and uncertainties of plant and fuel parameters and decay heat. This is especially the case when approaching licensing limits, e.g. due to power up-rates, higher burn-up and higher enrichment. Broader use of best-estimate analysis is therefore envisaged in the future. Since some deterministic unfavourable assumptions regarding the availability of NPP systems are still used, some conservatism in best-estimate analyses remains. Methods of uncertainty analysis have been developed and applied by the vendor Framatome ANP as well as by GRS in Germany. The GRS development was sponsored by the German Ministry of Economy and Labour (BMWA). (author)

  19. A Framework for Understanding Uncertainty in Seismic Risk Assessment.

    Science.gov (United States)

    Foulser-Piggott, Roxane; Bowman, Gary; Hughes, Martin

    2017-10-11

    A better understanding of the uncertainty that exists in models used for seismic risk assessment is critical to improving risk-based decisions pertaining to earthquake safety. Current models estimating the probability of collapse of a building do not consider comprehensively the nature and impact of uncertainty. This article presents a model framework to enhance seismic risk assessment and thus gives decisionmakers a fuller understanding of the nature and limitations of the estimates. This can help ensure that risks are not over- or underestimated and the value of acquiring accurate data is appreciated fully. The methodology presented provides a novel treatment of uncertainties in input variables, their propagation through the model, and their effect on the results. The study presents ranges of possible annual collapse probabilities for different case studies on buildings in different parts of the world, exposed to different levels of seismicity, and with different vulnerabilities. A global sensitivity analysis was conducted to determine the significance of uncertain variables. Two key outcomes are (1) that the uncertainty in ground-motion conversion equations has the largest effect on the uncertainty in the calculation of annual collapse probability; and (2) the vulnerability of a building appears to have an effect on the range of annual collapse probabilities produced, i.e., the level of uncertainty in the estimate of annual collapse probability, with less vulnerable buildings having a smaller uncertainty. © 2017 Society for Risk Analysis.

  20. Nuclear data sensitivity/uncertainty analysis for XT-ADS

    International Nuclear Information System (INIS)

    Sugawara, Takanori; Sarotto, Massimo; Stankovskiy, Alexey; Van den Eynde, Gert

    2011-01-01

    Highlights: → The sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. → The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. → When the target accuracy of 0.3%Δk for the criticality was considered, the uncertainties did not satisfy it. → To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions. - Abstract: The XT-ADS, an accelerator-driven system for an experimental demonstration, has been investigated in the framework of the IP EUROTRANS FP6 project. In this study, sensitivity and uncertainty analyses were performed to assess the reliability of the XT-ADS neutronic design. For the sensitivity analysis, it was found that the sensitivity coefficients differed significantly depending on the geometry models and calculation codes used. For the uncertainty analysis, it was confirmed that the uncertainties deduced from the covariance data varied significantly depending on which covariance library was used. The uncertainties deduced from the covariance data for the XT-ADS criticality were 0.94%, 1.9% and 1.1% with the SCALE 44-group, TENDL-2009 and JENDL-3.3 data, respectively. When the target accuracy of 0.3%Δk for the criticality was considered, the uncertainties did not satisfy it. To achieve this accuracy, the uncertainties should be reduced through experiments under adequate conditions.
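
    Criticality uncertainties of the kind quoted in this record are conventionally obtained with the 'sandwich rule', which folds a sensitivity profile into a nuclear-data covariance matrix. The three-group numbers below are invented solely to show the mechanics and are not the XT-ADS sensitivities or covariances.

    ```python
    import numpy as np

    # Illustrative 3-group example: sensitivity of k-eff to three reaction-group
    # parameters (dk/k per fractional change in each parameter) ...
    s = np.array([0.30, -0.15, 0.05])          # sensitivity profile S

    # ... and a relative covariance matrix M of those parameters (fractional**2)
    m = np.array([[4.0e-4, 1.0e-4, 0.0],
                  [1.0e-4, 9.0e-4, 0.0],
                  [0.0,    0.0,    2.5e-4]])

    var_k = s @ m @ s                          # (delta k / k)**2 = S^T M S
    print(f"relative k-eff uncertainty = {np.sqrt(var_k):.2%}")
    ```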

  1. Decomposing the uncertainty in climate impact projections of Dynamic Vegetation Models: a test with the forest models LANDCLIM and FORCLIM

    Science.gov (United States)

    Cailleret, Maxime; Snell, Rebecca; von Waldow, Harald; Kotlarski, Sven; Bugmann, Harald

    2015-04-01

    Different levels of uncertainty should be considered in climate impact projections by Dynamic Vegetation Models (DVMs), particularly when it comes to managing climate risks. Such information is useful to detect the key processes and uncertainties in the climate model - impact model chain and may be used to support recommendations for future improvements in the simulation of both climate and biological systems. In addition, determining which uncertainty source is dominant is an important aspect to recognize the limitations of climate impact projections by a multi-model ensemble mean approach. However, to date, few studies have clarified how each uncertainty source (baseline climate data, greenhouse gas emission scenario, climate model, and DVM) affects the projection of ecosystem properties. Focusing on one greenhouse gas emission scenario, we assessed the uncertainty in the projections of a forest landscape model (LANDCLIM) and a stand-scale forest gap model (FORCLIM) that is caused by linking climate data with an impact model. LANDCLIM was used to assess the uncertainty in future landscape properties of the Visp valley in Switzerland that is due to (i) the use of different 'baseline' climate data (gridded data vs. data from weather stations), and (ii) differences in climate projections among 10 GCM-RCM chains. This latter point was also considered for the projections of future forest properties by FORCLIM at several sites along an environmental gradient in Switzerland (14 GCM-RCM chains), for which we also quantified the uncertainty caused by (iii) the model chain specific statistical properties of the climate time-series, and (iv) the stochasticity of the demographic processes included in the model, e.g., the annual number of saplings that establish, or tree mortality. Using methods of variance decomposition analysis, we found that (i) The use of different baseline climate data strongly impacts the prediction of forest properties at the lowest and highest, but
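
    The variance decomposition referred to above can be pictured with a toy balanced two-factor layout (baseline climate data versus GCM-RCM chain); the factor levels and response values below are synthetic and only illustrate how fractions of variance are apportioned, not the LANDCLIM or FORCLIM results.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    n_baseline, n_chain = 2, 10                       # 2 baseline data sets x 10 GCM-RCM chains
    baseline_effect = np.array([0.0, 8.0])            # strong baseline-data effect (synthetic)
    chain_effect = rng.normal(0.0, 2.0, n_chain)      # weaker chain-to-chain effect (synthetic)

    # Synthetic response (e.g. simulated forest biomass) for each combination,
    # plus a residual term standing in for demographic stochasticity.
    y = (100.0
         + baseline_effect[:, None]
         + chain_effect[None, :]
         + rng.normal(0.0, 0.5, (n_baseline, n_chain)))

    grand = y.mean()
    var_total = y.var()
    var_baseline = ((y.mean(axis=1) - grand) ** 2).mean()   # share explained by baseline data
    var_chain = ((y.mean(axis=0) - grand) ** 2).mean()      # share explained by climate chain
    var_resid = var_total - var_baseline - var_chain        # interaction + stochastic residual

    for name, v in [("baseline data", var_baseline),
                    ("GCM-RCM chain", var_chain),
                    ("interaction/residual", var_resid)]:
        print(f"{name:22s} {v / var_total:6.1%} of total variance")
    ```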

  2. Planning ATES systems under uncertainty

    Science.gov (United States)

    Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin

    2015-04-01

    form a complex adaptive system, for which agent-based modelling provides a useful analysis framework. This study therefore explores the interactions between endogenous ATES adoption processes and the relative performance of different planning schemes, using an agent-based adoption model coupled with a hydrologic model of the subsurface. The models are parameterized to simulate typical operating conditions for ATES systems in a dense urban area. Furthermore, uncertainties relating to planning parameters, adoption processes, and climatic conditions are explicitly considered using exploratory modelling techniques. Results are therefore presented for the performance of different planning policies over a broad range of plausible scenarios.

  3. Ideas underlying the Quantification of Margins and Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Pilch, Martin, E-mail: mpilch@sandia.gov [Department 1514, Sandia National Laboratories, Albuquerque, NM 87185-0828 (United States); Trucano, Timothy G. [Department 1411, Sandia National Laboratories, Albuquerque, NM 87185-0370 (United States); Helton, Jon C. [Department of Mathematics and Statistics, Arizona State University, Tempe, AZ 85287-1804 (United States)

    2011-09-15

    Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.

  4. Uncertainties in Nuclear Proliferation Modeling

    International Nuclear Information System (INIS)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok

    2015-01-01

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the potential to provide warning for the international community to prevent nuclear proliferation activities. However, there is still considerable debate about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling works. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models. Fundamental problems in modeling will remain even if more advanced modeling methods are developed. Before developing elaborate models based on hypotheses about time-dependent proliferation determinants, graph theory, etc., it is important to analyze the uncertainty of current models in order to solve the fundamental problems of nuclear proliferation modeling. The uncertainty arising from different codings of proliferation history is small. More serious problems stem from limited analysis methods and from correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties when the same dataset is used, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge provided by qualitative nuclear proliferation studies.

  5. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  6. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  7. Robustness for slope stability modelling under deep uncertainty

    Science.gov (United States)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

    Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work, a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.

  8. The time course of attention modulation elicited by spatial uncertainty.

    Science.gov (United States)

    Huang, Dan; Liang, Huilou; Xue, Linyan; Wang, Meijian; Hu, Qiyi; Chen, Yao

    2017-09-01

    Uncertainty regarding the target location is an influential factor for spatial attention. Modulation in spatial uncertainty can lead to adjustments in attention scope and variations in attention effects. Hence, investigating spatial uncertainty modulation is important for understanding the underlying mechanism of spatial attention. However, the temporal dynamics of this modulation remains unclear. To evaluate the time course of spatial uncertainty modulation, we adopted a Posner-like attention orienting paradigm with central or peripheral cues. Different numbers of cues were used to indicate the potential locations of the target and thereby manipulate the spatial uncertainty level. The time interval between the onsets of the cue and the target (stimulus onset asynchrony, SOA) varied from 50 to 2000 ms. We found that under central cueing, the effect of spatial uncertainty modulation could be detected from 200 to 2000 ms after the presence of the cues. Under peripheral cueing, the effect of spatial uncertainty modulation was observed from 50 to 2000 ms after cueing. Our results demonstrate that spatial uncertainty modulation produces robust and sustained effects on target detection speed. The time course of this modulation is influenced by the cueing method, which suggests that discrepant processing procedures are involved under different cueing conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Dealing with uncertainties in environmental burden of disease assessment

    Directory of Open Access Journals (Sweden)

    van der Sluijs Jeroen P

    2009-04-01

    Full Text Available Abstract Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often have various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making.
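
    As a reminder of the quantity whose uncertainties are being discussed, a minimal DALY calculation (ignoring age weighting and discounting, and using invented figures) looks like this:

    ```python
    # Minimal DALY arithmetic: DALY = YLL + YLD
    # YLL = deaths x standard life expectancy lost per death
    # YLD = cases x disability weight x average duration (years)
    # All figures below are illustrative, not real burden-of-disease data.

    deaths, life_years_lost_per_death = 120, 30.0
    cases, disability_weight, duration_years = 5_000, 0.12, 2.5

    yll = deaths * life_years_lost_per_death
    yld = cases * disability_weight * duration_years
    daly = yll + yld

    attributable_fraction = 0.4   # assumed share attributable to the environmental factor
    print(f"DALYs = {daly:,.0f}, environmental burden = {attributable_fraction * daly:,.0f}")
    ```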

  10. Perceived Uncertainty Sources in Wind Power Plant Design

    Energy Technology Data Exchange (ETDEWEB)

    Damiani, Rick R [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-01-03

    This presentation for the Fourth Wind Energy Systems Engineering Workshop covers some of the uncertainties that still impact turbulent wind operation and how these affect design and structural reliability; identifies key sources and prioritization for R&D; and summarizes an analysis of current procedures, industry best practice, standards, and expert opinions.

  11. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…

  12. 3D Approach for Representing Uncertainties of Underground Utility Data

    NARCIS (Netherlands)

    olde Scholtenhuis, Léon Luc; Zlatanova, S.; den Duijn, Xander; Lin, Ken-Yu; El-Gohary, Nora; Tang, Pingbo

    Availability of 3D underground information models is key to designing and managing urban infrastructure construction projects. Buried utilities information is often registered by using different types of location data with different uncertainties. These data variances are, however, not considered in

  13. Impact of inherent meteorology uncertainty on air quality model predictions

    Science.gov (United States)

    It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is impor...

  14. Neural correlates of intolerance of uncertainty in clinical disorders

    NARCIS (Netherlands)

    Wever, Mirjam; Smeets, Paul; Sternheim, Lot

    2015-01-01

    Intolerance of uncertainty is a key contributor to anxiety-related disorders. Recent studies highlight its importance in other clinical disorders. The link between its clinical presentation and the underlying neural correlates remains unclear. This review summarizes the emerging literature on the

  15. Neural Correlates of Intolerance of Uncertainty in Clinical Disorders

    NARCIS (Netherlands)

    Wever, M.; Smeets, P.A.M.; Sternheim, L.

    2015-01-01

    Intolerance of uncertainty is a key contributor to anxiety-related disorders. Recent studies highlight its importance in other clinical disorders. The link between its clinical presentation and the underlying neural correlates remains unclear. This review summarizes the emerging literature on the

  16. Uncertainty estimation in nuclear material weighing

    Energy Technology Data Exchange (ETDEWEB)

    Thaure, Bernard [Institut de Radioprotection et de Surete Nucleaire, Fontenay aux Roses, (France)

    2011-12-15

    The assessment of nuclear material quantities located in nuclear plants requires knowledge of additions and subtractions of amounts of different types of materials. Most generally, the quantity of nuclear material held is deduced from 3 parameters: a mass (or a volume of product); a concentration of nuclear material in the product considered; and an isotopic composition. Global uncertainties associated with nuclear material quantities depend upon the confidence level of results obtained in the measurement of every different parameter. Uncertainties are generally estimated by considering five influencing parameters (ISHIKAWA's rule): the material itself; the measurement system; the applied method; the environmental conditions; and the operator. A good practice guide, to be used to deal with weighing errors and problems encountered, is presented in the paper.
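
    For the three-parameter product described above (mass, concentration and isotopic composition), the usual GUM-style treatment combines the relative standard uncertainties in quadrature, assuming uncorrelated inputs; the values in the sketch below are invented.

    ```python
    import math

    # Nuclear material quantity Q = mass * concentration * isotopic abundance,
    # with uncorrelated relative standard uncertainties (GUM product rule).
    mass_kg, u_mass = 250.0, 0.05          # weighing result and its standard uncertainty
    conc, u_conc = 0.042, 0.0008           # kg nuclear material per kg of product
    abund, u_abund = 0.93, 0.004           # isotopic abundance of the isotope of interest

    q = mass_kg * conc * abund
    rel_u = math.sqrt((u_mass / mass_kg) ** 2
                      + (u_conc / conc) ** 2
                      + (u_abund / abund) ** 2)
    print(f"Q = {q:.3f} kg  +/- {rel_u * q:.3f} kg  ({rel_u:.2%} relative, k=1)")
    ```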

  17. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address

  18. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  19. Uncertainty quantification of ion chemistry in lean and stoichiometric homogenous mixtures of methane, oxygen, and argon

    KAUST Repository

    Kim, Daesang

    2015-07-01

    Uncertainty quantification (UQ) methods are implemented to obtain a quantitative characterization of the evolution of electrons and ions during the ignition of methane-oxygen mixtures under lean and stoichiometric conditions. The GRI-Mech 3.0 mechanism is combined with an extensive set of ion chemistry pathways and the forward propagation of uncertainty from model parameters to observables is performed using response surfaces. The UQ analysis considers 22 uncertain rate parameters, which include both chemi-ionization, proton transfer, and electron attachment reactions as well as neutral reactions pertaining to the chemistry of the CH radical. The uncertainty ranges for each rate parameter are discussed. Our results indicate that the uncertainty in the time evolution of the electron number density is due mostly to the chemi-ionization reaction CH+O⇌HCO+ +E- and to the main CH consumption reaction CH+O2 ⇌O+HCO. Similar conclusions hold for the hydronium ion H3O+, since electrons and H3O+ account for more than 99% of the total negative and positive charge density, respectively. Surprisingly, the statistics of the number density of charged species show very little sensitivity to the uncertainty in the rate of the recombination reaction H3O+ +E- →products, until very late in the decay process, when the electron number density has fallen below 20% of its peak value. Finally, uncertainties in the secondary reactions within networks leading to the formation of minor ions (e.g., C2H3O+, HCO+, OH-, and O-) do not play any role in controlling the mean and variance of electrons and H3O+, but do affect the statistics of the minor ions significantly. The observed trends point to the role of key neutral reactions in controlling the mean and variance of the charged species number density in an indirect fashion. Furthermore, total sensitivity indices provide quantitative metrics to focus future efforts aiming at improving the rates of key reactions responsible for the
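
    The forward-propagation step can be shown in miniature: uncertain rate multipliers are sampled and pushed through a response surface for the observable, and the output statistics are then summarized. The quadratic surface and uncertainty factors below are entirely made up; the study builds its surfaces from GRI-Mech 3.0 plus the ion chemistry, which is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_samples = 20_000

    # Log-normal multipliers on two rate constants (uncertainty factors ~2 and ~1.5).
    # These ranges are illustrative, not the ones assessed in the study.
    f_chemi = rng.lognormal(mean=0.0, sigma=np.log(2.0) / 2, size=n_samples)   # CH + O -> HCO+ + e-
    f_ch_o2 = rng.lognormal(mean=0.0, sigma=np.log(1.5) / 2, size=n_samples)   # CH + O2 -> O + HCO

    def peak_electron_density(f1, f2):
        """Made-up quadratic response surface for the peak electron number density
        (arbitrary units) as a function of the two rate multipliers."""
        return 1.0 + 0.9 * np.log(f1) - 0.6 * np.log(f2) + 0.1 * np.log(f1) * np.log(f2)

    y = peak_electron_density(f_chemi, f_ch_o2)
    print(f"mean = {y.mean():.3f}, std = {y.std():.3f}")

    # Crude attribution: freeze one uncertain input at its nominal value at a time.
    for name, frozen in [("chemi-ionization", peak_electron_density(1.0, f_ch_o2)),
                         ("CH + O2",          peak_electron_density(f_chemi, 1.0))]:
        print(f"variance with {name} fixed: {frozen.var() / y.var():.2f} of total")
    ```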

  20. Uncertainty quantification of ion chemistry in lean and stoichiometric homogenous mixtures of methane, oxygen, and argon

    KAUST Repository

    Kim, Daesang; Rizzi, Francesco; Cheng, Kwok Wah; Han, Jie; Bisetti, Fabrizio; Knio, Omar Mohamad

    2015-01-01

    Uncertainty quantification (UQ) methods are implemented to obtain a quantitative characterization of the evolution of electrons and ions during the ignition of methane-oxygen mixtures under lean and stoichiometric conditions. The GRI-Mech 3.0 mechanism is combined with an extensive set of ion chemistry pathways and the forward propagation of uncertainty from model parameters to observables is performed using response surfaces. The UQ analysis considers 22 uncertain rate parameters, which include both chemi-ionization, proton transfer, and electron attachment reactions as well as neutral reactions pertaining to the chemistry of the CH radical. The uncertainty ranges for each rate parameter are discussed. Our results indicate that the uncertainty in the time evolution of the electron number density is due mostly to the chemi-ionization reaction CH+O⇌HCO+ +E- and to the main CH consumption reaction CH+O2 ⇌O+HCO. Similar conclusions hold for the hydronium ion H3O+, since electrons and H3O+ account for more than 99% of the total negative and positive charge density, respectively. Surprisingly, the statistics of the number density of charged species show very little sensitivity to the uncertainty in the rate of the recombination reaction H3O+ +E- →products, until very late in the decay process, when the electron number density has fallen below 20% of its peak value. Finally, uncertainties in the secondary reactions within networks leading to the formation of minor ions (e.g., C2H3O+, HCO+, OH-, and O-) do not play any role in controlling the mean and variance of electrons and H3O+, but do affect the statistics of the minor ions significantly. The observed trends point to the role of key neutral reactions in controlling the mean and variance of the charged species number density in an indirect fashion. Furthermore, total sensitivity indices provide quantitative metrics to focus future efforts aiming at improving the rates of key reactions responsible for the

  1. Key improvements to XTR

    NARCIS (Netherlands)

    Lenstra, A.K.; Verheul, E.R.; Okamoto, T.

    2000-01-01

    This paper describes improved methods for XTR key representation and parameter generation (cf. [4]). If the field characteristic is properly chosen, the size of the XTR public key for signature applications can be reduced by a factor of three at the cost of a small one time computation for the

  2. OPPORTUNITIES AND RISKS OF CROSS-BORDER COOPERATION OF REGIONS OF THE SOUTHERN MACROREGION OF RUSSIA AND REGIONS OF THE SOUTH-EAST OF UKRAINE IN THE CONDITIONS OF UNCERTAINTY

    Directory of Open Access Journals (Sweden)

    Inna Mitrofanova

    2015-09-01

    Full Text Available This article investigates the political and economic innovations connected with the transformation of the region «Donbass» in the context of reducing the risks caused by the military-political and economic conflict in Ukraine. On the basis of a creative synthesis of theoretical and practical approaches to studying the evolution of nonlinear economic systems and the formation of megaregions, and of an analysis of the geopolitical situation developing on the world scene between Russia, the USA and China, provisions are developed according to which the region «Donbass» can be considered a «critical point» of European regionalization. The authors believe that an important strategic prospect of border cooperation between Russia and Ukraine is connected with the formation of the cross-border agglomerations «Nizhnedonbassky» and «Verkhnedonbassky», the creation of which consists in realizing a linking function between the Nizhnedonbassky, Volga region, Moscow and Petersburg transport corridors. One of the conditions of social and economic stabilization in the subjects of the foreign Caspian and Black Sea zones is the realization of the geotransit capacity of the region «Donbass» with the formation of a geotransit architecture of its economy. Strategically, the realization of the processes of international city formation is possible either on the basis of federal principles or through a geopolitical split of the territory of the region «Donbass» along the line Kharkov – Donetsk – Lugansk with a tilt towards Russia.

  3. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites

    Science.gov (United States)

    Thomsen, Nanna I.; Binning, Philip J.; McKnight, Ursula S.; Tuxen, Nina; Bjerg, Poul L.; Troldborg, Mads

    2016-05-01

    A key component in risk assessment of contaminated sites is in the formulation of a conceptual site model (CSM). A CSM is a simplified representation of reality and forms the basis for the mathematical modeling of contaminant fate and transport at the site. The CSM should therefore identify the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert opinion at different knowledge levels. The developed BBNs combine data from desktop studies and initial site investigations with expert opinion to assess which of the CSMs are more likely to reflect the actual site conditions. The method is demonstrated on a Danish field site, contaminated with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based on data from three investigation stages (a screening investigation, a more detailed investigation, and an expert consultation) to demonstrate that the belief can be updated as more information
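
    The sequential belief updating over the four conceptual site models reduces to repeated application of Bayes' rule over discrete hypotheses. The priors and stage likelihoods below are invented placeholders for the expert-elicited and data-derived values used in the study.

    ```python
    import numpy as np

    # Four CSMs: (source interpretation) x (geology interpretation)
    csms = ["separate phase + fractured till", "separate phase + unfractured till",
            "no separate phase + fractured till", "no separate phase + unfractured till"]
    belief = np.full(4, 0.25)   # uniform prior belief over the CSMs

    # P(observations | CSM) for three investigation stages; illustrative numbers only.
    stages = {
        "screening":     np.array([0.6, 0.3, 0.4, 0.2]),
        "detailed":      np.array([0.7, 0.2, 0.5, 0.1]),
        "expert review": np.array([0.8, 0.3, 0.4, 0.2]),
    }

    for stage, likelihood in stages.items():
        belief = belief * likelihood
        belief /= belief.sum()          # Bayes' rule: posterior proportional to prior x likelihood
        top = csms[int(np.argmax(belief))]
        print(f"after {stage:13s}: best-supported CSM = {top}  ({belief.max():.2f})")
    ```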

  4. Decision-making under great uncertainty

    International Nuclear Information System (INIS)

    Hansson, S.O.

    1992-01-01

    Five types of decision-uncertainty are distinguished: uncertainty of consequences, of values, of demarcation, of reliance, and of co-ordination. Strategies are proposed for each type of uncertainty. The general conclusion is that it is meaningful for decision theory to treat cases with greater uncertainty than the textbook case of 'decision-making under uncertainty'. (au)

  5. Toward best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts

    Science.gov (United States)

    Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti

    2017-08-01

    Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ˜5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (˜25%) and indicating evidence is sufficient (˜40%)—or uncertainty is completely ignored (˜8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.

  6. Implicit knowledge of visual uncertainty guides decisions with asymmetric outcomes

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2008-01-01

    ... under conditions of uncertainty. Here we show that human observers performing a simple visual choice task under an externally imposed loss function approach the optimal strategy, as defined by Bayesian probability and decision theory (Berger, 1985; Cox, 1961). In concert with earlier work, this suggests ... are pre-existing, widespread, and can be propagated to decision-making areas of the brain ... that observers possess a model of their internal uncertainty and can utilize this model in the neural computations that underlie their behavior (Knill & Pouget, 2004). In our experiment, optimal behavior requires that observers integrate the loss function with an estimate of their internal uncertainty rather ...

  7. CRCP-Prey choice of corallivorous snails and enhanced susceptibility to predation in corals of compromised condition in Florida Keys from 2013-07-02 to 2013-09-04 (NCEI Accession 0162230)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set examined variation in prey choice of C. abbreviata based on host-origin, prey condition, and potential social interaction using paired choice assay...

  8. The Uncertainties of Risk Management

    DEFF Research Database (Denmark)

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged ... for expanding risk management. More generally, such uncertainties relate to the professional identities and responsibilities of operational managers as defined by the framing devices. Originality/value – The paper offers three contributions to the extant literature: first, it shows how risk management itself ...

  9. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  10. Relational uncertainty in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2017-01-01

    ... in service dyads and how they resolve it through suitable organisational responses to increase the level of service quality. Design/methodology/approach: We apply the overall logic of Organisational Information-Processing Theory (OIPT) and present empirical insights from two industrial case studies collected ... Resolving the relational uncertainty increased the functional quality, while resolving the partner's organisational uncertainty increased the technical quality of the delivered service. Originality: We make two contributions. First, we introduce relational uncertainty to the OM literature as the inability to predict and explain the actions of a partnering organisation due to a lack of knowledge about their abilities and intentions. Second, we present suitable organisational responses to relational uncertainty and their effect on service quality.

  11. Advanced LOCA code uncertainty assessment

    International Nuclear Information System (INIS)

    Wickett, A.J.; Neill, A.P.

    1990-11-01

    This report describes a pilot study that identified, quantified and combined uncertainties for the LOBI BL-02 3% small break test. A "dials" version of TRAC-PF1/MOD1, called TRAC-F, was used. (author)

  12. How to live with uncertainties?

    International Nuclear Information System (INIS)

    Michel, R.

    2012-01-01

    In a short introduction, the problem of uncertainty as a general consequence of incomplete information as well as the approach to quantify uncertainty in metrology are addressed. A little history of the more than 30 years of the working group AK SIGMA is followed by an appraisal of its up-to-now achievements. Then, the potential future of the AK SIGMA is discussed based on its actual tasks and on open scientific questions and future topics. (orig.)

  13. Some remarks on modeling uncertainties

    International Nuclear Information System (INIS)

    Ronen, Y.

    1983-01-01

    Several topics related to the question of modeling uncertainties are considered. The first topic is related to the use of the generalized bias operator method for modeling uncertainties. The method is expanded to a more general form of operators. The generalized bias operator is also used in the inverse problem and applied to determine the anisotropic scattering law. The last topic discussed is related to the question of the limit to accuracy and how to establish its value. (orig.) [de

  14. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result
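
    The 'downward' propagation from an uncertain dynamic model to a time-dependent measurement uncertainty can be illustrated with a Monte Carlo sweep over a first-order sensor model with an uncertain time constant; the numbers below are invented and the example is far simpler than the characterization discussed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    t = np.linspace(0.0, 1.0, 201)                 # time axis, s
    tau = rng.normal(0.10, 0.01, size=2_000)       # uncertain sensor time constant, s

    # Step response of a first-order measurement system, one row per sampled model.
    y = 1.0 - np.exp(-t[None, :] / tau[:, None])

    mean = y.mean(axis=0)
    std = y.std(axis=0)                            # time-dependent standard uncertainty
    k = np.argmax(std)
    print(f"largest propagated uncertainty {std[k]:.3f} at t = {t[k]:.3f} s")
    ```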

  15. Optimal Taxation under Income Uncertainty

    OpenAIRE

    Xianhua Dai

    2011-01-01

    Optimal taxation under income uncertainty has been extensively developed in expected utility theory, but it remains an open problem for utility functions that are inseparable between income and effort. As an alternative model of decision-making under uncertainty, prospect theory (Kahneman and Tversky (1979), Tversky and Kahneman (1992)) has obtained empirical support, for example in Kahneman and Tversky (1979) and Camerer and Lowenstein (2003). This paper begins to explore optimal taxation in the context of prospect...

  16. New Perspectives on Policy Uncertainty

    OpenAIRE

    Hlatshwayo, Sandile

    2017-01-01

    In recent years, the ubiquitous and intensifying nature of economic policy uncertainty has made it a popular explanation for weak economic performance in developed and developing markets alike. The primary channel for this effect is decreased and delayed investment as firms adopt a ``wait and see'' approach to irreversible investments (Bernanke, 1983; Dixit and Pindyck, 1994). Deep empirical examination of policy uncertainty's impact is rare because of the difficulty associated in measuring i...

  17. Uncertainty analysis of scintillometers methods in measuring sensible heat fluxes of forest ecosystem

    Science.gov (United States)

    Zheng, N.

    2017-12-01

    Sensible heat flux (H) is one of the driving factors of surface turbulent motion and energy exchange. It is therefore particularly important to measure sensible heat flux accurately at the regional scale. However, due to the heterogeneity of the underlying surface, the hydrothermal regime, and varying weather conditions, it is difficult to estimate a representative flux at the kilometer scale. Since the 1980s, scintillometers have been developed into effective and widely applicable instruments for deriving heat fluxes at the regional scale, based on the turbulence effect of light in the atmosphere. The parameter directly obtained by the scintillometer is the structure parameter of the refractive index of air, derived from fluctuations of light intensity. Combined with parameters such as the temperature structure parameter, zero-plane displacement, surface roughness, wind velocity, air temperature and other meteorological data, heat fluxes can be derived. These additional parameters increase the uncertainty of the flux because of differences between the actual features of turbulent motion and the conditions under which turbulence theory applies. Most previous studies focused on constant-flux layers above the rough sub-layers over homogeneous flat underlying surfaces under suitable weather conditions, so the criteria and modified forms of key parameters have been treated as invariable. In this study, we conduct investigations over the hilly area of northern China with different plants, such as cork oak, cedar-black and locust. On the basis of key research on the saturation threshold and its modified forms under different turbulence intensities, the modified forms of the Bowen ratio under different drying-and-wetting conditions, and the universal function for the temperature structure parameter under different atmospheric stabilities, the dominant sources of uncertainty will be determined. The above study is significant for revealing the influence mechanism of uncertainty and exploring the influence

  18. Position-momentum uncertainty relations in the presence of quantum memory

    DEFF Research Database (Denmark)

    Furrer, Fabian; Berta, Mario; Tomamichel, Marco

    2014-01-01

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear oper....... As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states....

  19. Pharmacological Fingerprints of Contextual Uncertainty.

    Directory of Open Access Journals (Sweden)

    Louise Marshall

    2016-11-01

    Full Text Available Successful interaction with the environment requires flexible updating of our beliefs about the world. By estimating the likelihood of future events, it is possible to prepare appropriate actions in advance and execute fast, accurate motor responses. According to theoretical proposals, agents track the variability arising from changing environments by computing various forms of uncertainty. Several neuromodulators have been linked to uncertainty signalling, but comprehensive empirical characterisation of their relative contributions to perceptual belief updating, and to the selection of motor responses, is lacking. Here we assess the roles of noradrenaline, acetylcholine, and dopamine within a single, unified computational framework of uncertainty. Using pharmacological interventions in a sample of 128 healthy human volunteers and a hierarchical Bayesian learning model, we characterise the influences of noradrenergic, cholinergic, and dopaminergic receptor antagonism on individual computations of uncertainty during a probabilistic serial reaction time task. We propose that noradrenaline influences learning of uncertain events arising from unexpected changes in the environment. In contrast, acetylcholine balances attribution of uncertainty to chance fluctuations within an environmental context, defined by a stable set of probabilistic associations, or to gross environmental violations following a contextual switch. Dopamine supports the use of uncertainty representations to engender fast, adaptive responses.

  20. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given

  1. Uncertainties in carbon residence time and NPP-driven carbon uptake in terrestrial ecosystems of the conterminous USA: a Bayesian approach

    Directory of Open Access Journals (Sweden)

    Xuhui Zhou

    2012-10-01

    Full Text Available Carbon (C) residence time is one of the key factors that determine the capacity of ecosystem C storage. However, its uncertainties have not been well quantified, especially at regional scales. Assessing uncertainties of C residence time is thus crucial for an improved understanding of terrestrial C sequestration. In this study, the Bayesian inversion and Markov Chain Monte Carlo (MCMC) technique were applied to a regional terrestrial ecosystem (TECO-R) model to quantify C residence times and net primary productivity (NPP)-driven ecosystem C uptake and assess their uncertainties in the conterminous USA. The uncertainty was represented by the coefficient of variation (CV). The 13 spatially distributed data sets of C pools and fluxes have been used to constrain the TECO-R model for each biome (eight biomes in total). Our results showed that estimated ecosystem C residence times ranged from 16.6±1.8 yr (cropland) to 85.9±15.3 yr (evergreen needleleaf forest), with an average of 56.8±8.8 yr in the conterminous USA. The ecosystem C residence times and their CV were spatially heterogeneous and varied with vegetation types and climate conditions. Large uncertainties appeared in the southern and eastern USA. Driven by NPP changes from 1982 to 1998, terrestrial ecosystems in the conterminous USA would absorb 0.20±0.06 Pg C yr−1. Their spatial pattern was closely related to the greenness map in the summer, with larger uptake in central and southeast regions. The lack of data or timescale mismatch between the available data and the estimated parameters leads to uncertainties in the estimated C residence times, which together with initial NPP result in the uncertainties in the estimated NPP-driven C uptake. The Bayesian approach with MCMC inversion provides an effective tool to estimate spatially distributed C residence times and assess their uncertainties in the conterminous USA.
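
    The Bayesian inversion step can be pictured with a one-parameter Metropolis–Hastings chain that estimates a residence time from noisy carbon-pool observations, using the steady-state relation pool = NPP × residence time; the data are synthetic and the single-pool model is far simpler than TECO-R.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    npp = 0.5                                   # kg C m-2 yr-1 (assumed known input)
    true_tau = 60.0                             # yr, "true" residence time used to fake data
    sigma_obs = 3.0
    obs = npp * true_tau + rng.normal(0.0, sigma_obs, size=10)   # noisy C-pool observations

    def log_post(tau):
        if not 1.0 < tau < 200.0:               # uniform prior on residence time
            return -np.inf
        resid = obs - npp * tau                 # single-pool steady state: pool = NPP * tau
        return -0.5 * np.sum((resid / sigma_obs) ** 2)

    tau, lp = 30.0, log_post(30.0)
    chain = []
    for _ in range(20_000):                     # Metropolis-Hastings with Gaussian proposal
        prop = tau + rng.normal(0.0, 2.0)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            tau, lp = prop, lp_prop
        chain.append(tau)

    samples = np.array(chain[5_000:])           # discard burn-in
    print(f"tau = {samples.mean():.1f} yr, CV = {samples.std() / samples.mean():.1%}")
    ```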

  2. Risk and uncertainty.

    Science.gov (United States)

    Carter, Tony

    2010-01-01

    Volatile business conditions have led to drastic corporate downsizing, meaning organizations are expected to do more with less. Managers must be more knowledgeable and possess a more eclectic myriad of business skills, many of which have not even been seen until recently. Many internal and external changes have occurred to organizations that have dictated the need to do business differently. Changes such as technological advances; globalization; catastrophic business crises; a more frantic competitive climate; and more demanding, sophisticated customers are examples of some of the shifts in the external business environment. Internal changes to organizations have been in the form of reengineering, accompanied by structural realignments and downsizing; greater emphasis on quality levels in product and service output; faster communication channels; and a more educated, skilled employee base with higher expectations from management.

  3. Supporting qualified database for V and V and uncertainty evaluation of best-estimate system codes

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2014-01-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging suitable experimental data on one side and qualified code calculation results on the other. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can also be used in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the material and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QP' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The EH (Engineering

  4. Key Facts about Tularemia

    Science.gov (United States)

    This fact ... and Prevention (CDC) Tularemia Web site. What is Tularemia? Tularemia is a potentially serious illness that occurs ...

  5. Key technologies book

    International Nuclear Information System (INIS)

    1997-01-01

    This book gathers all the useful information on the key technologies for French industry in the years 2000-2005. 136 technologies at the junction of scientific advances and market expectations are divided into 9 sectors. Among them, only 4 are of interest here: the environment, transport, materials and energy. In 1995, the Secretary of State for Industry published a first synthesis book on these key technologies. This new 1997 key technologies book extends and completes the initial study. For each key technology, an encyclopedic sheet is given. Each sheet thus combines precise and practical information on: the state of advancement of the technology, market characteristics, development forecasts, the occupations and sectors involved, the technology acquisition cost, research programs, and also contacts in the main efficiency poles concerned. (O.M.)

  6. The Key Lake project

    International Nuclear Information System (INIS)

    Glattes, G.

    1985-01-01

    Aspects of project financing for the share of the Canadian subsidiary of Uranerzbergbau-GmbH, Bonn, in the uranium mining and milling facility at Key Lake, Saskatchewan, by a Canadian bank syndicate. (orig.) [de

  7. Developing first time-series of land surface temperature from AATSR with uncertainty estimates

    Science.gov (United States)

    Ghent, Darren; Remedios, John

    2013-04-01

    Land surface temperature (LST) is the radiative skin temperature of the land, and is one of the key parameters in the physics of land-surface processes on regional and global scales. Earth Observation satellites provide the opportunity to obtain global coverage of LST approximately every 3 days or less. One such source of satellite-retrieved LST has been the Advanced Along-Track Scanning Radiometer (AATSR), with LST retrieval being implemented in the AATSR Instrument Processing Facility in March 2004. Here we present the first regional and global time-series of LST data from AATSR with estimates of uncertainty. Mean changes in temperature over the last decade will be discussed along with regional patterns. Although time-series across all three ATSR missions have previously been constructed (Kogler et al., 2012), the use of low-resolution auxiliary data in the retrieval algorithm and non-optimal cloud masking resulted in time-series artefacts. As such, considerable ESA-supported development has been carried out on the AATSR data to address these concerns. This includes the integration of high-resolution auxiliary data into the retrieval algorithm and the subsequent generation of coefficients and tuning parameters, plus the development of an improved cloud mask based on the simulation of clear-sky conditions from radiative transfer modelling (Ghent et al., in prep.). Any inference from this LST record is, though, of limited value without an accompanying uncertainty estimate; the Joint Committee for Guides in Metrology defines uncertainty as "a parameter associated with the result of a measurement that characterizes the dispersion of the values that could reasonably be attributed to the measurand", the measurand being the value of the particular quantity to be measured. Furthermore, pixel-level uncertainty fields are a mandatory requirement in the on-going preparation of the LST product for the upcoming Sea and Land Surface Temperature Radiometer (SLSTR) instrument on-board Sentinel-3

  8. Uncertainty Management for Diagnostics and Prognostics of Batteries using Bayesian Techniques

    Science.gov (United States)

    Saha, Bhaskar; Goebel, Kai

    2007-01-01

    Uncertainty management has always been the key hurdle faced by diagnostics and prognostics algorithms. A Bayesian treatment of this problem provides an elegant and theoretically sound approach to the modern Condition-Based Maintenance (CBM)/Prognostic Health Management (PHM) paradigm. The application of Bayesian techniques to regression and classification in the form of the Relevance Vector Machine (RVM), and to state estimation as in Particle Filters (PF), provides a powerful tool to integrate the diagnosis and prognosis of battery health. The RVM, which is a Bayesian treatment of the Support Vector Machine (SVM), is used for model identification, while the PF framework uses the learnt model, statistical estimates of noise and anticipated operational conditions to provide estimates of remaining useful life (RUL) in the form of a probability density function (PDF). This type of prognostics adds significant value to the management of any operation involving electrical systems.
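
    As a rough illustration of the PF step summarized above, the sketch below tracks a hypothetical capacity-fade state with a particle filter and extrapolates each particle to an end-of-life threshold to obtain a RUL distribution. The linear fade model, noise levels and threshold are assumptions for illustration, not the authors' battery model.

```python
import numpy as np

# Sketch of a particle filter producing a remaining-useful-life (RUL) distribution
# for a battery, in the spirit of the PF/RUL approach summarized above. The linear
# fade model, noise levels, and end-of-life threshold are illustrative assumptions.

rng = np.random.default_rng(1)
n_particles, eol = 2000, 0.7                     # end-of-life capacity threshold

# Each particle carries a capacity estimate and a per-cycle fade rate.
cap = rng.normal(1.0, 0.01, n_particles)
rate = rng.uniform(0.001, 0.004, n_particles)

def measurement(true_cap):
    return true_cap + rng.normal(0.0, 0.005)     # noisy capacity measurement

true_cap, true_rate = 1.0, 0.002
for cycle in range(50):
    true_cap -= true_rate
    z = measurement(true_cap)
    # Propagate particles and weight them by measurement likelihood.
    cap = cap - rate + rng.normal(0.0, 0.002, n_particles)
    w = np.exp(-0.5 * ((z - cap) / 0.005) ** 2)
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)   # multinomial resampling
    cap, rate = cap[idx], rate[idx]

rul = np.maximum(cap - eol, 0.0) / rate          # cycles until threshold, per particle
print(f"RUL ~ {np.median(rul):.0f} cycles (5th-95th pct: "
      f"{np.percentile(rul, 5):.0f}-{np.percentile(rul, 95):.0f})")
```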

  9. Mastering demand and supply uncertainty with combined product and process configuration

    NARCIS (Netherlands)

    Verdouw, C.N.; Beulens, A.J.M.; Trienekens, J.H.; Verwaart, D.

    2010-01-01

    The key challenge for mastering high uncertainty of both demand and supply is to attune products and business processes in the entire supply chain continuously to customer requirements. Product configurators have proven to be powerful tools for managing demand uncertainty. This paper assesses how

  10. Uncertainties in modelling the spatial and temporal variations in aerosol concentrations

    NARCIS (Netherlands)

    Meij, de A.

    2009-01-01

    Aerosols play a key role in air quality (health aspects) and climate. In this thesis atmospheric chemistry transport models are used to study the uncertainties in aerosol modelling and to evaluate the effects of emission reduction scenarios on air quality. Uncertainties in: the emissions of gas and

  11. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    Science.gov (United States)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert

    2016-06-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if other factors are involved. With added experience, uncertainty could be expected to diminish, but perhaps more influential are things like physician confidence, belief in the veracity of what is published, and even one's religious beliefs. In addition, it is plausible that the kind of practice a physician works in can affect the experience of uncertainty. Practicing physicians may not be immediately aware of these effects on how uncertainty is experienced in their clinical decision-making. We asked: (1) Does uncertainty and overconfidence bias decrease with years of practice? (2) What sociodemographic factors are independently associated with less recognition of uncertainty, in particular belief in God or other deity or deities, and how is atheism associated with recognition of uncertainty? (3) Do confidence bias (confidence that one's skill is greater than it actually is), degree of trust in the orthopaedic evidence, and degree of statistical sophistication correlate independently with recognition of uncertainty? We created a survey to establish an overall recognition of uncertainty score (four questions), trust in the orthopaedic evidence base (four questions), confidence bias (three questions), and statistical understanding (six questions). Seven hundred six members of the Science of Variation Group, a collaboration that aims to study variation in the definition and treatment of human illness, were approached to complete our survey. This group represents mainly orthopaedic surgeons specializing in trauma or hand and wrist surgery, practicing in Europe and North America, of whom the majority is involved in teaching. Approximately half of the group has more than 10 years

  12. Key energy technologies for Europe

    Energy Technology Data Exchange (ETDEWEB)

    Holst Joergensen, Birte

    2005-09-01

    The report is part of the work undertaken by the High-Level Expert Group to prepare a report on emerging science and technology trends and the implications for EU and Member State research policies. The outline of the report is: 1) In the introductory section, energy technologies are defined and, for analytical reasons, further narrowed down; 2) The description of the socio-economic challenges facing Europe in the energy field is based on the analysis made by the International Energy Agency, going back to 1970 and with forecasts to 2030. Both the world situation and the European situation are described. This section also contains an overview of the main EU policy responses to energy. Both EU and Member State energy R and D resources are described in view of international efforts; 3) The description of the science and technology base is made for selected energy technologies, including energy efficiency, biomass, hydrogen and fuel cells, photovoltaics, clean fossil fuel technologies and CO{sub 2} capture and storage, and nuclear fission and fusion. Where possible, a SWOT analysis is made for each technology and finally summarised; 4) The forward look highlights some of the key problems and uncertainties related to the future energy situation. Examples of recent energy foresights are given, including national energy foresights in Sweden and the UK as well as links to a number of regional and national foresights and roadmaps; 5) Appendix 1 contains a short description of key international organisations dealing with energy technologies and energy research. (ln)

  13. Exploring the notion of a 'capability for uncertainty' and the implications for leader development

    Directory of Open Access Journals (Sweden)

    Kathy Bennett

    2016-10-01

    them to engage more effectively with uncertainty and to more positively manage their experience of uncertainty in these increasingly turbulent times. Contribution/value-add: The key contribution is the identification of five crucial components constituting a capability for uncertainty, which can be used to inform leadership development interventions designed to develop such capability in leaders.

  14. Sources/treatment of uncertainties in the performance assessment of geologic radioactive waste repositories

    International Nuclear Information System (INIS)

    Cranwell, R.M.

    1987-01-01

    Uncertainties in the performance assessment of geologic radioactive waste repositories have several sources. The more important ones include: 1) uncertainty in the conditions of a disposal system over the temporal scales set forth in regulations, 2) uncertainty in the conceptualization of the geohydrologic system, 3) uncertainty in the theoretical description of a given conceptual model of the system, 4) uncertainty in the development of computer codes to implement the solution of a mathematical model, and 5) uncertainty in the parameters and data required in the models and codes used to assess the long-term performance of the disposal system. This paper discusses each of these uncertainties and outlines methods for addressing them.

  15. Uncertainty Management for Diagnostics and Prognostics of Batteries using Bayesian Techniques

    Data.gov (United States)

    National Aeronautics and Space Administration — Uncertainty management has always been the key hurdle faced by diagnostics and prognostics algorithms. A Bayesian treatment of this problem provides an elegant and...

  16. 3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA

    Energy Technology Data Exchange (ETDEWEB)

    Flach, Greg [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Wohlwend, Jen [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-02

    This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.

  17. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  18. Ruminations On NDA Measurement Uncertainty Compared TO DA Uncertainty

    International Nuclear Information System (INIS)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-01-01

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  19. ESFR core optimization and uncertainty studies

    International Nuclear Information System (INIS)

    Rineiski, A.; Vezzoni, B.; Zhang, D.; Marchetti, M.; Gabrielli, F.; Maschek, W.; Chen, X.-N.; Buiron, L.; Krepel, J.; Sun, K.; Mikityuk, K.; Polidoro, F.; Rochman, D.; Koning, A.J.; DaCruz, D.F.; Tsige-Tamirat, H.; Sunderland, R.

    2015-01-01

    In the European Sodium Fast Reactor (ESFR) project supported by EURATOM in 2008-2012, a concept for a large 3600 MWth sodium-cooled fast reactor design was investigated. In particular, reference core designs with oxide and carbide fuel were optimized to improve their safety parameters. Uncertainties in these parameters were evaluated for the oxide option. Core modifications were performed first to reduce the sodium void reactivity effect. Introduction of a large sodium plenum with an absorber layer above the core and a lower axial fertile blanket improves the total sodium void effect appreciably, bringing it close to zero for a core with fresh fuel, in line with results obtained worldwide, while not substantially influencing other core physics parameters. Therefore an optimized configuration, CONF2, with a sodium plenum and a lower blanket was established first and used as a basis for further studies in view of the deterioration of safety parameters during reactor operation. Further options studied were an inner fertile blanket, the introduction of moderator pins, a smaller core height, and special designs for pins, such as 'empty' pins, and subassemblies. These special designs were proposed to facilitate melted-fuel relocation in order to avoid core re-criticality under severe accident conditions. In this paper, further CONF2 modifications are compared in terms of safety and fuel balance. They may bring further improvements in safety, but their accurate assessment requires additional studies, including transient analyses. Uncertainty studies were performed by employing a so-called Total Monte Carlo method, in which a large number of nuclear data files is produced for single isotopes and then used in Monte Carlo calculations. The uncertainties in criticality, the sodium void and Doppler effects, and the effective delayed neutron fraction due to uncertainties in basic nuclear data were assessed for an ESFR core. They prove applicability of the available nuclear data for ESFR
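
    The Total Monte Carlo idea mentioned here (repeat the whole calculation with many randomly sampled nuclear-data realizations and take the spread of the results as the data-induced uncertainty) can be sketched schematically as follows. The surrogate 'core_calculation' and the 1% perturbations are placeholders, not ESFR physics.

```python
import numpy as np

# Schematic of a Total Monte Carlo style propagation: generate many random
# realizations of nuclear data, rerun the core calculation for each, and take the
# spread of the results as the data-induced uncertainty. The 'core_calculation'
# surrogate and the 1% cross-section perturbations below are placeholders.

rng = np.random.default_rng(2)

def core_calculation(xs_perturbation):
    # Stand-in for a full neutronics run; returns a k-eff-like response.
    return 1.01 + 0.8 * xs_perturbation.mean()

n_files = 500
keff = []
for _ in range(n_files):
    perturbation = rng.normal(0.0, 0.01, size=50)   # +/-1% on 50 nuclear-data params
    keff.append(core_calculation(perturbation))

keff = np.array(keff)
print(f"k-eff = {keff.mean():.5f} +/- {keff.std():.5f} (nuclear-data uncertainty)")
```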

  20. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.

  1. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    The effects of data uncertainty in applications of the critical loads concept were investigated at different spatial resolutions in Sweden and the northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods of differing structural complexity were used to estimate the adverse effects of SO{sub 2} concentrations in the northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site-specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical loads estimates at 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs involved more constrained confidence intervals for a given percentile. Differentiation as well as identification of grid cells at 150 x 150 km resolution subject to EX was generally improved. Calculation of the probability of EX was shown to preserve the possibility to differentiate between geographical units. Re-aggregation of the 95%-ile EX at 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO{sub 2} concentrations. Modifying SO{sub 2} concentrations by accounting for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability and effects of data uncertainty. Data

  2. Wind energy: Overcoming inadequate wind and modeling uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kane, Vivek

    2010-09-15

    'Green Energy' is the call of the day, and the significance of Wind Energy can never be overemphasized. But the key question here is - what if the wind resources are inadequate? Studies reveal that the probability of finding favorable wind at a given place on land is only 15%. Moreover, there are inherent uncertainties associated with the wind business. Can we overcome inadequate wind resources? Can we scientifically quantify uncertainty and model it to make business sense? This paper proposes a solution, by way of breakthrough Wind Technologies combined with advanced tools for Financial Modeling, enabling vital business decisions.

  3. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.
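
    The five methods themselves are not detailed in the abstract, but the flavour of such a sampling comparison can be sketched with plain Monte Carlo versus a scrambled Sobol (quasi-Monte Carlo) sequence, both propagating an uncertain angle of attack through a stand-in lift model. The surrogate model and the 0.5-degree perturbation are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import norm, qmc

# Illustrative comparison of Monte Carlo vs. quasi-Monte Carlo sampling for
# propagating an uncertain angle of attack through a surrogate lift model.
# The surrogate and the 0.5-degree perturbation are placeholders, not the
# RANS/TAU setup described in the abstract.

def lift_coefficient(alpha_deg):
    # Toy thin-airfoil-like response with a mild nonlinearity.
    return 0.11 * alpha_deg - 0.002 * alpha_deg ** 2

alpha_mean, alpha_std, n = 4.0, 0.5, 1024
rng = np.random.default_rng(3)

# Plain Monte Carlo sample of the uncertain angle of attack.
mc_alpha = rng.normal(alpha_mean, alpha_std, n)
mc_cl = lift_coefficient(mc_alpha)

# Scrambled Sobol (quasi-Monte Carlo) sample mapped to the same normal distribution.
sobol = qmc.Sobol(d=1, scramble=True, seed=3)
u = np.clip(sobol.random(n)[:, 0], 1e-12, 1.0 - 1e-12)
qmc_alpha = alpha_mean + alpha_std * norm.ppf(u)
qmc_cl = lift_coefficient(qmc_alpha)

print(f"MC : mean CL = {mc_cl.mean():.4f}, std = {mc_cl.std():.4f}")
print(f"QMC: mean CL = {qmc_cl.mean():.4f}, std = {qmc_cl.std():.4f}")
```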

  4. Uncertainty in spatial planning proceedings

    Directory of Open Access Journals (Sweden)

    Aleš Mlakar

    2009-01-01

    Full Text Available Uncertainty is distinctive of spatial planning, as it arises from the necessity to co-ordinate the various interests within an area, from the urgency of adopting spatial planning decisions, from the complexity of the environment, physical space and society, from addressing the uncertainty of the future, and from the uncertainty of actually making the right decision. The response to uncertainty is a series of measures that mitigate the effects of uncertainty itself. These measures are based on two fundamental principles – standardization and optimization. The measures relate to knowledge enhancement and the comprehension of spatial planning, to the legal regulation of changes, to the existence of spatial planning as a means of co-ordinating different interests, to active planning and the constructive resolution of current spatial problems, to the integration of spatial planning and the environmental protection process, to the implementation of analysis as the foundation of spatial planners' activities, to methods of thinking outside the parameters, to forming clear spatial concepts, and to creating a transparent spatial management system, and also to the enforcement of participatory processes.

  5. Uncertainty modeling and decision support

    International Nuclear Information System (INIS)

    Yager, Ronald R.

    2004-01-01

    We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to gain a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function.
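
    The abstract stresses the role of the decision maker's attitude when deciding under ignorance. One classical way to encode that attitude (not necessarily the formulation used by the author) is the Hurwicz criterion, sketched below with a hypothetical payoff matrix.

```python
import numpy as np

# Hurwicz criterion for decision making under ignorance: each alternative is scored
# by a blend of its best and worst outcomes, weighted by an "optimism" coefficient
# that encodes the decision maker's attitude. The payoff matrix is hypothetical.

payoffs = np.array([            # rows: alternatives, columns: states of nature
    [70, 40, -10],
    [50, 45,  20],
    [90, 10, -40],
])
alternatives = ["A1", "A2", "A3"]

def hurwicz_choice(payoffs, optimism):
    scores = optimism * payoffs.max(axis=1) + (1.0 - optimism) * payoffs.min(axis=1)
    return scores, int(np.argmax(scores))

for optimism in (0.0, 0.5, 1.0):        # pessimist, neutral, optimist
    scores, best = hurwicz_choice(payoffs, optimism)
    print(f"optimism={optimism:.1f}: scores={scores}, choose {alternatives[best]}")
```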

  6. Evaluation of the uncertainty of environmental measurements of radioactivity

    International Nuclear Information System (INIS)

    Heydorn, K.

    2003-01-01

    Full text: The almost universal acceptance of the concept of uncertainty has led to its introduction into the ISO 17025 standard for general requirements to testing and calibration laboratories. This means that not only scientists, but also legislators, politicians, the general population - and perhaps even the press - expect to see all future results associated with an expression of their uncertainty. Results obtained by measurement of radioactivity have routinely been associated with an expression of their uncertainty, based on the so-called counting statistics. This is calculated together with the actual result on the assumption that the number of counts observed has a Poisson distribution with equal mean and variance. Most of the nuclear scientific community has therefore assumed that it already complied with the latest ISO 17025 requirements. Counting statistics, however, express only the variability observed among repeated measurements of the same sample under the same counting conditions, which is equivalent to the term repeatability used in quantitative analysis. Many other sources of uncertainty need to be taken into account before a statement of the uncertainty of the actual result can be made. As the first link in the traceability chain calibration is always an important uncertainty component in any kind of measurement. For radioactivity measurements in particular we find that counting geometry assumes the greatest importance, because it is often not possible to measure a standard and a control sample under exactly the same conditions. In the case of large samples we have additional uncertainty components associated with sample heterogeneity and its influence on self-absorption and counting efficiency. In low-level environmental measurements we have an additional risk of sample contamination, but the most important contribution to uncertainty is usually the representativity of the sample being analysed. For uniform materials this can be expressed by the
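
    The point that counting statistics is only one of several uncertainty components can be illustrated with a short GUM-style combination in quadrature; the counts and the component values below are invented for illustration.

```python
import math

# Combine the Poisson counting uncertainty with other (assumed) relative uncertainty
# components in quadrature, GUM-style. All component values are illustrative.

gross_counts = 2500                      # counts observed for the sample
background = 400                         # counts attributed to background
net = gross_counts - background

# Counting statistics: the variance of a Poisson count equals its mean, so the
# net-count variance is the sum of the gross and background variances.
u_counting_rel = math.sqrt(gross_counts + background) / net

other_components_rel = {
    "calibration (standard activity)": 0.020,
    "counting geometry / positioning": 0.030,
    "self-absorption in large sample": 0.015,
    "sample representativity":         0.050,
}

u_combined_rel = math.sqrt(
    u_counting_rel ** 2 + sum(u ** 2 for u in other_components_rel.values())
)

print(f"counting statistics alone : {u_counting_rel:.1%}")
print(f"combined standard uncert. : {u_combined_rel:.1%}")
print(f"expanded (k=2)            : {2 * u_combined_rel:.1%}")
```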

  7. Uncertainties in workplace external dosimetry - An analytical approach

    International Nuclear Information System (INIS)

    Ambrosi, P.

    2006-01-01

    The uncertainties associated with external dosimetry measurements at workplaces depend on the type of dosemeter used, together with its performance characteristics and the information available on the measurement conditions. Performance characteristics were determined in the course of a type test, and information about the measurement conditions can either be general, e.g. 'research' and 'medicine', or specific, e.g. 'X-ray testing equipment for aluminium wheel rims'. This paper explains an analytical approach to determining the measurement uncertainty. It is based on the Draft IEC Technical Report IEC 62461, Radiation Protection Instrumentation - Determination of Uncertainty in Measurement. Neither this paper nor the report can eliminate the fact that the determination of the uncertainty requires a larger effort than performing the measurement itself. As a counterbalance, the process of determining the uncertainty results not only in a numerical value of the uncertainty but also produces the best estimate of the quantity to be measured, which may differ from the indication of the instrument. Thus it also improves the result of the measurement. (authors)

  8. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability assigned to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, the author finds further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
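
    The coin-tossing claim, that seeing a head must increase the probability assigned to heads unless one is absolutely sure of the coin, can be checked numerically with a Beta distribution standing in for the 'definitive number'; the prior parameters are arbitrary illustrative choices.

```python
# Numerical check of the coin-tossing point above: if the "definitive" probability
# of heads is uncertain (here described by a Beta prior), the predictive probability
# of heads is its mean, and observing a head raises that mean. The prior parameters
# are arbitrary; only in the limit of near-certainty does the update vanish.

def beta_mean(a, b):
    return a / (a + b)

for a, b in [(1.0, 1.0), (10.0, 10.0), (1000.0, 1000.0)]:
    prior = beta_mean(a, b)                 # P(heads) before any toss
    posterior = beta_mean(a + 1.0, b)       # P(heads) after observing one head
    print(f"Beta({a:g},{b:g}): before = {prior:.4f}, after a head = {posterior:.4f}")
```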

  9. Internal design of technical systems under conditions of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Krasnoshchekov, P S; Morozov, V V; Fedorov, V V

    1982-03-01

    An investigation is made of a model of internal design of a complex technical system in the presence of uncertain factors. The influence of an opponent on the design is examined. The concepts of hierarchical and balanced compatibility between the criteria of the designer, the opponent and the segregations functions are introduced and studied. The connection between the approach proposed and the methods of artificial intelligence is discussed. 5 references.

  10. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    Science.gov (United States)

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and computing event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
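
    The derived policy itself is not given in the abstract; a minimal sketch of a cost-based rule with an abstain option might look like the following, where the event probability would come from the joint model and the cost values are invented.

```python
# Sketch of a cost-sensitive event-prediction policy with an abstain option, in the
# spirit of the decision rule summarized above. The probability p would come from
# the joint model; the cost values below are invented for illustration.

COST_MISS = 10.0          # cost of failing to flag an event that occurs
COST_FALSE_ALARM = 1.0    # cost of flagging an event that does not occur
COST_ABSTAIN = 0.5        # cost of deferring the decision (e.g. requesting more data)

def decide(p_event):
    expected_cost = {
        "alarm":    (1.0 - p_event) * COST_FALSE_ALARM,
        "no-alarm": p_event * COST_MISS,
        "abstain":  COST_ABSTAIN,
    }
    return min(expected_cost, key=expected_cost.get), expected_cost

for p in (0.02, 0.20, 0.70):
    action, costs = decide(p)
    print(f"p(event)={p:.2f} -> {action}  (expected costs: "
          + ", ".join(f"{k}={v:.2f}" for k, v in costs.items()) + ")")
```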

  11. On the uncertainty principle. V

    International Nuclear Information System (INIS)

    Halpern, O.

    1976-01-01

    The treatment of ideal experiments connected with the uncertainty principle is continued. The author analyzes successively measurements of momentum and position, and discusses the common reason why the results in all cases differ from the conventional ones. A similar difference exists for the measurement of field strengths. The interpretation given by Weizsaecker, who tried to interpret Bohr's complementarity principle by introducing a multi-valued logic is analyzed. The treatment of the uncertainty principle ΔE Δt is deferred to a later paper as is the interpretation of the method of variation of constants. Every ideal experiment discussed shows various lower limits for the value of the uncertainty product which limits depend on the experimental arrangement and are always (considerably) larger than h. (Auth.)

  12. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    2007-01-01

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems

  13. Correlated uncertainties in integral data

    International Nuclear Information System (INIS)

    McCracken, A.K.

    1978-01-01

    The use of correlated uncertainties in calculational data is shown, in the cases investigated, to lead to a reduction in the uncertainty of calculated quantities of importance to reactor design. It is stressed, however, that such reductions are likely to be important in only a minority of cases of practical interest. The effect of uncertainties in detector cross-sections is considered and is seen to be, in some cases, of equal importance to that of the data used in calculations. Numerical investigations have been limited by the sparse information available on data correlations; some comparisons made of these data reveal quite large inconsistencies for both detector cross-sections and cross-sections of interest for reactor calculations.
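
    The effect described here, namely that correlations between data uncertainties can reduce the uncertainty of a derived quantity, follows from the usual sandwich propagation formula u^2(Q) = s^T C s; the sensitivities and correlation value in the sketch below are invented to show the sign of the effect.

```python
import numpy as np

# Propagate data uncertainties to a derived quantity Q via the sandwich rule
# u^2(Q) = s^T C s, with and without correlation between the two data items.
# Sensitivities and uncertainties are invented to illustrate the effect only.

s = np.array([1.0, -0.8])            # sensitivities dQ/dx_i (opposite signs)
sigma = np.array([0.05, 0.05])       # standard uncertainties of the data

def u_of_q(correlation):
    cov = np.diag(sigma ** 2)
    cov[0, 1] = cov[1, 0] = correlation * sigma[0] * sigma[1]
    return float(np.sqrt(s @ cov @ s))

print(f"uncorrelated data : u(Q) = {u_of_q(0.0):.4f}")
# With opposite-sign sensitivities, a positive correlation partially cancels.
print(f"correlation = +0.9: u(Q) = {u_of_q(0.9):.4f}")
```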

  14. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  15. Uncertainty, God, and scrupulosity: Uncertainty salience and priming God concepts interact to cause greater fears of sin.

    Science.gov (United States)

    Fergus, Thomas A; Rowatt, Wade C

    2015-03-01

    Difficulties tolerating uncertainty are considered central to scrupulosity, a moral/religious presentation of obsessive-compulsive disorder (OCD). We examined whether uncertainty salience (i.e., exposure to a state of uncertainty) caused fears of sin and fears of God, as well as whether priming God concepts affected the impact of uncertainty salience on those fears. An internet sample of community adults (N = 120) who endorsed holding a belief in God or a higher power were randomly assigned to an experimental manipulation of (1) salience (uncertainty or insecurity) and (2) prime (God concepts or neutral). As predicted, participants who received the uncertainty salience and God concept priming reported the greatest fears of sin. There were no mean-level differences in the other conditions. The effect was not attributable to religiosity and the manipulations did not cause negative affect. We used a nonclinical sample recruited from the internet. These results support cognitive-behavioral models suggesting that religious uncertainty is important to scrupulosity. Implications of these results for future research are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. UNCERTAINTY IN THE PROCESS INTEGRATION FOR THE BIOREFINERIES DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Meilyn González Cortés

    2015-07-01

    Full Text Available This paper presents how design approaches with a high level of flexibility can reduce the additional costs of strategies that apply overdesign factors to account for uncertain parameters that affect the economic feasibility of a project. The elements with associated uncertainties that are important in the configuration of process integration under a biorefinery scheme are: the raw material, the raw material conversion technologies, and the variety of products that can be obtained. The analysis shows that the raw materials and products with potential in a biorefinery scheme are subject to external uncertainties such as availability, demand and market prices. The impact of those external uncertainties on the biorefinery can then be determined, and for product prices minimum and maximum limits can be identified as intervals that should be considered in the economic evaluation of the project and in the sensitivity analysis under varied conditions.

  17. Uncertainty relation and simultaneous measurements in quantum theory

    International Nuclear Information System (INIS)

    Busch, P.

    1982-01-01

    In this thesis the question of the interpretation of the uncertainty relation is taken up, and a program for the justification of its individualistic interpretation is formulated. By means of quantum mechanical models of the position and momentum measurement, a justification of the interpretation has been attempted by reconstructing the origin of the uncertainties from the conditions of the measuring devices and determining the relation of the measured results to the object. By means of a model of the simultaneous measurement it could be shown how the uncertainty relation results from the ineliminable mutual disturbance of the devices and the uncertainty relation for the measuring system. So, finally, the commutation relation is conclusive. For illustration the slit experiment is discussed, first according to Heisenberg with a fixed slit, then for the quantum mechanical, movable slit (Bohr-Einstein). (orig./HSI) [de

  18. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    International Nuclear Information System (INIS)

    Sciacchitano, Andrea; Scarano, Fulvio; Neal, Douglas R; Smith, Barton L; Warner, Scott O; Vlachos, Pavlos P; Wieneke, Bernhard

    2015-01-01

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)

  19. Comparison is key.

    Science.gov (United States)

    Stone, Mark H; Stenner, A Jackson

    2014-01-01

    Several concepts from Georg Rasch's last papers are discussed. The key one is comparison because Rasch considered the method of comparison fundamental to science. From the role of comparison stems scientific inference made operational by a properly developed frame of reference producing specific objectivity. The exact specifications Rasch outlined for making comparisons are explicated from quotes, and the role of causality derived from making comparisons is also examined. Understanding causality has implications for what can and cannot be produced via Rasch measurement. His simple examples were instructive, but the implications are far reaching upon first establishing the key role of comparison.

  20. Key World Energy Statistics

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    The IEA produced its first handy, pocket-sized summary of key energy data in 1997. This new edition responds to the enormously positive reaction to the book since then. Key World Energy Statistics produced by the IEA contains timely, clearly-presented data on supply, transformation and consumption of all major energy sources. The interested businessman, journalist or student will have at his or her fingertips the annual Canadian production of coal, the electricity consumption in Thailand, the price of diesel oil in Spain and thousands of other useful energy facts. It exists in different formats to suit our readers' requirements.