WorldWideScience

Sample records for identify important uncertainties

  1. Global sensitivity analysis for identifying important parameters of nitrogen nitrification and denitrification under model uncertainty and scenario uncertainty

    Science.gov (United States)

    Chen, Zhuowei; Shi, Liangsheng; Ye, Ming; Zhu, Yan; Yang, Jinzhong

    2018-06-01

Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. Using a new variance-based global sensitivity analysis method, this paper identifies important parameters for nitrogen reactive transport while simultaneously considering these three sources of uncertainty. A combination of three soil temperature scenarios and two soil moisture scenarios yields a total of six scenarios. Four alternative models describing the effect of soil temperature and moisture content are used to evaluate the reduction functions used for calculating actual reaction rates. The results show that for the nitrogen reactive transport problem, parameter importance varies substantially among models and scenarios. The denitrification and nitrification processes are sensitive to soil moisture content status rather than to the moisture function parameter. The nitrification process becomes more important at low moisture content and low temperature. However, how the importance of nitrification activity changes with temperature depends strongly on the selected model. Model averaging is suggested to assess the nitrification (or denitrification) contribution while reducing possible model error. Whether or not biochemical heterogeneity is introduced, a fairly consistent parameter importance ranking is obtained in this study: the optimal denitrification rate (Kden) is the most important parameter; the reference temperature (Tr) is more important than the temperature coefficient (Q10); and the empirical constant in the moisture response function (m) is the least important. The vertical distribution of soil moisture, but not of temperature, plays the predominant role in controlling nitrogen reactions. This study provides insight into nitrogen reactive transport modeling and demonstrates an effective strategy for selecting important parameters when future temperature and soil moisture carry uncertainties or when modelers are faced with multiple ways of establishing nitrogen
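The variance-based global sensitivity analysis the abstract refers to can be illustrated with a standard pick-freeze estimator of first-order Sobol indices. This is a minimal sketch on a toy linear model, not the paper's nitrogen model; all names and coefficients here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Toy additive model standing in for the reactive transport model
    return 4 * x[:, 0] + 2 * x[:, 1] + x[:, 2]

n, d = 100_000, 3
A = rng.uniform(size=(n, d))
B = rng.uniform(size=(n, d))
y_a = model(A)

# Pick-freeze estimator: S_i = Cov(Y_A, Y_ABi) / Var(Y_A),
# where matrix ABi takes column i from A and all other columns from B
S = np.empty(d)
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]
    y_abi = model(ABi)
    S[i] = np.mean(y_a * y_abi) - np.mean(y_a) * np.mean(y_abi)
S /= np.var(y_a)
print(np.round(S, 2))  # analytic values are 16/21, 4/21, 1/21 for this model
```

For an additive linear model the first-order indices are just each coefficient's squared contribution over the total variance, which makes the estimator easy to check before applying it to a real simulator.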

  2. Reducing uncertainty at minimal cost: a method to identify important input parameters and prioritize data collection

    NARCIS (Netherlands)

    Uwizeye, U.A.; Groen, E.A.; Gerber, P.J.; Schulte, Rogier P.O.; Boer, de I.J.M.

    2016-01-01

The study aims to illustrate a method to identify important input parameters that explain most of the output variance of environmental assessment models. The method is tested for the computation of life-cycle nitrogen (N) use efficiency indicators among mixed dairy production systems in Rwanda. We

  3. A new uncertainty importance measure

    International Nuclear Information System (INIS)

    Borgonovo, E.

    2007-01-01

Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can also be defined in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures, and a moment-independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures
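Borgonovo's moment-independent indicator, usually written δ, is defined as δ_i = ½ E_Xi[∫ |f_Y(y) − f_{Y|Xi}(y)| dy]. A crude histogram-based estimate can be sketched as follows; the partitioning scheme and the toy model are assumptions of this sketch, not the paper's numerical method:

```python
import numpy as np

rng = np.random.default_rng(1)

def delta_hat(x, y, n_part=20, n_bins=60):
    """Histogram estimate of Borgonovo's delta for one input:
    delta = 0.5 * E_x[ integral |f_Y(y) - f_{Y|x}(y)| dy ]."""
    edges = np.linspace(y.min(), y.max(), n_bins + 1)
    dy = edges[1] - edges[0]
    f_y, _ = np.histogram(y, bins=edges, density=True)
    total = 0.0
    # Equal-probability slices of x approximate the outer expectation
    for idx in np.array_split(np.argsort(x), n_part):
        f_cond, _ = np.histogram(y[idx], bins=edges, density=True)
        total += (len(idx) / len(x)) * np.abs(f_y - f_cond).sum() * dy
    return 0.5 * total

n = 200_000
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 3 * x1 + x2                      # toy model in which x1 dominates
d1, d2 = delta_hat(x1, y), delta_hat(x2, y)
print(round(d1, 2), round(d2, 2))    # d1 > d2, both between 0 and 1
```

Because δ compares whole distributions rather than variances, it ranks inputs sensibly even when the output distribution is skewed or heavy-tailed.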

  4. Information-theoretic approach to uncertainty importance

    International Nuclear Information System (INIS)

    Park, C.K.; Bari, R.A.

    1985-01-01

A method is presented for importance analysis in probabilistic risk assessments (PRA) for which the results of interest are characterized by full uncertainty distributions and not just point estimates. The method is based on information theory, in which entropy is a measure of the uncertainty of a probability density function. We define the relative uncertainty importance between two events as the ratio of the two exponents of the entropies. For the log-normal and log-uniform distributions the importance measure comprises the median (central tendency) and the logarithm of the error factor (uncertainty). Thus, if accident sequences are ranked this way, and the error factors are not all equal, a different rank order results than if the sequences were ranked by the central tendency measure alone. As an illustration, the relative importance of internal events and in-plant fires was computed on the basis of existing PRA results
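For a lognormal with median m and log-standard-deviation σ, the entropy is H = ln(m) + ln(σ√(2πe)), so exp(H) = m · σ · √(2πe), i.e. a product of the median and a term proportional to the logarithm of the error factor (with EF defined as the 95th percentile over the median, σ = ln(EF)/1.645). A small sketch, with hypothetical frequencies and error factors; the paper's exact definitions may differ in detail:

```python
import math

def exp_entropy_lognormal(median, error_factor, z=1.645):
    """exp(entropy) of a lognormal given its median and error factor.
    H = mu + ln(sigma * sqrt(2*pi*e)), mu = ln(median), sigma = ln(EF)/z,
    hence exp(H) = median * sigma * sqrt(2*pi*e)."""
    sigma = math.log(error_factor) / z
    return median * sigma * math.sqrt(2 * math.pi * math.e)

# Relative uncertainty importance of sequence A over sequence B as the
# ratio of exp-entropies (all numbers hypothetical)
seq_a = exp_entropy_lognormal(median=1e-4, error_factor=10)
seq_b = exp_entropy_lognormal(median=2e-4, error_factor=3)
print(seq_a / seq_b)  # > 1: A's larger error factor outweighs its smaller median
```

This reproduces the abstract's point: with unequal error factors, ranking by this measure can differ from ranking by median frequency alone.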

  5. Quantifying phenomenological importance in best-estimate plus uncertainty analyses

    International Nuclear Information System (INIS)

    Martin, Robert P.

    2009-01-01

    This paper describes a general methodology for quantifying the importance of specific phenomenological elements to analysis measures evaluated from non-parametric best-estimate plus uncertainty evaluation methodologies. The principal objective of an importance analysis is to reveal those uncertainty contributors having the greatest influence on key analysis measures. This characterization supports the credibility of the uncertainty analysis, the applicability of the analytical tools, and even the generic evaluation methodology through the validation of the engineering judgments that guided the evaluation methodology development. A demonstration of the importance analysis is provided using data from a sample problem considered in the development of AREVA's Realistic LBLOCA methodology. The results are presented against the original large-break LOCA Phenomena Identification and Ranking Table developed by the Technical Program Group responsible for authoring the Code Scaling, Applicability and Uncertainty methodology. (author)

  6. A decision-oriented measure of uncertainty importance for use in PSA

    International Nuclear Information System (INIS)

    Poern, Kurt

    1997-01-01

For the interpretation of the results of probabilistic risk assessments it is important to have measures which identify the basic events that contribute most to the frequency of the top event, but also to identify the basic events that are the main contributors to the uncertainty in this frequency. Both types of measures, often called Importance Measure and Measure of Uncertainty Importance, respectively, have been the subject of interest for many researchers in the reliability field. The most frequent mode of uncertainty analysis in connection with probabilistic risk assessment has been to propagate the uncertainty of all model parameters up to an uncertainty distribution for the top event frequency. Various uncertainty importance measures have been proposed in order to point out the parameters that in some sense are the main contributors to the top event distribution. The new measure of uncertainty importance suggested here goes a step further in that it has been developed within a decision theory framework, thereby providing an indication of the basic event about which it would be most valuable, from the decision-making point of view, to procure more information

  7. Sensitivity, uncertainty, and importance analysis of a risk assessment

    International Nuclear Information System (INIS)

    Andsten, R.S.; Vaurio, J.K.

    1992-01-01

In this paper a number of supplementary studies and applications associated with probabilistic safety assessment (PSA) are described, including sensitivity and importance evaluations of failures, errors, systems, and groups of components. The main purpose is to illustrate the usefulness of a PSA for making decisions about safety improvements, training, allowed outage times, and test intervals. A useful measure of uncertainty importance is presented, and it points out areas needing development, such as reactor vessel aging phenomena, for reducing overall uncertainty. A time-dependent core damage frequency is also presented, illustrating the impact of testing scenarios and intervals. The methods and applications presented are based on the Level 1 PSA carried out for the internal initiating events of the Loviisa 1 nuclear power station. Steam generator leakages and associated operator actions are major contributors to the current core-damage frequency estimate of 2 × 10⁻⁴/yr. The results are used to improve the plant and procedures and to guide future improvements

  8. Performance testing of supercapacitors: Important issues and uncertainties

    Science.gov (United States)

    Zhao, Jingyuan; Gao, Yinghan; Burke, Andrew F.

    2017-09-01

Supercapacitors are a promising technology for high-power energy storage and have been used in some industrial and vehicle applications. Hence, it is important that information concerning the performance of supercapacitors be detailed and reliable, so that system designers can make rational decisions regarding the selection of energy storage components. This paper is concerned with important issues and uncertainties regarding the performance testing of supercapacitors. The effect of different test procedures on the measured characteristics of both commercial and prototype supercapacitors, including hybrid supercapacitors, has been studied. It was found that the test procedure has a relatively minor effect on the capacitance of carbon/carbon devices and a more significant effect on the capacitance of hybrid supercapacitors. The device characteristic with the greatest uncertainty is the resistance, and consequently the claimed power capability of the device. The energy density should be measured by performing constant-power discharges between appropriate voltage limits. This is particularly important in the case of hybrid supercapacitors, for which the energy density is rate dependent and the simple relationship E = ½CV² does not yield accurate estimates of the energy stored. In general, most of the important issues for testing carbon/carbon devices become more serious for hybrid supercapacitors.

  9. Measures of uncertainty, importance and sensitivity of the SEDA code

    International Nuclear Information System (INIS)

    Baron, J.; Caruso, A.; Vinate, H.

    1996-01-01

The purpose of this work is the estimation of the uncertainty in the results of the SEDA code (Sistema de Evaluacion de Dosis en Accidentes) as a function of its input data and parameters. The SEDA code was developed by the Comision Nacional de Energia Atomica for the estimation of doses during emergencies in the vicinity of the Atucha and Embalse nuclear power plants. The user feeds the code with meteorological data, source terms, and accident data (timing, release height, thermal content of the release, etc.). It is designed to be used during an emergency and to deliver fast results that support decision making. The present paper quantifies the uncertainty in the results of the SEDA code. This uncertainty is associated both with the data the user inputs to the code and with the uncertain parameters of the code's own models. The method used consisted of the statistical characterization of the parameters and variables, assigning them adequate probability distributions. These distributions were sampled with the Latin Hypercube Sampling method, a stratified multi-variable Monte Carlo technique. The code was run for each of the samples and, finally, a sample of results was obtained. These results were characterized statistically (mean, most probable value, distribution shape, etc.) for several distances from the source. Finally, Partial Correlation Coefficient and Standard Regression Coefficient techniques were used to obtain the relative importance of each input variable and the sensitivity of the code to its variations. The measures of importance and sensitivity were obtained for several distances from the source and various atmospheric stability cases, making comparisons possible.
This work makes it possible to place confidence in the results of the code and to associate their uncertainty with them, as a way of knowing the limits within which the results can vary in a real
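The Latin Hypercube Sampling plus Standardized Regression Coefficients workflow described above can be sketched in a few lines. Everything here is a stand-in: the three inputs, their ranges, and the dispersion-like model are invented for illustration and are not the SEDA variables:

```python
import numpy as np

rng = np.random.default_rng(2)

def latin_hypercube(n, d, rng):
    """One point per equal-probability stratum in each dimension,
    with independently shuffled columns (stratified Monte Carlo)."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        rng.shuffle(u[:, j])
    return u

# Three hypothetical inputs: release height [m], wind speed [m/s], source scale
lo = np.array([10.0, 0.5, 0.1])
hi = np.array([100.0, 10.0, 2.0])
x = lo + latin_hypercube(1000, 3, rng) * (hi - lo)

# Crude dispersion-like stand-in model: the response grows with the source
# term and falls with wind speed and release height
y = x[:, 2] / (x[:, 1] * (1.0 + 0.01 * x[:, 0]))

# Standardized Regression Coefficients: regress standardized y on standardized x
xs = (x - x.mean(0)) / x.std(0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(xs, ys, rcond=None)
print(np.round(src, 2))  # height and wind speed negative, source term positive
```

The signs and magnitudes of the SRCs give the relative importance ranking the abstract mentions; partial correlation coefficients would be computed from the same sample.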

  10. The Importance of identifiers: IWGSC Meeting 20170720

    OpenAIRE

    Haak, Laurel

    2017-01-01

Presentation by Laurel Haak at the 20 July 2017 meeting of the IWGSC about the use of identifiers in connecting researchers, funding, facilities, and publications. Description of the approach and initial results of the User Facilities and Publications Working Group, and applications for Scientific Collections.

  11. Uncertainty importance analysis using parametric moment ratio functions.

    Science.gov (United States)

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2014-02-01

This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of the inputs are changed; the emphasis is put on the mean and variance ratio functions with respect to the variances of the model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction of the model output mean and variance by operating on the variances of the model inputs. Unbiased and progressively unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed to implement the proposed importance analysis with these estimators, so the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure to achieve a targeted 50% reduction of the model output variance. © 2013 Society for Risk Analysis.
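The variance ratio function asks: if the standard deviation of input i is scaled by a factor, what fraction of the original output variance remains? A brute-force re-sampling sketch on a toy model is below; note the paper's contribution is precisely to avoid this re-sampling with single-sample estimators, so this is only a conceptual illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

def variance_ratio(model, means, sds, i, shrink, n=200_000):
    """Var(Y with sd_i scaled by `shrink`) / Var(Y), estimated by
    brute-force re-sampling of independent normal inputs."""
    x = rng.normal(means, sds, size=(n, len(means)))
    base = np.var(model(x))
    sds2 = np.asarray(sds, dtype=float).copy()
    sds2[i] *= shrink
    x2 = rng.normal(means, sds2, size=(n, len(means)))
    return np.var(model(x2)) / base

model = lambda x: x[:, 0] ** 2 + x[:, 1]   # toy model: x0 carries 2/3 of Var(Y)
r0 = variance_ratio(model, [0.0, 0.0], [1.0, 1.0], i=0, shrink=0.5)
r1 = variance_ratio(model, [0.0, 0.0], [1.0, 1.0], i=1, shrink=0.5)
print(round(r0, 2), round(r1, 2))  # halving the dominant input's sd helps most
```

For this model the analytic ratios are 0.375 and 0.75: halving the spread of the quadratic input removes far more output variance, which is exactly the kind of targeted-reduction guidance the framework provides.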

  12. Small break LOCA RELAP5/MOD3 uncertainty quantification: Bias and uncertainty evaluation for important phenomena

    International Nuclear Information System (INIS)

    Ortiz, M.G.; Ghan, L.S.; Vogl, J.

    1991-01-01

The Nuclear Regulatory Commission (NRC) revised the Emergency Core Cooling System (ECCS) licensing rule to allow the use of Best Estimate (BE) computer codes, provided the uncertainty of the calculations is quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability and Uncertainty (CSAU) to evaluate BE code uncertainties. The CSAU methodology was demonstrated with a specific application to a pressurized water reactor (PWR) experiencing a postulated large break loss-of-coolant accident (LBLOCA). The current work is part of an effort to adapt and demonstrate the CSAU methodology for a small break (SB) LOCA in a PWR of B and W design using RELAP5/MOD3 as the simulation tool. The subject of this paper is the Assessment and Ranging of Parameters (Element 2 of the CSAU methodology), which determines the contribution to uncertainty of specific models in the code

  13. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Science.gov (United States)

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  14. Parameter importance and uncertainty in predicting runoff pesticide reduction with filter strips.

    Science.gov (United States)

    Muñoz-Carpena, Rafael; Fox, Garey A; Sabbagh, George J

    2010-01-01

Vegetative filter strips (VFS) are an environmental management tool used to reduce sediment and pesticide transport from surface runoff. Numerical models of VFS such as the Vegetative Filter Strip Modeling System (VFSMOD-W) are capable of predicting runoff, sediment, and pesticide reduction and can be useful tools to understand the effectiveness of VFS and the environmental conditions under which they may be ineffective. However, as part of the modeling process, it is critical to identify input factor importance and quantify uncertainty in predicted runoff, sediment, and pesticide reductions. This research used state-of-the-art global sensitivity and uncertainty analysis tools, a screening method (Morris) and a variance-based method (extended Fourier Amplitude Sensitivity Test), to evaluate VFSMOD-W under a range of field scenarios. The three VFS studies analyzed were conducted on silty clay loam and silt loam soils under uniform, sheet flow conditions and included atrazine, chlorpyrifos, cyanazine, metolachlor, pendimethalin, and terbuthylazine data. Saturated hydraulic conductivity was the most important input factor for predicting infiltration and runoff, explaining >75% of the total output variance for studies with smaller hydraulic loading rates (approximately 100-150 mm equivalent depths) and approximately 50% for the higher loading rate (approximately 280 mm equivalent depth). Important input factors for predicting sedimentation included hydraulic conductivity, average particle size, and the filter's Manning's roughness coefficient. Input factor importance for pesticide trapping was controlled by infiltration and, therefore, hydraulic conductivity. Global uncertainty analyses suggested a wide range of reductions for runoff (95% confidence intervals of 7-93%), sediment (84-100%), and pesticide (43-100%). Pesticide trapping probability distributions fell between runoff and sediment reduction distributions as a function of the pesticides' sorption. Seemingly
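The Morris screening step named above ranks factors by the mean absolute "elementary effect" of one-at-a-time perturbations. A simplified one-at-a-time variant (random base points rather than full Morris trajectories) on a toy response is sketched below; the toy model and the factor labels are assumptions, not VFSMOD itself:

```python
import numpy as np

rng = np.random.default_rng(3)

def mu_star(model, d, r=100, delta=0.25):
    """Simplified Morris-style screening: mean absolute elementary
    effect per factor over r random base points in [0, 1]^d."""
    ee = np.zeros((r, d))
    for t in range(r):
        x = rng.random(d)
        for i in range(d):
            x2 = x.copy()
            # Step up by delta, or down if that would leave the unit cube
            x2[i] = x[i] + delta if x[i] + delta <= 1.0 else x[i] - delta
            ee[t, i] = (model(x2) - model(x)) / (x2[i] - x[i])
    return np.abs(ee).mean(axis=0)

# Toy stand-in for a filter-strip response: factor 0 (think "hydraulic
# conductivity") dominates and is nonlinear; factor 2 is inert
def toy_model(x):
    return np.exp(2.0 * x[0]) + 0.5 * x[1]

m = mu_star(toy_model, d=3)
print(np.round(m, 2))  # m[0] >> m[1] > m[2] = 0
```

Screening like this cheaply separates influential from inert factors, after which the expensive variance-based method (eFAST) is run only on the survivors.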

  15. Identifying Importance-Performance Matrix Analysis (IPMA) of ...

    African Journals Online (AJOL)

    Identifying Importance-Performance Matrix Analysis (IPMA) of intellectual capital and Islamic work ethics in Malaysian SMES. ... capital and Islamic work ethics significantly influenced business performance. ... AJOL African Journals Online.

  16. Identifying the important factors in simulation models with many factors

    NARCIS (Netherlands)

    Bettonvil, B.; Kleijnen, J.P.C.

    1994-01-01

    Simulation models may have many parameters and input variables (together called factors), while only a few factors are really important (parsimony principle). For such models this paper presents an effective and efficient screening technique to identify and estimate those important factors. The

  17. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure utilizes a metric distance obtained from cumulative distribution functions (cdfs). The measure is evaluated for two cases: one in which the cdf is given by a known analytical distribution, and the other in which it is given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases, and the results are compared with those of three existing methods. The present approach is a useful measure of uncertainty importance based on cdfs: it is simple and calculates uncertainty importance without any complex procedure. On the basis of the results obtained in the present work, the present method is recommended as a tool for the analysis of uncertainty importance

  18. An uncertainty importance measure using a distance metric for the change in a cumulative distribution function

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Han, Seok-Jung; Tak, Nam-IL

    2000-01-01

    A simple measure of uncertainty importance using the entire change of cumulative distribution functions (CDFs) has been developed for use in probability safety assessments (PSAs). The entire change of CDFs is quantified in terms of the metric distance between two CDFs. The metric distance measure developed in this study reflects the relative impact of distributional changes of inputs on the change of an output distribution, while most of the existing uncertainty importance measures reflect the magnitude of relative contribution of input uncertainties to the output uncertainty. The present measure has been evaluated analytically for various analytical distributions to examine its characteristics. To illustrate the applicability and strength of the present measure, two examples are provided. The first example is an application of the present measure to a typical problem of a system fault tree analysis and the second one is for a hypothetical non-linear model. Comparisons of the present result with those obtained by existing uncertainty importance measures show that the metric distance measure is a useful tool to express the measure of uncertainty importance in terms of the relative impact of distributional changes of inputs on the change of an output distribution
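The idea of measuring importance by the distance between output CDFs can be sketched with empirical CDFs: perturb one input's distribution, recompute the output, and integrate the gap between the two CDFs. The area-between-CDFs metric and the toy model below are assumptions of this sketch; the paper's exact metric may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(4)

def cdf_distance(a, b, n_grid=400):
    """Area between two empirical CDFs on a common grid."""
    grid = np.linspace(min(a.min(), b.min()), max(a.max(), b.max()), n_grid)
    fa = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    fb = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.abs(fa - fb).mean() * (grid[-1] - grid[0])

n = 100_000
x1, x2 = rng.normal(1.0, 0.5, n), rng.normal(1.0, 0.5, n)
y = 2 * x1 + x2                                  # toy output; x1 weighted heavier

# Shift each input's mean by the same amount in turn and compare output CDFs
y_s1 = 2 * rng.normal(1.2, 0.5, n) + rng.normal(1.0, 0.5, n)
y_s2 = 2 * rng.normal(1.0, 0.5, n) + rng.normal(1.2, 0.5, n)
d1, d2 = cdf_distance(y, y_s1), cdf_distance(y, y_s2)
print(round(d1, 2), round(d2, 2))  # d1 > d2: the same change matters more via x1
```

Unlike a variance-contribution measure, this quantifies the impact of a distributional change of an input on the entire output distribution, which is the distinction the abstract draws.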

  19. Improving the precision of lake ecosystem metabolism estimates by identifying predictors of model uncertainty

    Science.gov (United States)

    Rose, Kevin C.; Winslow, Luke A.; Read, Jordan S.; Read, Emily K.; Solomon, Christopher T.; Adrian, Rita; Hanson, Paul C.

    2014-01-01

    Diel changes in dissolved oxygen are often used to estimate gross primary production (GPP) and ecosystem respiration (ER) in aquatic ecosystems. Despite the widespread use of this approach to understand ecosystem metabolism, we are only beginning to understand the degree and underlying causes of uncertainty for metabolism model parameter estimates. Here, we present a novel approach to improve the precision and accuracy of ecosystem metabolism estimates by identifying physical metrics that indicate when metabolism estimates are highly uncertain. Using datasets from seventeen instrumented GLEON (Global Lake Ecological Observatory Network) lakes, we discovered that many physical characteristics correlated with uncertainty, including PAR (photosynthetically active radiation, 400-700 nm), daily variance in Schmidt stability, and wind speed. Low PAR was a consistent predictor of high variance in GPP model parameters, but also corresponded with low ER model parameter variance. We identified a threshold (30% of clear sky PAR) below which GPP parameter variance increased rapidly and was significantly greater in nearly all lakes compared with variance on days with PAR levels above this threshold. The relationship between daily variance in Schmidt stability and GPP model parameter variance depended on trophic status, whereas daily variance in Schmidt stability was consistently positively related to ER model parameter variance. Wind speeds in the range of ~0.8-3 m s–1 were consistent predictors of high variance for both GPP and ER model parameters, with greater uncertainty in eutrophic lakes. Our findings can be used to reduce ecosystem metabolism model parameter uncertainty and identify potential sources of that uncertainty.

  20. Application of a new importance measure for parametric uncertainty in PSA

    International Nuclear Information System (INIS)

    Poern, K.

    1997-04-01

The traditional approach to uncertainty analysis in PSA, with propagation of basic event uncertainties through the PSA model, generates as an end product the uncertainty distribution of the top event frequency. This distribution, however, is not of much value for the decision maker. Most decisions are made under uncertainty. What the decision maker needs, to enhance the quality of decision making, is an adequate uncertainty importance measure that indicates the basic parameters about which it would be most valuable, as to the quality of the decision making in the specific situation, to procure more information. This paper describes an application of a new measure of uncertainty importance that has been developed in the ongoing joint Nordic project NKS/RAK-1:3. The measure is called "decision oriented" because it is defined within a decision-theoretic framework. It is defined as the expected value of a certain additional information about each basic parameter, and it utilizes both the system structure and the complete uncertainty distributions of the basic parameters. The measure provides the analyst and the decision maker with diagnostic information pointing to the parameters on which more information would be most valuable to procure in order to enhance the quality of decision making. This uncertainty importance measure must not be confused with the more well-known, traditional importance measures of various kinds that are used to depict the contributions of each basic event or parameter (represented by point values) to the top event frequency. In this study the new measure is demonstrated through a real application on the top event: water overflow through steam generator safety valves caused by steam generator tube rupture. This application object is one of the event sequences that the aforementioned Nordic project has analysed with an integrated approach. The project has been funded by the Swedish Nuclear Power
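A standard way to express "the expected value of additional information about a parameter" is the expected value of perfect parameter information: compare the best decision made on average with the best decision made per realization of the parameter. The two-action example below is entirely hypothetical and only illustrates that general idea; the NKS/RAK-1:3 measure differs in its details:

```python
import numpy as np

rng = np.random.default_rng(6)

# Two-action decision sketch: accept the current risk (expected loss driven
# by an uncertain event frequency) or pay a fixed cost that removes it.
n = 200_000
freq = rng.lognormal(mean=np.log(1e-3), sigma=1.0, size=n)  # events per year
loss, fix_cost = 1.0e4, 12.0
u_accept = -freq * loss              # utility of accepting, per parameter draw
u_fix = np.full(n, -fix_cost)        # utility of the fix (certain)

# Without further information: choose the action that is best on average
baseline = max(u_accept.mean(), u_fix.mean())
# With perfect information on freq: choose the best action per realization
evppi = np.maximum(u_accept, u_fix).mean() - baseline
print(round(evppi, 1))  # > 0: learning freq exactly has decision value
```

Computed per basic parameter, such a quantity ranks parameters by how much better the decision could be made if their uncertainty were resolved, which is the diagnostic role the abstract describes.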

  1. An audit of the global carbon budget: identifying and reducing sources of uncertainty

    Science.gov (United States)

    Ballantyne, A. P.; Tans, P. P.; Marland, G.; Stocker, B. D.

    2012-12-01

Uncertainties in our carbon accounting practices may limit our ability to objectively verify emission reductions on regional scales. Furthermore, uncertainties in the global C budget must be reduced to benchmark Earth System Models that incorporate carbon-climate interactions. Here we present an audit of the global C budget in which we try to identify sources of uncertainty for major terms in the global C budget. The atmospheric growth rate of CO2 has increased significantly over the last 50 years, while the uncertainty in calculating the global atmospheric growth rate has been reduced from 0.4 ppm/yr to 0.2 ppm/yr (95% confidence). Although we have greatly reduced global CO2 growth rate uncertainties, there remain regions, such as the Southern Hemisphere, Tropics and Arctic, where changes in regional sources/sinks will remain difficult to detect without additional observations. Increases in fossil fuel (FF) emissions are the primary factor driving the increase in the global CO2 growth rate; however, our confidence in FF emission estimates has actually gone down. Based on a comparison of multiple estimates, FF emissions have increased from 2.45 ± 0.12 PgC/yr in 1959 to 9.40 ± 0.66 PgC/yr in 2010. Major sources of increasing FF emission uncertainty are increased emissions from emerging economies, such as China and India, as well as subtle differences in accounting practices. Lastly, we evaluate emission estimates from Land Use Change (LUC). Although relative errors in emission estimates from LUC are quite high (2 sigma ~ 50%), LUC emissions have remained fairly constant in recent decades. We evaluate the three commonly used approaches to estimating LUC emissions (Bookkeeping, Satellite Imagery, and Model Simulations) to identify their main sources of error and their ability to detect net emissions from LUC.

  2. A new computational method of a moment-independent uncertainty importance measure

    International Nuclear Information System (INIS)

    Liu Qiao; Homma, Toshimitsu

    2009-01-01

For a risk assessment model, the uncertainty in input parameters is propagated through the model and leads to uncertainty in the model output. The study of how the uncertainty in the output of a model can be apportioned to the uncertainty in the model inputs is the job of sensitivity analysis. Saltelli [Sensitivity analysis for importance assessment. Risk Analysis 2002;22(3):579-90] pointed out that a good sensitivity indicator should be global, quantitative and model free. Borgonovo [A new uncertainty importance measure. Reliability Engineering and System Safety 2007;92(6):771-84] further extended these three requirements by adding a fourth feature, moment independence, and proposed a new sensitivity measure, δi. It evaluates the influence of the input uncertainty on the entire output distribution without reference to any specific moment of the model output. In this paper, a new computational method for δi is proposed. It is conceptually simple and easier to implement. The feasibility of this new method is demonstrated by applying it to two examples.

  3. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty-A pulp mill example

    International Nuclear Information System (INIS)

    Svensson, Elin; Berntsson, Thore; Stroemberg, Ann-Brith

    2009-01-01

This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO2 emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, (doi:10.1016/j.enpol.2008.10.023)] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO2 emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives.

  4. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty-A pulp mill example

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden)], E-mail: elin.svensson@chalmers.se; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden)

    2009-03-15

This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO2 emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, (doi:10.1016/j.enpol.2008.10.023)] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO2 emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives.

  5. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty. A pulp mill example

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden)

    2009-03-15

    This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO{sub 2} emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, doi:10.1016/j.enpol.2008.10.023] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO{sub 2} emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives. (author)

  6. Identifying Selected Behavioral Determinants of Risk and Uncertainty on the Real Estate Market

    Directory of Open Access Journals (Sweden)

    Brzezicka Justyna

    2014-07-01

Various market behaviors can be characterized as risky or uncertain; their observation is therefore important to the real estate market system. The extensive use of behavioral factors facilitates their implementation and study in relation to the real estate market system. The behavioral approach has established its own instrumentation, which enables elements of risk and uncertainty to be quantified.

  7. Importance analysis within a Dempster-Shafer theory of evidence framework of uncertainty representation - 15203

    International Nuclear Information System (INIS)

    Lo, C.K.; Zio, E.

    2015-01-01

In nuclear power plant (NPP) probabilistic risk assessment (PRA) practice, the contribution of single Basic Events (BE) to the Core Damage Frequency (CDF) is ranked by computing importance measures, such as the Fussell-Vesely (F-V), Risk Achievement Worth (RAW) and Risk Reduction Worth (RRW) indices. Traditionally, these importance indices are calculated as point (e.g., mean) values without accounting for the epistemic uncertainty affecting the parameters (e.g., BE probabilities, failure rates...) of PRA models. On the other hand, such epistemic uncertainty has a significant impact on the evaluation of the importance indices (which are thus described not by a single point value but by a distribution of possible values): this obviously affects the BE ranking and the corresponding safety-related decisions. In this paper, the epistemic uncertainty in the BE probabilities of NPP PRA models is represented by belief/plausibility functions within a Dempster-Shafer Theory of Evidence (DSTE) framework; as a consequence, the corresponding importance indices are also described by Dempster-Shafer structures. Owing to the overlap and the dependences of the focal intervals of the component importance measures, it is difficult to rank them. We propose a method called RAWC to rank BE importance while accounting for the uncertainty. However, RAWC can only give us an overall picture about ranking
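The point-value indices named in this abstract can be illustrated with a small sketch. The fault tree and its probabilities below are invented for illustration; the F-V, RAW and RRW formulas are the standard textbook definitions, not this paper's uncertainty-aware variants:

```python
def importance_measures(p, risk_fn):
    """Point-value Fussell-Vesely (FV), RAW and RRW for each basic event.
    risk_fn maps a vector of basic-event probabilities to the top-event risk."""
    base = risk_fn(p)
    measures = {}
    for i in range(len(p)):
        hi = list(p); hi[i] = 1.0   # basic event i certain to occur
        lo = list(p); lo[i] = 0.0   # basic event i never occurs
        r_hi, r_lo = risk_fn(hi), risk_fn(lo)
        measures[i] = {
            "FV":  (base - r_lo) / base,  # fraction of risk involving event i
            "RAW": r_hi / base,           # risk achievement worth
            "RRW": base / r_lo,           # risk reduction worth
        }
    return measures

# Toy fault tree: top event = (A and B) or C, with independent basic events
def toy_risk(p):
    a, b, c = p
    return a * b + c - a * b * c

im = importance_measures([0.01, 0.02, 0.001], toy_risk)
```

Here event C (a single-event cut set) dominates the F-V and RRW rankings even though its probability is the smallest, which is exactly the kind of ranking information the epistemic-uncertainty treatment of the paper then spreads into distributions.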

  8. The importance of input interactions in the uncertainty and sensitivity analysis of nuclear fuel behavior

    Energy Technology Data Exchange (ETDEWEB)

    Ikonen, T., E-mail: timo.ikonen@vtt.fi; Tulkki, V.

    2014-08-15

Highlights: • Uncertainty and sensitivity analysis of modeled nuclear fuel behavior is performed. • Burnup dependency of the uncertainties and sensitivities is characterized. • Input interactions significantly increase output uncertainties for irradiated fuel. • Identification of uncertainty sources is greatly improved with higher order methods. • Results stress the importance of using methods that take interactions into account. - Abstract: The propagation of uncertainties in a PWR fuel rod under steady-state irradiation is analyzed by computational means. A hypothetical steady-state scenario of the Three Mile Island 1 reactor fuel rod is modeled with the fuel performance code FRAPCON, using realistic input uncertainties for the fabrication and model parameters, boundary conditions and material properties. The uncertainty and sensitivity analysis is performed by extensive Monte Carlo sampling of the inputs’ probability distributions and by applying correlation coefficient and Sobol’ variance decomposition analyses. The latter includes evaluation of the second order and total effect sensitivity indices, allowing the study of interactions between input variables. The results show that the interactions play a large role in the propagation of uncertainties, and that first order methods such as correlation coefficient analyses are in general insufficient for sensitivity analysis of the fuel rod. Significant improvement over the first order methods can be achieved by using higher order methods. The results also show that both the magnitude of the uncertainties and their propagation depend not only on the output in question, but also on burnup. The latter is due to the onset of new phenomena (such as fission gas release) and the gradual closure of the pellet-cladding gap with increasing burnup. Increasing burnup also affects the importance of input interactions. Interaction effects are typically highest in the moderate burnup (of the order of 10–40 MWd
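The abstract's central point, that first-order methods miss interaction effects, can be illustrated with a minimal Sobol' analysis. The toy model, sample sizes and Jansen-type estimators below are illustrative choices, not the FRAPCON setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Toy response with a strong x1*x2 interaction term
    return x[:, 0] + x[:, 1] + 5.0 * x[:, 0] * x[:, 1]

n, d = 100_000, 2
A = rng.uniform(-1.0, 1.0, size=(n, d))
B = rng.uniform(-1.0, 1.0, size=(n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

S1, ST = [], []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]          # re-sample only input i
    yABi = model(ABi)
    # Jansen estimators for the first-order and total-effect indices
    S1.append((var_y - 0.5 * np.mean((yB - yABi) ** 2)) / var_y)
    ST.append(0.5 * np.mean((yA - yABi) ** 2) / var_y)
```

For this model the analytic first-order indices are each 3/31 while the total-effect indices are each 28/31, so a first-order (e.g. correlation-based) analysis would attribute under a fifth of the output variance, missing the interaction entirely.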

  9. Identifying the effects of parameter uncertainty on the reliability of riverbank stability modelling

    Science.gov (United States)

    Samadi, A.; Amiri-Tokaldany, E.; Darby, S. E.

    2009-05-01

Bank retreat is a key process in fluvial dynamics, affecting a wide range of physical, ecological and socioeconomic issues in the fluvial environment. To predict the undesirable effects of bank retreat, and to inform effective measures to prevent it, a wide range of bank stability models has been presented in the literature. These models typically express bank stability by defining a factor of safety as the ratio of the driving and resisting forces acting on the incipient failure block. These forces are affected by a range of controlling factors that include the bank profile (bank height and angle), the geotechnical properties of the bank materials, and the hydrological status of the riverbanks. In this paper we evaluate the extent to which uncertainties in the parameterization of these controlling factors feed through to influence the reliability of the resulting bank stability estimate. This is achieved by employing a simple model of riverbank stability with respect to planar failure (the most common type of bank stability model) in a series of sensitivity tests and Monte Carlo analyses to identify, for each model parameter, the range of values that induces significant changes in the simulated factor of safety. These identified parameter value ranges are compared to empirically derived parameter uncertainties to determine whether they are likely to confound the reliability of the resulting bank stability calculations. Our results show that parameter uncertainties are high enough that the likelihood of generating unreliable predictions is typically very high (> ~80% for predictions requiring a precision of < ±15%). Because parameter uncertainties derive primarily from the natural variability of the parameters, rather than from measurement errors, much more careful attention should be paid to field sampling strategies, such that the parameter uncertainties and consequent prediction unreliabilities can be quantified more
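A minimal sketch of the kind of Monte Carlo analysis described: a factor of safety as the ratio of resisting to driving forces, with geotechnical parameters drawn from distributions. The planar-failure geometry, parameter distributions and all numerical values below are invented for illustration and are not the authors' model:

```python
import math
import random

random.seed(42)

def factor_of_safety(c, phi_deg, gamma, H=3.0, beta_deg=45.0):
    """Planar-failure factor of safety: resisting over driving forces.
    The geometry and formula are a simplified illustration only."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    L = H / math.sin(beta)                      # failure-plane length
    W = 0.5 * gamma * H ** 2 / math.tan(beta)   # weight of failure block
    resisting = c * L + W * math.cos(beta) * math.tan(phi)
    driving = W * math.sin(beta)
    return resisting / driving

# Monte Carlo over uncertain geotechnical parameters (illustrative ranges)
fs = [
    factor_of_safety(
        c=random.gauss(8.0, 2.0),         # cohesion, kPa
        phi_deg=random.gauss(30.0, 4.0),  # friction angle, degrees
        gamma=random.gauss(18.0, 1.0),    # unit weight, kN/m^3
    )
    for _ in range(20_000)
]
p_failure = sum(f < 1.0 for f in fs) / len(fs)
```

Even when the mean factor of safety sits above 1, realistic parameter spreads produce a non-trivial fraction of samples below 1, which is the unreliability the paper quantifies.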

  10. Uncertainties

    Indian Academy of Sciences (India)

To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the relevant substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle, and its feedback into climate, to be ...

  11. Uncertainty

    International Nuclear Information System (INIS)

    Silva, T.A. da

    1988-01-01

The comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and by the International Committee for Weights and Measures (CIPM) is shown for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt

  12. Value change in oil and gas production: V. Incorporation of uncertainties and determination of relative importance

    International Nuclear Information System (INIS)

    Lerche, I.; Noeth, S.

    2002-01-01

The influence of two fundamentally different types of uncertainty on the value of oil field production is investigated here. First considered is the uncertainty caused by the fact that the expected value estimate is not one of the possible outcomes. To correctly allow for the risk attendant upon using the expected value as a measure of worth, even with statistically sharp parameters, one needs to incorporate the uncertainty of the expected value. Using a simple example we show how such incorporation allows for a clear determination of the relative risk of projects that may have the same expected value but very different risks. We also show how each project can be risked on its own using the expected value and variance. This uncertainty type is due to the possible pathways to different outcomes even when the parameters characterizing the system are taken to be known. Second considered is the risk due to the fact that parameters in oil field estimates are just estimates and, as such, have their own intrinsic errors that influence the possible outcomes and make them less certain. This sort of risk depends upon the uncertainty of each parameter, and also on the type of distribution the parameters are taken to be drawn from. In addition, not all uncertainties in parameter values are of equal importance in influencing an outcome probability. We show how one can determine the relative importance of the parameters and so determine where to place effort to resolve the dominant contributions to risk, if it is possible to do so. Considerations of whether to acquire new information, and also whether to undertake further studies in such an uncertain environment, are used as vehicles to address these concerns of risk due to uncertainty. In general, an oil field development project has to contend with all the above types of risk and uncertainty. It is therefore of importance to have quantitative measures of risk so that one can compare and contrast the various effects, and so that

  13. Identifying important nodes by adaptive LeaderRank

    Science.gov (United States)

    Xu, Shuang; Wang, Pei

    2017-03-01

Spreading processes are a common phenomenon in complex networks, and identifying important nodes in such networks is of great significance in real-world applications. Based on the spreading process on networks, many measures have been proposed to evaluate the importance of nodes. However, most of the existing measures are appropriate for static networks and are fragile to topological perturbations. Many real-world complex networks are dynamic rather than static, meaning that their nodes and edges may change with time, which challenges numerous existing centrality measures. Based on a new weighted mechanism and the recently proposed H-index and LeaderRank (LR), this paper introduces a variant of the LR measure, called adaptive LeaderRank (ALR), a new member of the LR family. Simulations on six real-world networks reveal that the new measure balances well between prediction accuracy and robustness. More interestingly, the new measure adapts better to adjustments or local perturbations of network topologies than the existing measures. By discussing the detailed properties of the measures from the LR family, we illustrate the competitive advantages of ALR over the other measures. The proposed algorithm enriches the measures available for understanding complex networks, and may have potential applications in social networks and biological systems.
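A sketch of the basic (non-adaptive) LeaderRank measure that ALR builds upon: a ground node is linked bidirectionally with every node, a random walk is run to its stationary state, and the ground node's score is folded back into the real nodes. The graph and implementation details below are illustrative; the paper's weighted mechanism and H-index ingredients are not reproduced:

```python
import numpy as np

def leaderrank(adj, tol=1e-12, max_iter=10_000):
    """Plain LeaderRank on a graph given as an adjacency matrix
    (adj[i, j] = 1 for an edge i -> j)."""
    n = adj.shape[0]
    g = np.zeros((n + 1, n + 1))
    g[:n, :n] = adj
    g[n, :n] = 1.0        # ground node -> every node
    g[:n, n] = 1.0        # every node -> ground node
    P = g / g.sum(axis=1, keepdims=True)   # row-stochastic transitions
    s = np.ones(n + 1)                     # one unit of score per node
    for _ in range(max_iter):
        s_new = P.T @ s
        if np.abs(s_new - s).max() < tol:
            s = s_new
            break
        s = s_new
    return s[:n] + s[n] / n  # redistribute the ground node's score evenly

# Star graph: the hub (node 0) should be identified as most important
star = np.zeros((5, 5))
star[0, 1:] = 1.0
star[1:, 0] = 1.0
scores = leaderrank(star)
```

On an undirected graph the stationary scores are proportional to degree in the extended graph, so the hub of the star correctly comes out on top.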

  14. Identifying important motivational factors for professionals in Greek hospitals

    Science.gov (United States)

    Kontodimopoulos, Nick; Paleologou, Victoria; Niakas, Dimitris

    2009-01-01

Background The purpose of this study was to identify important motivational factors according to the views of health-care professionals in Greek hospitals and, particularly, to determine whether these might differ between the public and private sectors. Methods A previously developed (and validated) instrument addressing four work-related motivators (job attributes, remuneration, co-workers and achievements) was used. Three categories of health care professionals, doctors (N = 354), nurses (N = 581) and office workers (N = 418), working in public and private hospitals, participated, and motivation was compared across socio-demographic and occupational variables. Results The range of reported motivational factors was mixed, and Maslow's conclusion that lower level motivational factors must be met before ascending to the next level was not confirmed. The highest ranked motivator for the entire sample, and by professional subgroup, was achievements (P motivators were similar, and only one significant difference was observed, namely between doctors and nurses in respect to co-workers (P motivated by all factors significantly more than their public-hospital counterparts. Conclusion The results are in agreement with the literature, which focuses attention on management approaches employing both monetary and non-monetary incentives to motivate health care workers. This study showed that intrinsic factors are particularly important and should become a target for effective employee motivation. PMID:19754968

  15. An uncertainty inclusive un-mixing model to identify tracer non-conservativeness

    Science.gov (United States)

    Sherriff, Sophie; Rowan, John; Franks, Stewart; Fenton, Owen; Jordan, Phil; hUallacháin, Daire Ó.

    2015-04-01

sensitive screening technique than assessing target values against source data. Non-conservative behaviour was identified in field data, although only at a significant degree of corruption. Whilst further testing is required to determine the impact of individual and combined uncertainty components on synthetic, controlled experiments and field data, this study provides a framework for the future assessment of uncertainty in un-mixing models.

  16. Ocean Heat and Carbon Uptake in Transient Climate Change: Identifying Model Uncertainty

    Science.gov (United States)

    Romanou, Anastasia; Marshall, John

    2015-01-01

Global warming on decadal and centennial timescales is mediated and ameliorated by the ocean sequestering heat and carbon into its interior. Transient climate change is a function of the efficiency by which anthropogenic heat and carbon are transported away from the surface into the ocean interior (Hansen et al. 1985). Gregory and Mitchell (1997) and Raper et al. (2002) were the first to identify the importance of the ocean heat uptake efficiency in transient climate change. Observational estimates (Schwartz 2012) and inferences from coupled atmosphere-ocean general circulation models (AOGCMs; Gregory and Forster 2008; Marotzke et al. 2015) suggest that ocean heat uptake efficiency on decadal timescales lies in the range 0.5-1.5 W/sq m/K and is thus comparable to the climate feedback parameter (Murphy et al. 2009). Moreover, the ocean not only plays a key role in setting the timing of warming but also its regional patterns (Marshall et al. 2014), which is crucial to our understanding of regional climate, carbon and heat uptake, and sea-level change. This short communication is based on a presentation given by A. Romanou at a recent workshop, Oceans Carbon and Heat Uptake: Uncertainties and Metrics, co-hosted by US CLIVAR and OCB. As briefly reviewed below, we have incomplete but growing knowledge of how ocean models used in climate change projections sequester heat and carbon into the interior. To understand and thence reduce errors and biases in the ocean component of coupled models, as well as elucidate the key mechanisms at work, in the final section we outline a proposed model intercomparison project named FAFMIP. In FAFMIP, coupled integrations would be carried out with prescribed overrides of wind stress and freshwater and heat fluxes acting at the sea surface.

  17. Analysis of convergence of uncertainty and important factors affecting uncertainty in level 1 PSA for pressurized water reactors

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2002-01-01

We analyzed how the convergence of the mean core damage frequency (CDF) depends on the number of minimal cut sets, the sampling method and the random seed, using level 1 PSA models for Surry 1 and a Japanese 4-loop PWR plant. As a result, the following were clarified: the good convergence efficiency of Latin hypercube sampling (LHS); the relationship of the number of minimal cut sets to the mean CDF and its standard deviation; and an easy method of judging the convergence of the mean CDF. In addition, the relationship between the number of probability variables (i.e. the number of basic events) and the number of samples needed for the mean CDF to converge was established. An analysis of the important factors affecting uncertainty was also performed; it found that the initiating events (especially loss-of-coolant accidents) were the dominant important factors. Finally, the 95% confidence interval calculated from the operating experience of nuclear power plants worldwide was compared with (1) the mean core damage frequency by PSA for 108 US plants and 51 Japanese plants and (2) the 95% confidence interval of the US and Japanese plant PSA models used in this research. As a result, it was clarified that the mean core damage frequency of almost all pressurized and boiling light water reactors in the US was within the 90% confidence interval calculated from the operating experience of nuclear power plants (PWRs and BWRs) worldwide, but that of reactors in Japan was smaller than that level. (author)
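The convergence advantage of Latin hypercube sampling noted in this abstract can be illustrated with a toy experiment; the model and sample sizes below are invented and are not the study's level 1 PSA models:

```python
import numpy as np

rng = np.random.default_rng(1)

def lhs(n, d):
    """Basic Latin hypercube sample on [0, 1)^d: one point per stratum in
    each dimension, with strata independently permuted per dimension."""
    strata = (np.arange(n)[:, None] + rng.random((n, d))) / n
    order = np.argsort(rng.random((n, d)), axis=0)
    return np.take_along_axis(strata, order, axis=0)

def toy_model(u):
    # Stand-in for a cut-set evaluation: product of sampled quantiles
    return np.prod(u, axis=1)

n, d, reps = 200, 3, 300
means_lhs = [toy_model(lhs(n, d)).mean() for _ in range(reps)]
means_srs = [toy_model(rng.random((n, d))).mean() for _ in range(reps)]

# The mean estimate scatters less under LHS than under simple random sampling
spread_lhs = float(np.std(means_lhs))
spread_srs = float(np.std(means_srs))
```

Because LHS stratifies each input marginal, it removes the main-effect component of the sampling variance, so the mean estimate converges with fewer samples than plain Monte Carlo.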

  18. Analysis of convergence of uncertainty and important factors affecting uncertainty in level 1 PSA for pressurized water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Shimada, Yoshio [Inst. of Nuclear Safety System Inc., Mihama, Fukui (Japan)

    2002-09-01

We analyzed how the convergence of the mean core damage frequency (CDF) depends on the number of minimal cut sets, the sampling method and the random seed, using level 1 PSA models for Surry 1 and a Japanese 4-loop PWR plant. As a result, the following were clarified: the good convergence efficiency of Latin hypercube sampling (LHS); the relationship of the number of minimal cut sets to the mean CDF and its standard deviation; and an easy method of judging the convergence of the mean CDF. In addition, the relationship between the number of probability variables (i.e. the number of basic events) and the number of samples needed for the mean CDF to converge was established. An analysis of the important factors affecting uncertainty was also performed; it found that the initiating events (especially loss-of-coolant accidents) were the dominant important factors. Finally, the 95% confidence interval calculated from the operating experience of nuclear power plants worldwide was compared with (1) the mean core damage frequency by PSA for 108 US plants and 51 Japanese plants and (2) the 95% confidence interval of the US and Japanese plant PSA models used in this research. As a result, it was clarified that the mean core damage frequency of almost all pressurized and boiling light water reactors in the US was within the 90% confidence interval calculated from the operating experience of nuclear power plants (PWRs and BWRs) worldwide, but that of reactors in Japan was smaller than that level. (author)

  19. Importance of atmospheric turbidity and associated uncertainties in solar radiation and luminous efficacy modelling

    International Nuclear Information System (INIS)

    Gueymard, Christian A.

    2005-01-01

    For many solar-related applications, it is important to separately predict the direct and diffuse components of irradiance or illuminance. Under clear skies, turbidity plays a determinant role in quantitatively affecting these components. In this paper, various aspects of the effect of turbidity on both spectral and broadband radiation are addressed, as well as the uncertainty in irradiance predictions due to inaccurate turbidity data, and the current improvements in obtaining the necessary turbidity data

  20. Identifying important parameters for a continuous bioscouring process

    NARCIS (Netherlands)

    Lenting, H.B.M.; Lenting, H.B.M.; Zwier, E.; Nierstrasz, Vincent

    2002-01-01

Compared to a bioscouring process in batch mode, a continuously operating process requires relatively short processing steps. This study focuses on minimizing the required enzymatic incubation time. It is clear that the presence of a sufficient level of surfactant is of major importance in

  1. A landscape ecology approach identifies important drivers of urban biodiversity.

    Science.gov (United States)

    Turrini, Tabea; Knop, Eva

    2015-04-01

Cities are growing rapidly worldwide, yet a mechanistic understanding of the impact of urbanization on biodiversity is lacking. We assessed the impact of urbanization on arthropod diversity (species richness and evenness) and abundance in a study of six cities and nearby intensively managed agricultural areas. Within the urban ecosystem, we disentangled the relative importance of two key landscape factors affecting biodiversity, namely the amount of vegetated area and patch isolation. To do so, we a priori selected sites that varied independently in the amount of vegetated area in the surrounding landscape at the 500-m scale and in patch isolation at the 100-m scale, and we held local patch characteristics constant. As indicator groups, we used bugs, beetles, leafhoppers, and spiders. Compared to intensively managed agricultural ecosystems, urban ecosystems supported a higher abundance of most indicator groups, a higher number of bug species, and a lower evenness of bug and beetle species. Within cities, a high amount of vegetated area increased species richness and abundance of most arthropod groups, whereas evenness showed no clear pattern. Patch isolation played only a limited role in urban ecosystems, which contrasts with findings from agro-ecological studies. Our results show that urban areas can harbor an arthropod diversity and abundance similar to intensively managed agricultural ecosystems. Further, negative consequences of urbanization for arthropod diversity can be mitigated by providing sufficient vegetated space in the urban area, while patch connectivity is less important in an urban context. This highlights the need to apply a landscape ecological approach to understand the mechanisms shaping urban biodiversity and underlines the potential of appropriate urban planning for mitigating biodiversity loss. © 2015 John Wiley & Sons Ltd.

  2. Importance measures in nuclear PSA: how to control their uncertainty and develop new applications

    International Nuclear Information System (INIS)

    Duflot, N.

    2007-01-01

This PhD thesis deals with importance measures based on nuclear probabilistic safety analyses (PSA). With these indicators, the importance towards risk of the events considered in the PSA models can be measured. The first part of this thesis sets out the framework in which they are currently used: the information extracted from the evaluation of importance measures feeds industrial decision-making processes that may impact the safety of nuclear plants. In the second part of the thesis, we thus try to meet the requirements of reliability and simplicity with an approach minimising the uncertainties due to modelling. We also lay out a new truncation process for the set of minimal cut sets (MCS) corresponding to the baseline case, which allows a quick, automatic and precise calculation of the importance measures. As PSA are increasingly used in risk-informed decision-making approaches, we have examined the extension of importance measures to groups of basic events. The third part of the thesis therefore presents the definition of the importance of events such as the failure of a system or the loss of a function, as well as their potential applications. PSA being considered a useful tool for designing new nuclear power plants, the fourth part of the thesis sketches out a design process based both on classical importance measures and on new ones. (author)

  3. Identifying and Analyzing Uncertainty Structures in the TRMM Microwave Imager Precipitation Product over Tropical Ocean Basins

    Science.gov (United States)

    Liu, Jianbo; Kummerow, Christian D.; Elsaesser, Gregory S.

    2016-01-01

    Despite continuous improvements in microwave sensors and retrieval algorithms, our understanding of precipitation uncertainty is quite limited, due primarily to inconsistent findings in studies that compare satellite estimates to in situ observations over different parts of the world. This study seeks to characterize the temporal and spatial properties of uncertainty in the Tropical Rainfall Measuring Mission Microwave Imager surface rainfall product over tropical ocean basins. Two uncertainty analysis frameworks are introduced to qualitatively evaluate the properties of uncertainty under a hierarchy of spatiotemporal data resolutions. The first framework (i.e. 'climate method') demonstrates that, apart from random errors and regionally dependent biases, a large component of the overall precipitation uncertainty is manifested in cyclical patterns that are closely related to large-scale atmospheric modes of variability. By estimating the magnitudes of major uncertainty sources independently, the climate method is able to explain 45-88% of the monthly uncertainty variability. The percentage is largely resolution dependent (with the lowest percentage explained associated with a 1 deg x 1 deg spatial/1 month temporal resolution, and highest associated with a 3 deg x 3 deg spatial/3 month temporal resolution). The second framework (i.e. 'weather method') explains regional mean precipitation uncertainty as a summation of uncertainties associated with individual precipitation systems. By further assuming that self-similar recurring precipitation systems yield qualitatively comparable precipitation uncertainties, the weather method can consistently resolve about 50 % of the daily uncertainty variability, with only limited dependence on the regions of interest.

  4. The Ethics of Ambiguity: Rethinking the Role and Importance of Uncertainty in Medical Education and Practice.

    Science.gov (United States)

    Domen, Ronald E

    2016-01-01

    Understanding and embracing uncertainty are critical for effective teacher-learner relationships as well as for shared decision-making in the physician-patient relationship. However, ambiguity has not been given serious consideration in either the undergraduate or graduate medical curricula or in the role it plays in patient-centered care. In this article, the author examines the ethics of ambiguity and argues for a pedagogy that includes education in the importance of, and tolerance of, ambiguity that is inherent in medical education and practice. Common threads running through the ethics of ambiguity are the virtue of respect, and the development of a culture of respect is required for the successful understanding and implementation of a pedagogy of ambiguity.

  5. Identifying uncertainty of the mean of some water quality variables along water quality monitoring network of Bahr El Baqar drain

    Directory of Open Access Journals (Sweden)

    Hussein G. Karaman

    2013-10-01

Assigning objectives to an environmental monitoring network is the pillar of the design of such networks. Conflicting network objectives may affect the adequacy of the design in terms of sampling frequency and the spatial distribution of the monitoring stations, which in turn affect the accuracy of the data and the information extracted. The first step in resolving this problem is to identify the uncertainty inherent in the network as a result of the vagueness of the design objectives. Entropy has been utilized and adapted over the past decades to identify uncertainty in similar water data sets. It is therefore used here to identify the uncertainties inherent in the water quality monitoring network of the Bahr El-Baqar drain, located in the Eastern Delta. To investigate the applicability of the entropy methodology, a comprehensive analysis of the selected drain and its data sets is carried out. Furthermore, the uncertainty calculated by the entropy function is presented by means of a geographical information system to give the decision maker a global view of these uncertainties, and to open the door for other researchers to find innovative approaches that lower these uncertainties, moving toward an optimal monitoring network in terms of the spatial distribution of the monitoring stations.
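A minimal sketch of the entropy calculation underlying this kind of analysis: the discretized Shannon entropy of a monitored variable, where a more variable station carries more uncertainty. The two hypothetical stations, the variable and the bin count below are invented for illustration:

```python
import numpy as np

def shannon_entropy(samples, bins=10):
    """Discretized Shannon entropy (in bits) of one monitored variable;
    the bin count is an illustrative choice."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                       # 0 * log(0) contributes nothing
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(7)
# Hypothetical pH records from two stations: one stable, one highly variable
ph_stable = rng.normal(loc=7.2, scale=0.05, size=500)
ph_variable = rng.uniform(low=6.0, high=9.0, size=500)

h_stable = shannon_entropy(ph_stable)
h_variable = shannon_entropy(ph_variable)
```

The uniform (highly variable) station approaches the maximum entropy of log2(bins) bits, while the tightly clustered station scores lower; mapping such station-wise entropies in a GIS is the kind of presentation the abstract describes.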

  6. An optimization methodology for identifying robust process integration investments under uncertainty

    International Nuclear Information System (INIS)

    Svensson, Elin; Berntsson, Thore; Stroemberg, Ann-Brith; Patriksson, Michael

    2009-01-01

    Uncertainties in future energy prices and policies strongly affect decisions on investments in process integration measures in industry. In this paper, we present a five-step methodology for the identification of robust investment alternatives incorporating explicitly such uncertainties in the optimization model. Methods for optimization under uncertainty (or, stochastic programming) are thus combined with a deep understanding of process integration and process technology in order to achieve a framework for decision-making concerning the investment planning of process integration measures under uncertainty. The proposed methodology enables the optimization of investments in energy efficiency with respect to their net present value or an environmental objective. In particular, as a result of the optimization approach, complex investment alternatives, allowing for combinations of energy efficiency measures, can be analyzed. Uncertainties as well as time-dependent parameters, such as energy prices and policies, are modelled using a scenario-based approach, enabling the identification of robust investment solutions. The methodology is primarily an aid for decision-makers in industry, but it will also provide insight for policy-makers into how uncertainties regarding future price levels and policy instruments affect the decisions on investments in energy efficiency measures. (author)
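A toy sketch of the scenario-based evaluation described above: each investment alternative is scored by its probability-weighted net present value across energy-price scenarios. All scenario prices, probabilities, alternative names and figures below are invented; the authors' five-step stochastic-programming methodology is far richer than this expected-value comparison:

```python
# Energy-price scenarios: (electricity, biofuel) prices in EUR/MWh, with
# subjective probabilities (all numbers hypothetical)
scenarios = {
    "low":  {"elec": 30.0, "fuel": 15.0, "prob": 0.3},
    "mid":  {"elec": 50.0, "fuel": 25.0, "prob": 0.5},
    "high": {"elec": 80.0, "fuel": 40.0, "prob": 0.2},
}

# Each alternative: upfront cost and yearly MWh of electricity/biofuel sold
alternatives = {
    "sell_electricity": {"cost": 4.0e6, "elec_mwh": 20_000, "fuel_mwh": 0},
    "sell_biofuel":     {"cost": 2.5e6, "elec_mwh": 0,      "fuel_mwh": 30_000},
}

def npv(alt, sc, years=15, rate=0.08):
    annual = alt["elec_mwh"] * sc["elec"] + alt["fuel_mwh"] * sc["fuel"]
    annuity = (1 - (1 + rate) ** -years) / rate   # present value of 1 EUR/year
    return annual * annuity - alt["cost"]

expected = {
    name: sum(sc["prob"] * npv(alt, sc) for sc in scenarios.values())
    for name, alt in alternatives.items()
}
best = max(expected, key=expected.get)
```

A robust analysis in the spirit of the paper would go further, e.g. also checking each alternative's worst-case scenario rather than ranking on expected NPV alone.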

  7. An optimization methodology for identifying robust process integration investments under uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden); Patriksson, Michael [Department of Mathematical Sciences, Chalmers University of Technology and Department of Mathematical Sciences, University of Gothenburg, SE-412 96 Goeteborg (Sweden)

    2009-02-15

    Uncertainties in future energy prices and policies strongly affect decisions on investments in process integration measures in industry. In this paper, we present a five-step methodology for the identification of robust investment alternatives incorporating explicitly such uncertainties in the optimization model. Methods for optimization under uncertainty (or, stochastic programming) are thus combined with a deep understanding of process integration and process technology in order to achieve a framework for decision-making concerning the investment planning of process integration measures under uncertainty. The proposed methodology enables the optimization of investments in energy efficiency with respect to their net present value or an environmental objective. In particular, as a result of the optimization approach, complex investment alternatives, allowing for combinations of energy efficiency measures, can be analyzed. Uncertainties as well as time-dependent parameters, such as energy prices and policies, are modelled using a scenario-based approach, enabling the identification of robust investment solutions. The methodology is primarily an aid for decision-makers in industry, but it will also provide insight for policy-makers into how uncertainties regarding future price levels and policy instruments affect the decisions on investments in energy efficiency measures. (author)

  8. Use of Sobol's quasirandom sequence generator for integration of modified uncertainty importance measure

    International Nuclear Information System (INIS)

    Homma, Toshimitsu; Saltelli, A.

    1995-01-01

    Sensitivity analysis of model output is relevant to a number of practices, including verification of models and computer code quality assurance. It deals with the identification of influential model parameters, especially in complex models implemented in computer programs with many uncertain input variables. In a recent article, a new method for sensitivity analysis, named HIM*, based on a rank transformation of the uncertainty importance measure suggested by Hora and Iman, was proved very powerful for performing automated sensitivity analysis of model output, even in the presence of model non-monotonicity. The same was not true of other widely used non-parametric techniques such as standardized rank regression coefficients. A drawback of the HIM* method was the large dimension of the stochastic sample needed for its estimation, which made HIM* impracticable for systems with a large number of uncertain parameters. In the present note a more effective sampling algorithm, based on Sobol's quasirandom generator, is coupled with HIM*, thereby greatly reducing the sample size needed for an effective identification of influential variables. The performance of the new technique is investigated for two different benchmarks. (author)
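
A generic variance-based importance measure of this family can be sketched with a Saltelli-style pick-freeze estimator; note this is not the HIM* estimator itself, and plain pseudo-random sampling stands in for the Sobol' quasirandom sequence to keep the sketch dependency-free:

```python
import numpy as np

def first_order_indices(model, dim, n=20_000, rng=None):
    """Pick-freeze estimates of first-order variance-based sensitivity
    indices.  A Sobol' quasirandom sequence would converge faster than
    the pseudo-random sampling used here."""
    rng = np.random.default_rng(rng)
    A = rng.uniform(-1, 1, (n, dim))
    B = rng.uniform(-1, 1, (n, dim))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    indices = []
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # re-sample only input x_i
        indices.append(np.mean(fB * (model(ABi) - fA)) / var)
    return indices

# Non-monotonic test model: rank-based techniques struggle with x1 here,
# but the variance-based index still flags it as dominant.
model = lambda x: x[:, 0] ** 2 + 0.1 * x[:, 1]
s1, s2 = first_order_indices(model, dim=2, rng=42)
print(f"S1 = {s1:.2f}, S2 = {s2:.2f}")
```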

  9. Managerial conflict management in five European countries : The importance of power distance, uncertainty avoidance, and masculinity

    NARCIS (Netherlands)

    Van Oudenhoven, J.P.; Mechelse, L.; De Dreu, C.K.W.

    This research deals with managerial conflict management in Denmark, the United Kingdom, the Netherlands, Spain, and Belgium. According to Hofstede (1991), these countries' cultures differ primarily in terms of uncertainty avoidance, power distance, and masculinity-femininity. The differences in

  10. Spillovers between energy and FX markets: The importance of asymmetry, uncertainty and business cycle

    International Nuclear Information System (INIS)

    Khalifa, Ahmed; Caporin, Massimiliano; Hammoudeh, Shawkat

    2015-01-01

    This study constructs a theoretical volatility transmission model for petroleum and FX markets, taking into account major stylized facts, uncertainty measures, and the interactions between them over stages of the business cycle. It examines the impacts of these different specifications and economic factors on the spillovers between the considered markets. The results show that the impacts of the "own" shocks (petroleum on petroleum and currency on currency) are statistically significant and positive in almost all cases, as expected, for the models of natural gas and WTI oil, irrespective of the currency considered. The asymmetry effect is stronger in the oil than in the natural gas markets. There is strong and significant evidence that uncertainty affects volatility much more than the mean. For WTI oil, almost all policy and other uncertainty measures lead to an increase in the conditional variance. For currencies, coefficients are commonly significant independent of the presence of petroleum commodities in the bivariate model. The striking result for natural gas is the limited statistical relevance of the economic policy and other uncertainty measures, due to the long contracts that characterize this market. Finally, common macroeconomic forces associated with the business cycle can drive these petroleum and currency markets and may cause jumps and co-jumps in their volatility. The conclusion provides policy implications of the paper's results. - Highlights: • Examine the impacts of uncertainty measures on energy and currency interaction. • Examine the impacts of asymmetry on energy and currency interactions. • There is stronger asymmetry in oil compared to natural gas. • Uncertainty measures have an impact on volatility dynamics for oil and currencies. • Uncertainty measures do not have an impact on natural gas.

  11. A bottom-up approach in estimating the measurement uncertainty and other important considerations for quantitative analyses in drug testing for horses.

    Science.gov (United States)

    Leung, Gary N W; Ho, Emmie N M; Kwok, W Him; Leung, David K K; Tang, Francis P W; Wan, Terence S M; Wong, April S Y; Wong, Colton H F; Wong, Jenny K Y; Yu, Nola H

    2007-09-07

    Quantitative determination, particularly for threshold substances in biological samples, is much more demanding than qualitative identification. A proper assessment of any quantitative determination requires the measurement uncertainty (MU) associated with the determined value. The International Standard ISO/IEC 17025, "General requirements for the competence of testing and calibration laboratories", has more prescriptive requirements on the MU than the document it superseded, ISO/IEC Guide 25. Under the 2005 or 1999 versions of the new standard, an estimation of the MU is mandatory for all quantitative determinations. To comply with the new requirement, a protocol was established in the authors' laboratory in 2001. The protocol has since evolved based on practical experience, and a refined version was adopted in 2004. This paper describes the authors' approach to establishing the MU, as well as some other important considerations, for the quantification of threshold substances in biological samples as applied in the area of doping control for horses. The testing of threshold substances can be viewed as a compliance test (or testing to a specified limit); as such, it should only be necessary to establish the MU at the threshold level. The steps in the "Bottom-Up" approach adopted are similar to those described in the EURACHEM/CITAC guide, "Quantifying Uncertainty in Analytical Measurement". They involve first specifying the measurand, including the relationship between the measurand and the input quantities upon which it depends. This is followed by identifying all applicable uncertainty contributions using a "cause and effect" diagram. The magnitude of each uncertainty component is then calculated and converted to a standard uncertainty. A recovery study is also conducted to determine whether the method bias is significant and whether a recovery (or correction) factor needs to be applied.
All standard uncertainties with values greater than 30% of the largest one are then used to
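
The final combination step of such a bottom-up budget (keep only the dominant components, combine them root-sum-of-squares, then expand with a coverage factor k = 2) might be sketched as follows; the component names and values below are invented for illustration:

```python
import math

# Hypothetical uncertainty budget for a threshold substance, evaluated at
# the threshold concentration; relative standard uncertainties as fractions.
components = {
    "calibration_standard": 0.010,
    "precision":            0.035,
    "recovery":             0.020,
    "matrix_effect":        0.008,
}

# Per the described protocol, retain only components larger than 30% of the biggest.
largest = max(components.values())
retained = {k: v for k, v in components.items() if v > 0.3 * largest}

u_combined = math.sqrt(sum(v ** 2 for v in retained.values()))  # root-sum-of-squares
U_expanded = 2.0 * u_combined       # coverage factor k = 2 (~95% confidence)
print(f"retained: {sorted(retained)}")
print(f"combined u = {u_combined:.4f}, expanded U (k=2) = {U_expanded:.4f}")
```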

  12. Identifying significant uncertainties in thermally dependent processes for repository performance analysis

    International Nuclear Information System (INIS)

    Gansemer, J.D.; Lamont, A.

    1994-01-01

    In order to study the performance of the potential Yucca Mountain Nuclear Waste Repository, scientific investigations are being conducted to reduce the uncertainty about process models and system parameters. This paper is intended to demonstrate a method for determining a strategy for the cost-effective management of these investigations. It is not meant to be a complete study of all processes and interactions, but it outlines a method that can be applied to more in-depth investigations

  13. Calibration/Validation Error Budgets, Uncertainties, Traceability and Their Importance to Imaging Spectrometry

    Science.gov (United States)

    Thome, K.

    2016-01-01

    Knowledge of uncertainties and errors is essential for comparisons of remote sensing data across time, space, and spectral domains. Vicarious radiometric calibration is used to demonstrate the need for uncertainty knowledge and to provide an example error budget. The sample error budget serves as an example of the questions and issues that need to be addressed by the calibration/validation community, as accuracy requirements for imaging spectroscopy data will continue to become more stringent in the future. Error budgets will also be critical to ensure consistency between the range of imaging spectrometers expected to be launched in the next five years.

  14. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the ori...

  15. Funnel plot control limits to identify poorly performing healthcare providers when there is uncertainty in the value of the benchmark.

    Science.gov (United States)

    Manktelow, Bradley N; Seaton, Sarah E; Evans, T Alun

    2016-12-01

    There is an increasing use of statistical methods, such as funnel plots, to identify poorly performing healthcare providers. Funnel plots comprise the construction of control limits around a benchmark, and providers with outcomes falling outside the limits are investigated as potential outliers. The benchmark is usually estimated from observed data, but uncertainty in this estimate is usually ignored when constructing control limits. In this paper, the use of funnel plots in the presence of uncertainty in the value of the benchmark is reviewed for outcomes from a Binomial distribution. Two methods to derive the control limits are shown: (i) prediction intervals; (ii) tolerance intervals. Tolerance intervals formally include the uncertainty in the value of the benchmark while prediction intervals do not. The probability properties of 95% control limits derived using each method were investigated through hypothesised scenarios. Neither prediction intervals nor tolerance intervals produce funnel plot control limits that satisfy the nominal probability characteristics when there is uncertainty in the value of the benchmark. This is not necessarily to say that funnel plots have no role to play in healthcare, but without the development of intervals satisfying the nominal probability characteristics they must be interpreted with care. © The Author(s) 2014.
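
A rough sketch of binomial funnel-plot limits using the normal approximation is given below; folding the benchmark's own sampling variance into the limit width is shown only as a simplified stand-in for a formal tolerance interval, and the benchmark value and sizes are invented:

```python
import math

def funnel_limits(p0, n, z=1.96, benchmark_n=None):
    """Approximate 95% funnel-plot control limits for a binomial outcome.
    If benchmark_n is given, the sampling variance of the estimated
    benchmark is added (a crude tolerance-interval-style widening);
    otherwise the benchmark is treated as known, as in a standard
    prediction-interval limit."""
    var = p0 * (1 - p0) / n
    if benchmark_n is not None:
        var += p0 * (1 - p0) / benchmark_n   # extra benchmark uncertainty
    half = z * math.sqrt(var)
    return max(0.0, p0 - half), min(1.0, p0 + half)

p0 = 0.08                       # benchmark rate estimated from pooled data
lo, hi = funnel_limits(p0, n=200)
lo_t, hi_t = funnel_limits(p0, n=200, benchmark_n=5_000)
print(f"known benchmark:     ({lo:.3f}, {hi:.3f})")
print(f"estimated benchmark: ({lo_t:.3f}, {hi_t:.3f})")   # slightly wider
```

The paper's point is precisely that neither construction attains the nominal probability properties once benchmark uncertainty is present, so limits like these should be read as indicative rather than exact.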

  16. Importance of tree basic density in biomass estimation and associated uncertainties

    DEFF Research Database (Denmark)

    Njana, Marco Andrew; Meilby, Henrik; Eid, Tron

    2016-01-01

    Key message: Aboveground and belowground tree basic densities varied between and within the three mangrove species. If appropriately determined and applied, basic density may be useful in estimation of tree biomass. Predictive accuracy of the common (i.e. multi-species) models including aboveground [...] of sustainable forest management, conservation and enhancement of carbon stocks (REDD+) initiatives offer an opportunity for sustainable management of forests including mangroves. In carbon accounting for REDD+, it is required that carbon estimates prepared for monitoring, reporting and verification schemes [...] and examine uncertainties in estimation of tree biomass using indirect methods. Methods: This study focused on three dominant mangrove species (Avicennia marina (Forssk.) Vierh, Sonneratia alba J. Smith and Rhizophora mucronata Lam.) in Tanzania. A total of 120 trees were destructively sampled for aboveground...

  17. The National Ecosystem Services Classification System: A Framework for Identifying and Reducing Relevant Uncertainties

    Science.gov (United States)

    Rhodes, C. R.; Sinha, P.; Amanda, N.

    2013-12-01

    In recent years the gap between what scientists know and what policymakers should appreciate in environmental decision making has received more attention, as the costs of the disconnect have become more apparent to both groups. Particularly for water-related policies, the EPA's Office of Water has struggled with benefit estimates held low by the inability to quantify ecological and economic effects that theory, modeling, and anecdotal or isolated case evidence suggest may prove to be larger. Better coordination with ecologists and hydrologists is being explored as a solution. The ecosystem services (ES) concept, now nearly two decades old, links ecosystem functions and processes to the human value system. But there remains no clear mapping of which ecosystem goods and services affect which individual or economic values. The National Ecosystem Services Classification System (NESCS, 'nexus') project brings together ecologists, hydrologists, and social scientists to do this mapping for aquatic and other ecosystem service-generating systems. The objective is to greatly reduce the uncertainty in water-related policy making by mapping and ultimately quantifying the various functions and products of aquatic systems, as well as how changes to aquatic systems impact the human economy and individual levels of non-monetary appreciation for those functions and products. Primary challenges to fostering interaction between scientists, social scientists, and policymakers are the lack of a common vocabulary and the need for a cohesive, comprehensive framework that organizes concepts across disciplines and accommodates scientific data from a range of sources. NESCS builds the vocabulary and the framework so both may inform a scalable transdisciplinary policy-making application.
This talk presents for discussion the process and progress in developing both this vocabulary and a classifying framework capable of bridging the gap between a newer but existing ecosystem services classification

  18. Narrowing Historical Uncertainty: Probabilistic Classification of Ambiguously Identified Tree Species in Historical Forest Survey Data

    Science.gov (United States)

    David J. Mladenoff; Sally E. Dahir; Eric V. Nordheim; Lisa A. Schulte; Glenn G. Gutenspergen

    2002-01-01

    Historical data have increasingly become appreciated for insight into the past conditions of ecosystems. Uses of such data include assessing the extent of ecosystem change; deriving ecological baselines for management, restoration, and modeling; and assessing the importance of past conditions to the composition and function of current systems. One historical data set...

  19. Assessing Uncertainty in Deep Learning Techniques that Identify Atmospheric Rivers in Climate Simulations

    Science.gov (United States)

    Mahesh, A.; Mudigonda, M.; Kim, S. K.; Kashinath, K.; Kahou, S.; Michalski, V.; Williams, D. N.; Liu, Y.; Prabhat, M.; Loring, B.; O'Brien, T. A.; Collins, W. D.

    2017-12-01

    Atmospheric rivers (ARs) can be the difference between California facing drought or hurricane-level storms. ARs are a form of extreme weather defined as long, narrow columns of moisture which transport water vapor outside the tropics. When they make landfall, they release the vapor as rain or snow. Convolutional neural networks (CNNs), a machine learning technique that uses filters to recognize features, are the leading computer vision mechanism for classifying multichannel images. CNNs have been proven to be effective in identifying extreme weather events in climate simulation output (Liu et al. 2016, ABDA'16, http://bit.ly/2hlrFNV). Here, we compare several CNN architectures, tuned with different hyperparameters and training schemes. We compare two-layer, three-layer, four-layer, and sixteen-layer CNNs' ability to recognize ARs in Community Atmospheric Model version 5 output, and we explore the ability of data augmentation and pre-trained models to increase the accuracy of the classifier. Because pre-training the model with regular images (i.e. benches, stoves, and dogs) yielded the highest accuracy rate, this strategy, also known as transfer learning, may be vital in future scientific CNNs, which likely will not have access to a large labelled training dataset. By choosing the most effective CNN architecture, climate scientists can build an accurate historical database of ARs, which can be used to develop a predictive understanding of these phenomena.
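
The filter-based feature detection at the heart of these CNN architectures can be illustrated with a minimal numpy sketch; the "moisture field" and hand-crafted kernel below are toy inventions (in a real CNN the kernels are learned from CAM5 data, not designed):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation followed by a ReLU: the core
    operation a CNN layer applies with many learned kernels in parallel."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)          # ReLU activation

# Toy "moisture field": a narrow vertical band, loosely mimicking an AR's
# long, narrow column of water vapour in a gridded field.
field = np.zeros((8, 8))
field[:, 3] = 1.0
vertical_detector = np.array([[-1., 2., -1.]] * 3)  # responds to narrow vertical stripes
response = conv2d(field, vertical_detector)
print(response.max())   # strong activation where the band sits, zero elsewhere
```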

  20. Risk management of energy system for identifying optimal power mix with financial-cost minimization and environmental-impact mitigation under uncertainty

    International Nuclear Information System (INIS)

    Nie, S.; Li, Y.P.; Liu, J.; Huang, Charley Z.

    2017-01-01

    An interval-stochastic risk management (ISRM) method is developed to control the variability of the recourse cost as well as to capture the notion of risk in stochastic programming. The ISRM method can examine various policy scenarios associated with economic penalties under uncertainties presented as probability distributions and interval values. An ISRM model is then formulated to identify the optimal power mix for Beijing's energy system. Tradeoffs between risk and cost are evaluated, indicating that any change in targeted cost and risk level would yield different expected costs. Results reveal that the inherent uncertainty of system components and the risk attitude of decision makers have significant effects on the city's energy-supply and electricity-generation schemes, as well as on system cost and probabilistic penalty. Results also disclose that importing electricity as a recourse action to compensate for local shortage would be enforced. Imported electricity would increase with a reduced risk level; under every risk level, more electricity would be imported as demand increases. The findings can facilitate the local authority in identifying desired strategies for the city's energy planning and management in association with financial-cost minimization and environmental-impact mitigation. - Highlights: • Interval-stochastic risk management method is launched to identify optimal power mix. • It is advantageous in capturing the notion of risk in stochastic programming. • Results reveal that risk attitudes can affect optimal power mix and financial cost. • Developing renewable energies would enhance the sustainability of energy management. • Importing electricity as an action to compensate for local shortage would be enforced.
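
One common way to capture the notion of risk in scenario-based planning of this kind is a tail-risk measure such as conditional value-at-risk (CVaR); the sketch below is a generic illustration with invented scenario costs, not the paper's ISRM formulation:

```python
# Illustrative sketch: scenario costs for an energy-supply plan (invented numbers).
def expected_cost(costs, probs):
    return sum(c * p for c, p in zip(costs, probs))

def cvar(costs, probs, alpha=0.9):
    """Conditional value-at-risk: the expected cost in the worst
    (1 - alpha) probability tail of the scenario distribution."""
    order = sorted(zip(costs, probs), key=lambda cp: cp[0])
    tail, acc = 0.0, 0.0
    for c, p in reversed(order):          # walk down from the worst scenario
        take = min(p, (1 - alpha) - acc)
        if take <= 0:
            break
        tail += c * take
        acc += take
    return tail / (1 - alpha)

costs = [100., 120., 150., 300.]          # total system cost per scenario
probs = [0.4, 0.3, 0.25, 0.05]
print(expected_cost(costs, probs), cvar(costs, probs, alpha=0.9))
```

A risk-neutral planner minimizes the expected cost alone; a risk-averse planner trades some expected cost for a lower CVaR, mirroring the cost-risk tradeoff the abstract describes.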

  1. Methodology for identifying boundaries of systems important to safety in CANDU nuclear power plants

    International Nuclear Information System (INIS)

    Therrien, S.; Komljenovic, D.; Therrien, P.; Ruest, C.; Prevost, P.; Vaillancourt, R.

    2007-01-01

    This paper presents a methodology developed to identify the boundaries of the systems important to safety (SIS) at the Gentilly-2 Nuclear Power Plant (NPP), Hydro-Quebec. The SIS boundaries identification considers nuclear safety only. Components that are not identified as important to safety are systematically identified as related to safety. A global assessment process such as the WANO/INPO AP-913 'Equipment Reliability Process' will be needed to implement adequate changes in the management rules for those components. The paper presents results from applying the methodology to the Shutdown Systems 1 and 2 (SDS 1, 2) and to the Emergency Core Cooling System (ECCS). This validation process enabled fine-tuning of the methodology, a better estimate of the effort required to evaluate a system, and identification of the components important to safety in these systems. (author)

  2. Genes Important for Schizosaccharomyces pombe Meiosis Identified Through a Functional Genomics Screen

    Science.gov (United States)

    Blyth, Julie; Makrantoni, Vasso; Barton, Rachael E.; Spanos, Christos; Rappsilber, Juri; Marston, Adele L.

    2018-01-01

    Meiosis is a specialized cell division that generates gametes, such as eggs and sperm. Errors in meiosis result in miscarriages and are the leading cause of birth defects; however, the molecular origins of these defects remain unknown. Studies in model organisms are beginning to identify the genes and pathways important for meiosis, but the parts list is still poorly defined. Here we present a comprehensive catalog of genes important for meiosis in the fission yeast, Schizosaccharomyces pombe. Our genome-wide functional screen surveyed all nonessential genes for roles in chromosome segregation and spore formation. Novel genes important at distinct stages of the meiotic chromosome segregation and differentiation program were identified. Preliminary characterization implicated three of these genes in centrosome/spindle pole body, centromere, and cohesion function. Our findings represent a near-complete parts list of genes important for meiosis in fission yeast, providing a valuable resource to advance our molecular understanding of meiosis. PMID:29259000

  3. Transient flow conditions in probabilistic wellhead protection: importance and ways to manage spatial and temporal uncertainty in capture zone delineation

    Science.gov (United States)

    Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.

    2012-12-01

    "From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction that any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This will provide water managers additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of super-positioned flow components with time-variable coefficients. 
We assume an instantaneous development of steady-state flow conditions after each temporal change in driving forces, following

  4. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  5. Use of amplified fragment length polymorphism analysis to identify medically important Candida spp., including C. dubliniensis.

    NARCIS (Netherlands)

    Borst, A; Theelen, B; Reinders, E; Boekhout, T; Fluit, AC; Savelkoul, P.H.M.

    2003-01-01

    Non-Candida albicans Candida species are increasingly being isolated. These species show differences in levels of resistance to antimycotic agents and mortality. Therefore, it is important to be able to correctly identify the causative organism to the species level. Identification of C. dubliniensis

  6. Identifying and assessing uncertainty in hydrological pathways: a novel approach to end member mixing in a Scottish agricultural catchment

    Science.gov (United States)

    Soulsby, C.; Petry, J.; Brewer, M. J.; Dunn, S. M.; Ott, B.; Malcolm, I. A.

    2003-04-01

    A hydrograph separation based upon end member mixing was carried out to assess the relative importance of the hydrological pathways providing the main sources of runoff during five storm events in a 14.5 km² agricultural catchment in north east Scotland. The method utilised event-specific end member chemistries to differentiate three catchment-scale hydrological pathways on the basis of observed Si and NO3-N concentrations in sampled source waters. These were overland flow (OF) (low Si and intermediate NO3-N); subsurface storm flow (high Si and high NO3-N); and groundwater flow (high Si and intermediate NO3-N). The hydrograph separation explicitly accounted for uncertainty in the spatial and temporal variation in end member chemistry using Bayesian statistical methods, which assumed that each end member arose from a bivariate normal distribution whose mean vectors and covariance matrices could be estimated. Markov chain Monte Carlo methods were used to model the average and 95 percentile maximum and minimum contributions that each end member made to stream water samples during storm events. Although there is large uncertainty over the contributions of each end member to specific events, the analysis produced hydrograph separations that were broadly believable on the basis of hydrometric observations in the catchment. Moreover, by using event-specific end member compositions, the method appeared sensitive to the unique combination of event characteristics, antecedent conditions and seasonality in terms of producing feasible separations for very different events. It is concluded that OF generally dominates the storm peak and provides the main flow path by which P is transferred to stream channels during storm events, whilst subsurface storm flows usually dominate the storm hydrograph volumetrically and route NO3-rich soil water into streams. Consequently, nutrient enrichment in such streams is largely mediated by event-based hydrological flow paths, a finding
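
Setting aside the Bayesian treatment of end-member uncertainty, the three-component mixing calculation itself reduces to a small linear system (mass balance plus the two tracers); the end-member and stream concentrations below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical end-member concentrations (mg/L), following the qualitative
# pattern in the abstract: overland flow low in Si, subsurface storm flow
# high in both Si and NO3-N, groundwater high Si / intermediate NO3-N.
#                overland  subsurface  groundwater
si  = np.array([   0.5,      4.0,        4.5 ])   # Si
no3 = np.array([   2.0,      6.0,        2.5 ])   # NO3-N

stream_si, stream_no3 = 2.8, 3.9                  # observed stream sample

A = np.vstack([np.ones(3), si, no3])              # mass balance + two tracers
b = np.array([1.0, stream_si, stream_no3])
fractions = np.linalg.solve(A, b)                 # fraction of flow per pathway
print(dict(zip(["overland", "subsurface", "groundwater"], fractions.round(3))))
```

In the study each end member's chemistry is itself uncertain, so this deterministic solve would be repeated over MCMC draws of the end-member means to produce the reported average and 95 percentile contributions.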

  7. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    Science.gov (United States)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
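
The analysis-of-variance decomposition described in this abstract can be sketched for a synthetic ensemble with two factors (forcing realization and behavioural parameter set) and one simulated flow value per cell; the effect sizes below are invented:

```python
import numpy as np

def anova_decomposition(y):
    """Two-way ANOVA split of ensemble variance into shares from the row
    factor (forcing), the column factor (parameter sets), and their
    interaction/residual.  Shares sum to one by the ANOVA identity."""
    grand = y.mean()
    row_eff = y.mean(axis=1) - grand          # forcing main effects
    col_eff = y.mean(axis=0) - grand          # parameter main effects
    inter = y - y.mean(axis=1, keepdims=True) - y.mean(axis=0, keepdims=True) + grand
    ss_total = ((y - grand) ** 2).sum()
    return {
        "forcing":     (row_eff ** 2).sum() * y.shape[1] / ss_total,
        "parameters":  (col_eff ** 2).sum() * y.shape[0] / ss_total,
        "interaction": (inter ** 2).sum() / ss_total,
    }

rng = np.random.default_rng(0)
n_forcing, n_param = 50, 20
f = rng.normal(0, 2.0, n_forcing)[:, None]    # strong forcing effect
p = rng.normal(0, 0.5, n_param)[None, :]      # weaker parameter-set effect
y = 10.0 + f + p + rng.normal(0, 0.2, (n_forcing, n_param))  # noise -> interaction
shares = anova_decomposition(y)
print({k: round(v, 3) for k, v in shares.items()})
```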

  8. Study of a methodology of identifying important research problems by the PIRT process

    International Nuclear Information System (INIS)

    Aoki, Takayuki; Takagi, Toshiyuki; Urayama, Ryoichi; Komura, Ichiro; Furukawa, Takashi; Yusa, Noritaka

    2014-01-01

    In this paper, we propose a new methodology for identifying important research problems that must be solved to improve the performance of specific technologies, based on the phenomena identification and ranking table (PIRT) process, which has been used as a methodology for demonstrating the validity of best-estimate simulation codes in US Nuclear Regulatory Commission (USNRC) licensing of nuclear power plants. The new methodology makes it possible to identify important factors affecting the performance of the technologies, from the viewpoint of the figure of merit, and the problems associated with them, while keeping the fundamental concepts of the original PIRT process. We also demonstrate the effectiveness of the new methodology by applying it to the task of extracting research problems for improving the inspection accuracy of ultrasonic testing or eddy current testing when inspecting objects having cracks due to fatigue or stress corrosion cracking. (author)

  9. Study of a methodology of identifying important research problems by the PIRT process

    International Nuclear Information System (INIS)

    Aoki, Takayuki; Takagi, Toshiyuki; Urayama, Ryoichi; Komura, Ichiro; Furukawa, Takashi; Yusa, Noritaka

    2013-01-01

    In this paper, we propose a new methodology of identifying important research problems to be solved to improve the performance of some specific scientific technologies by the phenomena identification and ranking table (PIRT) process, which has been used as a methodology for demonstrating the validity of the best estimate simulation codes in USNRC licensing of nuclear power plants. It keeps the fundamental concepts of the original PIRT process but makes it possible to identify important factors affecting the performance of the technologies from the viewpoint of the figure of merit and problems associated with them, which need to be solved to improve the performance. Also in this paper, we demonstrate the effectiveness of the developed method by showing a specific example of the application to physical events or phenomena in objects having fatigue or SCC crack(s) under ultrasonic testing and eddy current testing. (author)

  10. A framework for model-based optimization of bioprocesses under uncertainty: Identifying critical parameters and operating variables

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development and application of a systematic model-based framework for bioprocess optimization, evaluated on a cellulosic ethanol production case study. The implementation of the framework involves the use of dynamic simulations, sophisticated uncertainty analysis (Monte...

  11. UNCERTAINTY AND ORIENTATION TOWARDS ERRORS IN TIMES OF CRISIS. THE IMPORTANCE OF BUILDING CONFIDENCE, ENCOURAGING COLLECTIVE EFFICACY

    Directory of Open Access Journals (Sweden)

    Carmen Tabernero

    2014-05-01

    Full Text Available The current economic crisis is triggering a new scenario of uncertainty, which is affecting the organizational behavior of individuals and working teams. In contexts of uncertainty, organizational performance suffers a significant decline: workers face the perceived threat of job loss, distrust their organization, and perceive that they must compete with their peers. This paper analyzes the effect of uncertainty on both the performance and the affective states of workers, as well as the cognitive, affective and personality strategies (goals and error orientation) used to cope with uncertainty, framing errors either as learning opportunities or as situations to be avoided. Moreover, this paper explores gender differences both in coping styles in situations of uncertainty and in the results of a training program based on error affect inoculation, in which positive emotional responses were emphasized. Finally, we discuss the relevance of generating practices and experiences of team cooperation that build trust and promote collective efficacy in work teams.

  12. Resequencing 50 accessions of cultivated and wild rice yields markers for identifying agronomically important genes

    DEFF Research Database (Denmark)

    Xu, Xun; Liu, Xin; Ge, Song

    2012-01-01

    Rice is a staple crop that has undergone substantial phenotypic and physiological changes during domestication. Here we resequenced the genomes of 40 cultivated accessions selected from the major groups of rice and 10 accessions of their wild progenitors (Oryza rufipogon and Oryza nivara) to >15× raw data coverage. We investigated genome-wide variation patterns in rice and obtained 6.5 million high-quality single nucleotide polymorphisms (SNPs) after excluding sites with missing data in any accession. Using these population SNP data, we identified thousands of genes with significantly lower diversity in cultivated but not wild rice, which represent candidate regions selected during domestication. Some of these variants are associated with important biological features, whereas others have yet to be functionally characterized. The molecular markers we have identified should be valuable...

  13. On the importance of identifying, characterizing, and predicting fundamental phenomena towards microbial electrochemistry applications.

    Science.gov (United States)

    Torres, César Iván

    2014-06-01

    The development of microbial electrochemistry research toward technological applications has increased significantly in the past years, leading to many process configurations. This short review focuses on the need to identify and characterize the fundamental phenomena that control the performance of microbial electrochemical cells (MXCs). Specifically, it discusses the importance of recent efforts to discover and characterize novel microorganisms for MXC applications, as well as recent developments to understand transport limitations in MXCs. As we increase our understanding of how MXCs operate, it is imperative to continue modeling efforts in order to effectively predict their performance, design efficient MXC technologies, and implement them commercially. Thus, the success of MXC technologies largely depends on the path of identifying, understanding, and predicting fundamental phenomena that determine MXC performance. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Identifying the effects of parameter uncertainty on the reliability of modeling the stability of overhanging, multi-layered, river banks

    Science.gov (United States)

    Samadi, A.; Amiri-Tokaldany, E.; Davoudi, M. H.; Darby, S. E.

    2011-11-01

    Composite river banks consist of a basal layer of non-cohesive material overlain by a cohesive layer of fine-grained material. In such banks, fluvial erosion of the lower, non-cohesive, layer typically occurs at a much higher rate than erosion of the upper part of the bank. Consequently, such banks normally develop a cantilevered bank profile, with bank retreat of the upper part of the bank taking place predominantly by the failure of these cantilevers. To predict the undesirable impacts of this type of bank retreat, a number of bank stability models have been presented in the literature. These models typically express bank stability by defining a factor of safety as the ratio of resisting and driving forces acting on the incipient failure block. These forces are affected by a range of controlling factors that include such aspects as the overhanging block geometry, and the geotechnical properties of the bank materials. In this paper, we introduce a new bank stability relation (for shear-type cantilever failures) that considers the hydrological status of cantilevered riverbanks, while beam-type failures are analyzed using a previously proposed relation. We employ these stability models to evaluate the effects of parameter uncertainty on the reliability of riverbank stability modeling of overhanging banks. This is achieved by employing a simple model of overhanging failure with respect to shear and beam failure mechanisms in a series of sensitivity tests and Monte Carlo analyses to identify, for each model parameter, the range of values that induce significant changes in the simulated factor of safety. The results show that care is required in parameterising (i) the geometrical shape of the overhanging-block and (ii) the bank material cohesion and unit weight, as predictions of bank stability are sensitive to variations of these factors.
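The Monte Carlo sensitivity analysis described above can be illustrated with a toy factor-of-safety calculation. The parameter distributions and the simplified shear-failure expression below are invented for the sketch and are not the stability relations proposed in the paper; the point is only to show how sampling the inputs exposes which parameters the simulated factor of safety responds to.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n = 10_000

# Invented parameter distributions for an overhanging block (per unit
# bank length); the real study uses site-specific geotechnical data.
c = rng.normal(15.0, 3.0, n)        # cohesion, kPa
gamma = rng.normal(18.0, 1.0, n)    # unit weight, kN/m3
width = rng.normal(0.8, 0.1, n)     # overhang width, m
height = rng.normal(1.5, 0.2, n)    # block height, m

# Simplified shear-type failure: cohesion acting on the vertical
# failure plane resists the weight of the overhanging block.
resisting = c * height
driving = gamma * width * height
fs = resisting / driving            # note: height cancels algebraically

print(f"P(FS < 1) = {(fs < 1).mean():.2f}")
for name, v in [("cohesion", c), ("unit weight", gamma),
                ("width", width), ("height", height)]:
    print(f"{name:12s} rank corr with FS: {spearmanr(v, fs)[0]:+.2f}")
```

The near-zero rank correlation for block height reproduces, in miniature, how a sensitivity analysis can reveal parameters to which the chosen failure mechanism is insensitive.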

  15. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
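The pattern-detection procedures listed above lend themselves to a compact sketch. The example below applies the first three (correlation coefficient, rank correlation, and Kruskal-Wallis across bins of the input) to a synthetic scatterplot; the symmetric, non-monotonic response and the bin choices are invented, and were picked as the classic case where the simpler tests carry no signal while the binned test still flags the variable.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 300
x = rng.uniform(0, 1, n)                        # sampled input variable
y = 4 * x * (1 - x) + rng.normal(0, 0.3, n)     # symmetric, non-monotonic response

# (1) linear relationship: correlation coefficient
r, p_lin = stats.pearsonr(x, y)
# (2) monotonic relationship: rank correlation
rho, p_mono = stats.spearmanr(x, y)
# (3) trend in central tendency: Kruskal-Wallis across quantile bins of x
bins = np.digitize(x, np.quantile(x, [0.2, 0.4, 0.6, 0.8]))
h, p_kw = stats.kruskal(*[y[bins == k] for k in range(5)])

# A symmetric hump carries no linear or monotonic signal in expectation,
# so here only the binned test (3) reliably identifies x as important.
print(f"Pearson p={p_lin:.2f}, Spearman p={p_mono:.2f}, Kruskal-Wallis p={p_kw:.1e}")
```

Extending the same scheme to trends in variability (variances, interquartile ranges per bin) and to a chi-square test on a two-dimensional grid follows the same binning pattern.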

  16. Using the Developmental Gene Bicoid to Identify Species of Forensically Important Blowflies (Diptera: Calliphoridae)

    Directory of Open Access Journals (Sweden)

    Seong Hwan Park

    2013-01-01

    Full Text Available Identifying species of insects used to estimate postmortem interval (PMI) is a major subject in forensic entomology. Because forensic insect specimens are morphologically uniform and are obtained at various developmental stages, DNA markers are greatly needed. To develop new autosomal DNA markers to identify species, partial genomic sequences of the bicoid (bcd) genes, containing the homeobox and its flanking sequences, from 12 blowfly species (Aldrichina grahami, Calliphora vicina, Calliphora lata, Triceratopyga calliphoroides, Chrysomya megacephala, Chrysomya pinguis, Phormia regina, Lucilia ampullacea, Lucilia caesar, Lucilia illustris, Hemipyrellia ligurriens and Lucilia sericata; Calliphoridae: Diptera) were determined and analyzed. This study first sequenced the ten blowfly species other than C. vicina and L. sericata. Based on the bcd sequences of these 12 blowfly species, a phylogenetic tree was constructed that discriminates the subfamilies of Calliphoridae (Luciliinae, Chrysomyinae, and Calliphorinae) and most blowfly species. Even partial genomic sequences of about 500 bp can distinguish most blowfly species. The short intron 2 and coding sequences downstream of the bcd homeobox in exon 3 could be utilized to develop DNA markers for forensic applications. These gene sequences are important in the evolution of insect developmental biology and are potentially useful for identifying insect species in forensic science.

  17. Using the Developmental Gene Bicoid to Identify Species of Forensically Important Blowflies (Diptera: Calliphoridae)

    Science.gov (United States)

    Park, Seong Hwan; Park, Chung Hyun; Zhang, Yong; Piao, Huguo; Chung, Ukhee; Kim, Seong Yoon; Ko, Kwang Soo; Yi, Cheong-Ho; Jo, Tae-Ho; Hwang, Juck-Joon

    2013-01-01

    Identifying species of insects used to estimate postmortem interval (PMI) is a major subject in forensic entomology. Because forensic insect specimens are morphologically uniform and are obtained at various developmental stages, DNA markers are greatly needed. To develop new autosomal DNA markers to identify species, partial genomic sequences of the bicoid (bcd) genes, containing the homeobox and its flanking sequences, from 12 blowfly species (Aldrichina grahami, Calliphora vicina, Calliphora lata, Triceratopyga calliphoroides, Chrysomya megacephala, Chrysomya pinguis, Phormia regina, Lucilia ampullacea, Lucilia caesar, Lucilia illustris, Hemipyrellia ligurriens and Lucilia sericata; Calliphoridae: Diptera) were determined and analyzed. This study first sequenced the ten blowfly species other than C. vicina and L. sericata. Based on the bcd sequences of these 12 blowfly species, a phylogenetic tree was constructed that discriminates the subfamilies of Calliphoridae (Luciliinae, Chrysomyinae, and Calliphorinae) and most blowfly species. Even partial genomic sequences of about 500 bp can distinguish most blowfly species. The short intron 2 and coding sequences downstream of the bcd homeobox in exon 3 could be utilized to develop DNA markers for forensic applications. These gene sequences are important in the evolution of insect developmental biology and are potentially useful for identifying insect species in forensic science. PMID:23586044

  18. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.
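The compliance decision described above can be sketched as a simple guard-banded check, using the common convention that a result only counts as exceeding (or meeting) a specification limit when the whole interval result ± U lies on one side of it. The numbers below are illustrative.

```python
# Guard-banded compliance check: U is the expanded uncertainty,
# k = 2 giving roughly 95 % coverage for a normal error model.

def compliance(result: float, u_combined: float, limit: float, k: float = 2.0) -> str:
    U = k * u_combined
    if result - U > limit:
        return "above limit"
    if result + U < limit:
        return "below limit"
    return "inconclusive (limit inside result +/- U)"

print(compliance(52.0, 0.8, 50.0))   # whole interval above the limit
print(compliance(47.0, 1.0, 50.0))   # whole interval below the limit
print(compliance(50.5, 1.5, 50.0))   # interval straddles the limit
```

The third case is the practically awkward one: the measurement alone cannot settle whether the limit is exceeded, which is exactly why the estimation of U matters for the decision.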

  19. Identifying trial recruitment uncertainties using a James Lind Alliance Priority Setting Partnership - the PRioRiTy (Prioritising Recruitment in Randomised Trials) study.

    Science.gov (United States)

    Healy, Patricia; Galvin, Sandra; Williamson, Paula R; Treweek, Shaun; Whiting, Caroline; Maeso, Beccy; Bray, Christopher; Brocklehurst, Peter; Moloney, Mary Clarke; Douiri, Abdel; Gamble, Carrol; Gardner, Heidi R; Mitchell, Derick; Stewart, Derek; Jordan, Joan; O'Donnell, Martin; Clarke, Mike; Pavitt, Sue H; Guegan, Eleanor Woodford; Blatch-Jones, Amanda; Smith, Valerie; Reay, Hannah; Devane, Declan

    2018-03-01

    Despite the problem of inadequate recruitment to randomised trials, there is little evidence to guide researchers on decisions about how people are effectively recruited to take part in trials. The PRioRiTy study aimed to identify and prioritise important unanswered trial recruitment questions for research. The PRioRiTy study - Priority Setting Partnership (PSP) included members of the public approached to take part in a randomised trial or who have represented participants on randomised trial steering committees, health professionals and research staff with experience of recruiting to randomised trials, people who have designed, conducted, analysed or reported on randomised trials and people with experience of randomised trials methodology. This partnership was aided by the James Lind Alliance and involved eight stages: (i) identifying a unique, relevant prioritisation area within trial methodology; (ii) establishing a steering group; (iii) identifying and engaging with partners and stakeholders; (iv) formulating an initial list of uncertainties; (v) collating the uncertainties into research questions; (vi) confirming that the questions for research are a current recruitment challenge; (vii) shortlisting questions and (viii) final prioritisation through a face-to-face workshop. A total of 790 survey respondents yielded 1693 open-text answers to 6 questions, from which 1880 potential questions for research were identified. After merging duplicates, the number of questions was reduced to 496. Questions were combined further, and those that were submitted by fewer than 15 people and/or fewer than 6 of the 7 stakeholder groups were excluded from the next round of prioritisation, resulting in 31 unique questions for research. All 31 questions were confirmed as being unanswered after checking relevant, up-to-date research evidence. The 10 highest priority questions were ranked at a face-to-face workshop. The number 1 ranked question was "How can randomised trials become

  20. Identify the Important Decision Factors of Online Shopping Adoption in Indonesia

    Directory of Open Access Journals (Sweden)

    Lailatul HIJRAH

    2017-12-01

    Full Text Available The objective of this study is to identify the factors that encourage consumers to engage in online shopping. The expected contribution is to help online entrepreneurs develop suitable business strategies by clearly identifying and ranking the factors that most strongly motivate Indonesian consumers to shop online, using responses from respondents who shop both online and offline in three Indonesian cities: Jakarta, Surabaya and Samarinda. The research instruments were developed through focus group discussions (FGDs) with relevant groups, including academics, online shopping enthusiasts, suppliers and courier businesses in Jakarta, Surabaya and Samarinda, in order to elicit the factors that encourage consumers to shop online. The FGDs produced 48 items proposed for factor analysis; after extraction into eleven constructs, some items were removed because of low factor loadings. The eleven constructs, or dimensions, are trust, risk, consumer factors, website factors, price, service quality, convenience, subjective norm, product guarantee, variety of products and lifestyle. The implications of this study provide valuable insights into consumers' decisions whether or not to shop online.

  1. Systematic reviews identify important methodological flaws in stroke rehabilitation therapy primary studies: review of reviews.

    Science.gov (United States)

    Santaguida, Pasqualina; Oremus, Mark; Walker, Kathryn; Wishart, Laurie R; Siegel, Karen Lohmann; Raina, Parminder

    2012-04-01

    A "review of reviews" was undertaken to assess methodological issues in studies evaluating nondrug rehabilitation interventions in stroke patients. MEDLINE, CINAHL, PsycINFO, and the Cochrane Database of Systematic Reviews were searched from January 2000 to January 2008 within the stroke rehabilitation setting. Electronic searches were supplemented by reviews of reference lists and citations identified by experts. Eligible studies were systematic reviews; excluded citations were narrative reviews or reviews of reviews. Review characteristics and criteria for assessing methodological quality of primary studies within them were extracted. The search yielded 949 English-language citations. We included a final set of 38 systematic reviews. Cochrane reviews, which have a standardized methodology, were generally of higher methodological quality than non-Cochrane reviews. Most systematic reviews used standardized quality assessment criteria for primary studies, but not all were comprehensive. Reviews showed that primary studies had problems with randomization, allocation concealment, and blinding. Baseline comparability, adverse events, and co-intervention or contamination were not consistently assessed. Blinding of patients and providers was often not feasible and was not evaluated as a source of bias. The eligible systematic reviews identified important methodological flaws in the evaluated primary studies, suggesting the need for improvement of research methods and reporting. Copyright © 2012 Elsevier Inc. All rights reserved.

  2. Estimating the uncertainty from sampling in pollution crime investigation: The importance of metrology in the forensic interpretation of environmental data.

    Science.gov (United States)

    Barazzetti Barbieri, Cristina; de Souza Sarkis, Jorge Eduardo

    2018-07-01

    The forensic interpretation of environmental analytical data is usually challenging due to the high geospatial variability of these data. The measurement uncertainty includes contributions from sampling and from the sample handling and preparation processes. These contributions are often disregarded in the quality assurance of analytical results. A pollution crime investigation case was used to develop a methodology able to address these uncertainties in two different environmental compartments, freshwater sediments and landfill leachate. The methodology used to estimate the uncertainty was the duplicate method (which replicates predefined steps of the measurement procedure in order to assess its precision), and the parameters used to investigate the pollution were metals (Cr, Cu, Ni, and Zn) in the leachate, the suspect source, and in the sediment, the possible sink. The metal analysis results were compared to statutory limits, and it was demonstrated that Cr and Ni concentrations in sediment samples exceeded the threshold levels at all sites downstream of the pollution sources, considering the expanded uncertainty U of the measurements and a probability of contamination >0.975, at most sites. Cu and Zn concentrations were above the statutory limits at two sites, but the classification was inconclusive considering the uncertainties of the measurements. Metal analyses in leachate revealed that Cr concentrations were above the statutory limits with a probability of contamination >0.975 in all leachate ponds, while the Cu, Ni and Zn probability of contamination was below 0.025. The results demonstrated that the estimation of the sampling uncertainty, which was the dominant component of the combined uncertainty, is required for a comprehensive interpretation of environmental analysis results, particularly in forensic cases. Copyright © 2018 Elsevier B.V. All rights reserved.
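The duplicate method mentioned above can be sketched with a classical two-stage design: duplicate samples at each sampling target, duplicate analyses of each sample, and ANOVA-style variance estimates computed from the duplicate differences. The design sizes and the "true" standard deviations below are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented duplicate design: at each of 40 sampling targets, two
# samples are taken and each sample is analysed twice.
n_sites = 40
site_mean = rng.normal(100.0, 20.0, n_sites)   # between-target variation
s_sampling, s_analysis = 8.0, 3.0              # "true" sds the design should recover
samples = site_mean[:, None] + rng.normal(0, s_sampling, (n_sites, 2))
data = samples[:, :, None] + rng.normal(0, s_analysis, (n_sites, 2, 2))

# Duplicate-difference ANOVA:
# var(A1 - A2) = 2 * var_analysis
var_analysis = np.mean((data[:, :, 0] - data[:, :, 1]) ** 2) / 2
# var(mean(S1) - mean(S2)) = 2 * var_sampling + var_analysis
sample_means = data.mean(axis=2)
var_sampling = np.mean((sample_means[:, 0] - sample_means[:, 1]) ** 2) / 2 - var_analysis / 2
u_measurement = np.sqrt(var_sampling + var_analysis)

print(f"s_sampling ~ {np.sqrt(var_sampling):.1f}  s_analysis ~ {np.sqrt(var_analysis):.1f}")
print(f"expanded uncertainty U (k=2) ~ {2 * u_measurement:.1f}")
```

With numbers like these, the sampling component dominates the combined measurement uncertainty, mirroring the finding of the study above.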

  3. Statistically based uncertainty analysis for ranking of component importance in the thermal-hydraulic safety analysis of the Advanced Neutron Source Reactor

    International Nuclear Information System (INIS)

    Wilson, G.E.

    1992-01-01

    The Analytic Hierarchy Process (AHP) has been used to help determine the importance of components and phenomena in thermal-hydraulic safety analyses of nuclear reactors. The AHP results are based, in part, on expert opinion. Therefore, it is prudent to evaluate the uncertainty of the AHP ranks of importance. Prior applications have addressed uncertainty with experimental data comparisons and bounding sensitivity calculations. These methods work well when a sufficient experimental data base exists to justify the comparisons. However, in the case of limited or no experimental data, the size of the uncertainty is normally made conservatively large. Accordingly, the author has taken another approach, that of performing a statistically based uncertainty analysis. The new work is based on prior evaluations of the importance of components and phenomena in the thermal-hydraulic safety analysis of the Advanced Neutron Source Reactor (ANSR), a new facility now in the design phase. The uncertainty during large break loss of coolant and decay heat removal scenarios is estimated by assigning a probability distribution function (pdf) to the potential error in the initial expert estimates of pair-wise importance between the components. Using a Monte Carlo sampling technique, the error pdfs are propagated through the AHP software solutions to determine a pdf of uncertainty in the system-wide importance of each component. To enhance the generality of the results, a study of one other problem with a different number of elements is reported, as are the effects of a larger assumed pdf error in the expert ranks. Validation of the Monte Carlo sample size and repeatability is also documented
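The Monte Carlo perturbation of expert pairwise judgements can be sketched as follows. The 3×3 comparison matrix and the assumed lognormal judgement error are invented for the example; the study's actual problems have more elements and their own error pdfs.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented 3x3 pairwise-comparison matrix: component A judged 3x as
# important as B and 5x as important as C; B judged 2x C.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

def ahp_weights(m):
    """Priority weights from the principal eigenvector of the matrix."""
    vals, vecs = np.linalg.eig(m)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Assign an assumed lognormal error (sigma = 0.3) to each judgement,
# keep the matrix reciprocal, and propagate through the AHP solution.
draws = []
for _ in range(2000):
    upper = np.triu(A * rng.lognormal(0.0, 0.3, (3, 3)), k=1)
    m = upper + 1.0 / np.where(upper == 0, np.inf, upper).T + np.eye(3)
    draws.append(ahp_weights(m))
draws = np.asarray(draws)

for name, col in zip("ABC", draws.T):
    print(f"{name}: weight {col.mean():.2f} +/- {col.std():.2f}")
```

The spread of each weight across draws is the pdf of uncertainty in system-wide importance that the abstract describes, here for a toy hierarchy.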

  4. Identifying selectively important amino acid positions associated with alternative habitat environments in fish mitochondrial genomes.

    Science.gov (United States)

    Xia, Jun Hong; Li, Hong Lian; Zhang, Yong; Meng, Zi Ning; Lin, Hao Ran

    2018-05-01

    Fish species inhabiting seawater (SW) or freshwater (FW) habitats must develop genetic adaptations to differing environmental factors, especially salinity. The functional consequences of protein variations associated with habitat environments in fish mitochondrial genomes have not yet received much attention. We analyzed 829 complete fish mitochondrial genomes and compared the amino acid differences of 13 mitochondrial protein families between FW and SW fish groups. We identified 47 specificity determining sites (SDS) associated with FW or SW environments from 12 mitochondrial protein families. Thirty-two (68%) of the SDS sites are hydrophobic, 13 (28%) are neutral, and the remaining sites are acidic or basic. Seven of these SDS, from ND1, ND2 and ND5, were scored as probably damaging to the protein structures. Furthermore, a phylogenetic-tree-based Bayes Empirical Bayes analysis detected 63 positive sites associated with alternative habitat environments across ten mtDNA proteins. These signatures could be important for studying mitochondrial genetic variation relevant to fish physiology and ecology.

  5. PCR-RFLP Method to Identify Salmonid Species of Economic Importance

    Directory of Open Access Journals (Sweden)

    Andreea Dudu

    2011-05-01

    Full Text Available The identification of fish species by molecular methods has become necessary to avoid both the incorrect labelling of individuals involved in repopulation programmes and commercial fraud on the fish market. Fish species of great economic importance, like the salmonids, whose meat is in high demand, can be identified using molecular techniques such as PCR-RFLP. The method is based on the amplification of a target region of the genome by PCR, followed by endonuclease digestion to detect the polymorphism of restriction fragments. In our study we analysed the following salmonid species from Romania: Salmo trutta fario, Salmo labrax, Salvelinus fontinalis, Onchorhynchus mykiss, Thymallus thymallus and Hucho hucho. In order to discriminate between the analysed species, we amplified a fragment of the mitochondrial genome comprising tRNAGlu/ cytochrome b/ tRNAThr/ tRNAPro/ D-loop/ tRNAPhe, followed by digestion with a specific restriction enzyme. The direct digestion of unpurified PCR products generated species-specific restriction patterns and proved to be a simple, reliable, inexpensive and fast method. Thus, it may be successfully utilized in specialized laboratories for the correct identification of fish species for multiple purposes, including the traceability of fish food products.

  6. Identifying Important Atlantic Areas for the conservation of Balearic shearwaters: Spatial overlap with conservation areas

    Science.gov (United States)

    Pérez-Roda, Amparo; Delord, Karine; Boué, Amélie; Arcos, José Manuel; García, David; Micol, Thierry; Weimerskirch, Henri; Pinaud, David; Louzao, Maite

    2017-07-01

    Marine protected areas (MPAs) are considered one of the main tools in both fisheries and conservation management to protect threatened species and their habitats around the globe. However, MPAs are underrepresented in marine environments compared to terrestrial environments. Within this context, we studied the Atlantic non-breeding distribution of the southern population of Balearic shearwaters (Puffinus mauretanicus) breeding in Eivissa during the 2011-2012 period based on global location sensing (GLS) devices. Our objectives were (1) to identify overall Important Atlantic Areas (IAAs) for a southern population, (2) to describe spatio-temporal patterns of oceanographic habitat use, and (3) to assess whether existing conservation areas (Natura 2000 sites and marine Important Bird Areas (IBAs)) cover the main IAAs of Balearic shearwaters. Our results highlighted that the Atlantic staging dynamics (from June to October 2011) of the southern population were driven by individual segregation at both spatial and temporal scales. Individuals ranged in the North-East Atlantic over four main IAAs (Bay of Biscay: BoB, Western Iberian shelf: WIS, Gulf of Cadiz: GoC, West of Morocco: WoM). While most individuals spent more time on the WIS or in the GoC, a small number of birds visited IAAs at the extremes of their Atlantic distribution range (i.e., BoB and WoM). The chronology of the arrivals to the IAAs showed a latitudinal gradient, with northern areas reached earlier during the Atlantic staging. The IAAs coincided with the most productive areas (higher chlorophyll a values) in the NE Atlantic between July and October. The spatial overlap between IAAs and conservation areas was higher for Natura 2000 sites than marine IBAs (areas with and without legal protection, respectively). Concerning the use of these areas, a slightly higher proportion of estimated positions fell within marine IBAs compared to designated Natura 2000 sites, with Spanish and Portuguese conservation

  7. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
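One concrete instance of an entropic uncertainty relation, of the kind this abstract generalises, is the Maassen-Uffink bound for a qubit measured in the Z and X bases, which a few lines of linear algebra can verify numerically; the state below is arbitrary.

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 1e-12]
    return -(p * np.log2(p)).sum()

# An arbitrary qubit state |psi> = cos(theta)|0> + sin(theta)|1>.
theta = 0.3
psi = np.array([np.cos(theta), np.sin(theta)])

z_basis = np.eye(2)                              # computational basis
x_basis = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard basis

p_z = np.abs(z_basis @ psi) ** 2                 # outcome probabilities
p_x = np.abs(x_basis @ psi) ** 2

# Maassen-Uffink relation: H(Z) + H(X) >= -log2 max |<z_i|x_j>|^2,
# which equals 1 bit for these two mutually unbiased bases.
total = shannon(p_z) + shannon(p_x)
print(f"H(Z) + H(X) = {total:.3f} >= 1")
```

Replacing the two Shannon entropies with other uncertainty measures is exactly the kind of generalisation the measure-independent framework above formalises.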

  8. Directed International Technological Change and Climate Policy: New Methods for Identifying Robust Policies Under Conditions of Deep Uncertainty

    Science.gov (United States)

    Molina-Perez, Edmundo

    It is widely recognized that international environmental technological change is key to reduce the rapidly rising greenhouse gas emissions of emerging nations. In 2010, the United Nations Framework Convention on Climate Change (UNFCCC) Conference of the Parties (COP) agreed to the creation of the Green Climate Fund (GCF). This new multilateral organization has been created with the collective contributions of COP members, and has been tasked with directing over USD 100 billion per year towards investments that can enhance the development and diffusion of clean energy technologies in both advanced and emerging nations (Helm and Pichler, 2015). The landmark agreement arrived at the COP 21 has reaffirmed the key role that the GCF plays in enabling climate mitigation as it is now necessary to align large scale climate financing efforts with the long-term goals agreed at Paris 2015. This study argues that because of the incomplete understanding of the mechanics of international technological change, the multiplicity of policy options and ultimately the presence of climate and technological change deep uncertainty, climate financing institutions such as the GCF, require new analytical methods for designing long-term robust investment plans. Motivated by these challenges, this dissertation shows that the application of new analytical methods, such as Robust Decision Making (RDM) and Exploratory Modeling (Lempert, Popper and Bankes, 2003) to the study of international technological change and climate policy provides useful insights that can be used for designing a robust architecture of international technological cooperation for climate change mitigation. For this study I developed an exploratory dynamic integrated assessment model (EDIAM) which is used as the scenario generator in a large computational experiment. The scope of the experimental design considers an ample set of climate and technological scenarios. These scenarios combine five sources of uncertainty

  9. Uncertainty Quantification Reveals the Importance of Data Variability and Experimental Design Considerations for in Silico Proarrhythmia Risk Assessment

    Directory of Open Access Journals (Sweden)

    Kelly C. Chang

    2017-11-01

    Full Text Available The Comprehensive in vitro Proarrhythmia Assay (CiPA) is a global initiative intended to improve drug proarrhythmia risk assessment using a new paradigm of mechanistic assays. Under the CiPA paradigm, the relative risk of drug-induced Torsade de Pointes (TdP) is assessed using an in silico model of the human ventricular action potential (AP) that integrates in vitro pharmacology data from multiple ion channels. Thus, modeling predictions of cardiac risk liability will depend critically on the variability in pharmacology data, and uncertainty quantification (UQ) must comprise an essential component of the in silico assay. This study explores UQ methods that may be incorporated into the CiPA framework. Recently, we proposed a promising in silico TdP risk metric (qNet), which is derived from AP simulations and allows separation of a set of CiPA training compounds into Low, Intermediate, and High TdP risk categories. The purpose of this study was to use UQ to evaluate the robustness of TdP risk separation by qNet. Uncertainty in the model parameters used to describe drug binding and ionic current block was estimated using the non-parametric bootstrap method and a Bayesian inference approach. Uncertainty was then propagated through AP simulations to quantify uncertainty in qNet for each drug. UQ revealed lower uncertainty and more accurate TdP risk stratification by qNet when simulations were run at concentrations below 5× the maximum therapeutic exposure (Cmax). However, when drug effects were extrapolated above 10× Cmax, UQ showed that qNet could no longer clearly separate drugs by TdP risk. This was because for most of the pharmacology data, the amount of current block measured was <60%, preventing reliable estimation of IC50 values. The results of this study demonstrate that the accuracy of TdP risk prediction depends both on the intrinsic variability in ion channel pharmacology data as well as on experimental design considerations that preclude an
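
    The bootstrap step described in this record can be illustrated with a minimal sketch. Everything below is synthetic and hypothetical (the concentrations, noise level, and replicate count are invented, and qNet itself is not simulated); it only shows a non-parametric bootstrap applied to a Hill-equation IC50 fit, whose output distribution would then be propagated through the AP simulations:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def hill(c, log_ic50, h):
    """Fractional current block at concentration c (Hill equation)."""
    return 1.0 / (1.0 + (np.exp(log_ic50) / c) ** h)

# Synthetic patch-clamp data: true IC50 = 5 uM, Hill coefficient = 1,
# 30 replicate "cells" with additive measurement noise.
conc = np.array([0.3, 1.0, 3.0, 10.0, 30.0])  # uM
data = hill(conc, np.log(5.0), 1.0) + rng.normal(0.0, 0.05, size=(30, conc.size))

# Non-parametric bootstrap: resample cells with replacement and refit each time.
ic50_samples = []
for _ in range(500):
    idx = rng.integers(0, data.shape[0], size=data.shape[0])
    popt, _ = curve_fit(hill, conc, data[idx].mean(axis=0),
                        p0=[0.0, 1.0], maxfev=10000)
    ic50_samples.append(np.exp(popt[0]))

# The bootstrap distribution of IC50 is what carries parameter uncertainty
# forward into any downstream risk metric.
lo, hi = np.percentile(ic50_samples, [2.5, 97.5])
print(f"IC50 95% bootstrap interval: [{lo:.2f}, {hi:.2f}] uM")
```

    Fitting log(IC50) rather than IC50 keeps the parameter positive during optimization, which matters when, as the abstract notes, measured block stays below 60% and the fit is poorly constrained.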

  10. Competence Description for Personal Recommendations: The Importance of Identifying the Complexity of Learning and Performance Situations

    Science.gov (United States)

    Prins, Frans J.; Nadolski, Rob J.; Berlanga, Adriana J.; Drachsler, Hendrik; Hummel, Hans G. K.; Koper, Rob

    2008-01-01

    For competences development of learners and professionals, target competences and corresponding competence development opportunities have to be identified. Personal Recommender Systems (PRS) provide personal recommendations for learners aimed at finding and selecting learning activities that best match their needs. This article argues that a…

  11. Identifying Important Career Indicators of Undergraduate Geoscience Students Upon Completion of Their Degree

    Science.gov (United States)

    Wilson, C. E.; Keane, C. M.; Houlton, H. R.

    2012-12-01

    The American Geosciences Institute (AGI) decided to create the National Geoscience Student Exit Survey in order to identify the initial pathways into the workforce for graduating students, as well as to assess their preparedness for entering the workforce upon graduation. The creation of this survey stemmed from a combination of experiences with the AGI/AGU Survey of Doctorates and discussions at the following Science Education Research Center (SERC) workshops: "Developing Pathways to Strong Programs for the Future", "Strengthening Your Geoscience Program", and "Assessing Geoscience Programs". These events identified distinct gaps in understanding the experiences and perspectives of geoscience students during one of their most profound professional transitions. The idea for the survey therefore arose as a way to evaluate how the discipline is preparing and educating students, as well as to identify the students' desired career paths. The discussions at the workshops solidified the need for this survey and created the initial framework for its first pilot. The purpose of this assessment tool is to evaluate student preparedness for entering the geosciences workforce; identify student decision points for entering geosciences fields and remaining in the geosciences workforce; identify geosciences fields that students pursue in undergraduate and graduate school; collect information on students' expected career trajectories and geosciences professions; identify geosciences career sectors that are hiring new graduates; collect information about salary projections; assess the overall effectiveness of geosciences departments regionally and nationally; demonstrate the value of geosciences degrees to future students, the institutions, and employers; and establish a benchmark to perform longitudinal studies of geosciences graduates to understand their career pathways and the impacts of their educational experiences on these decisions.
AGI's Student Exit Survey went through

  12. Genome-wide association study identified copy number variants important for appendicular lean mass.

    Science.gov (United States)

    Ran, Shu; Liu, Yong-Jun; Zhang, Lei; Pei, Yufang; Yang, Tie-Lin; Hai, Rong; Han, Ying-Ying; Lin, Yong; Tian, Qing; Deng, Hong-Wen

    2014-01-01

    Skeletal muscle is a major component of the human body. Age-related loss of muscle mass and function contributes to some public health problems such as sarcopenia and osteoporosis. Skeletal muscle, mainly composed of appendicular lean mass (ALM), is a heritable trait. Copy number variation (CNV) is a common type of human genome variant which may play an important role in the etiology of many human diseases. In this study, we performed genome-wide association analyses of CNV for ALM in 2,286 Caucasian subjects. We then replicated the major findings in 1,627 Chinese subjects. Two CNVs, CNV1191 and CNV2580, were detected to be associated with ALM (p = 2.26×10⁻² and 3.34×10⁻³, respectively). In the Chinese replication sample, the two CNVs achieved p-values of 3.26×10⁻² and 0.107, respectively. CNV1191 covers a gene, GTPase of the immunity-associated protein family (GIMAP1), which is important for skeletal muscle cell survival/death in humans. CNV2580 is located in the Serine hydrolase-like protein (SERHL) gene, which plays an important role in normal peroxisome function and skeletal muscle growth in response to mechanical stimuli. In summary, our study suggested two novel CNVs and the related genes that may contribute to variation in ALM.

  13. Genome-wide association study identified copy number variants important for appendicular lean mass.

    Directory of Open Access Journals (Sweden)

    Shu Ran

    Full Text Available Skeletal muscle is a major component of the human body. Age-related loss of muscle mass and function contributes to some public health problems such as sarcopenia and osteoporosis. Skeletal muscle, mainly composed of appendicular lean mass (ALM), is a heritable trait. Copy number variation (CNV) is a common type of human genome variant which may play an important role in the etiology of many human diseases. In this study, we performed genome-wide association analyses of CNV for ALM in 2,286 Caucasian subjects. We then replicated the major findings in 1,627 Chinese subjects. Two CNVs, CNV1191 and CNV2580, were detected to be associated with ALM (p = 2.26×10⁻² and 3.34×10⁻³, respectively). In the Chinese replication sample, the two CNVs achieved p-values of 3.26×10⁻² and 0.107, respectively. CNV1191 covers a gene, GTPase of the immunity-associated protein family (GIMAP1), which is important for skeletal muscle cell survival/death in humans. CNV2580 is located in the Serine hydrolase-like protein (SERHL) gene, which plays an important role in normal peroxisome function and skeletal muscle growth in response to mechanical stimuli. In summary, our study suggested two novel CNVs and the related genes that may contribute to variation in ALM.

  14. Identifying faecal impaction is important for ensuring the timely diagnosis of childhood functional constipation

    DEFF Research Database (Denmark)

    Modin, Line; Walsted, Anne-Mette; Jakobsen, Marianne Skytte

    2015-01-01

    AIM: Most research on functional constipation has been carried out at a tertiary level. We focused this study on a secondary-level hospital outpatients' department, assessing the distribution of diagnostic criteria for childhood functional constipation and evaluating the consequences of current...... diagnostic practice based on current guidelines. METHODS: We enrolled 235 children, aged two to 16 years of age, with functional constipation according to the Rome III criteria and assessed them using medical histories and physical examinations, including rectal examinations and ultrasound measurements...... the timely diagnosis of childhood functional constipation at the secondary care level. Ultrasound examination proved a reliable alternative to rectal examination or abdominal radiography when identifying faecal impaction....

  15. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Science.gov (United States)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design versus a design of well understood heritage equipment would be greater. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods, such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty, are rendered obsolete, since sensitivity to uncertainty changes is not reflected in propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  16. A case study identifying the importance of land use planning in road safety: Benidorm

    Energy Technology Data Exchange (ETDEWEB)

    Casares Blanco, J.; Sanchez Galiano, J.C.; Fernandez Aracil, P.; Ortuño Padilla, A.

    2016-07-01

    This research analyses how urban form, land use and urban density may influence the incidence of traffic-related crash injuries and deaths. It begins with a theoretical overview of studies dealing with the relationship between urban patterns and road safety. Next, it details the development of a database of crash incidence and urban form at the district level for the city of Benidorm (Alicante, Spain) in 2010. Subsequently, a negative binomial regression approach is developed for intra-city motor vehicle crash analysis. One-year crash data for Benidorm (the fourth largest tourism destination of Spain, after Barcelona, Madrid and San Bartolomé de Tirajana, and an exclusively tourist-oriented city) are analyzed using a geographic information system (GIS) to generate relevant inputs for the analysis. In general, the study finds that a strong land use mix results in fewer road accidents, whereas accidents are more common but less severe in areas of high urban density. Finally, the pedestrian accident analysis showed that rural and low-density environments are associated with substantial numbers of road accidents, unlike tourism-oriented zones, which are much safer for pedestrians. Based on these findings, the paper discusses the implications for urban design practice. (Author)
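
    The negative binomial model referred to in this record can be sketched as follows. The district-level data, covariate names, and coefficients below are synthetic assumptions for illustration, not the Benidorm dataset; the sketch fits an NB2 regression with a log link by maximizing the likelihood directly with SciPy:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(1)

# Hypothetical district-level data: crash counts whose mean rises with
# urban density and falls with land-use mix.
n = 300
density = rng.uniform(0.0, 1.0, n)
mix = rng.uniform(0.0, 1.0, n)
mu_true = np.exp(1.0 + 0.8 * density - 0.6 * mix)
r_true = 2.0  # overdispersion: alpha = 1/r = 0.5
y = rng.negative_binomial(r_true, r_true / (r_true + mu_true))

X = np.column_stack([np.ones(n), density, mix])

def nb_negloglik(params):
    """Negative binomial (NB2) negative log-likelihood with a log link."""
    beta, alpha = params[:3], np.exp(params[3])  # alpha kept positive
    eta = np.clip(X @ beta, -20.0, 20.0)         # guard against overflow
    mu, r = np.exp(eta), 1.0 / alpha
    ll = (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
          + r * np.log(r / (r + mu)) + y * (eta - np.log(r + mu)))
    return -ll.sum()

res = minimize(nb_negloglik, x0=np.zeros(4), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-8})
b0, b1, b2 = res.x[:3]
print(f"intercept={b0:.2f}  density={b1:.2f}  land-use mix={b2:.2f}")
```

    A negative binomial is preferred over a Poisson here because crash counts across districts are typically overdispersed; the extra dispersion parameter absorbs the variance the Poisson cannot.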

  17. Spider Transcriptomes Identify Ancient Large-Scale Gene Duplication Event Potentially Important in Silk Gland Evolution.

    Science.gov (United States)

    Clarke, Thomas H; Garb, Jessica E; Hayashi, Cheryl Y; Arensburger, Peter; Ayoub, Nadia A

    2015-06-08

    The evolution of specialized tissues with novel functions, such as the silk synthesizing glands in spiders, is likely an influential driver of adaptive success. Large-scale gene duplication events and subsequent paralog divergence are thought to be required for generating evolutionary novelty. Such an event has been proposed for spiders, but not tested. We de novo assembled transcriptomes from three cobweb weaving spider species. Based on phylogenetic analyses of gene families with representatives from each of the three species, we found numerous duplication events indicative of a whole genome or segmental duplication. We estimated the age of the gene duplications relative to several speciation events within spiders and arachnids and found that the duplications likely occurred after the divergence of scorpions (order Scorpionida) and spiders (order Araneae), but before the divergence of the spider suborders Mygalomorphae and Araneomorphae, near the evolutionary origin of spider silk glands. Transcripts that are expressed exclusively or primarily within black widow silk glands are more likely to have a paralog descended from the ancient duplication event and have elevated amino acid replacement rates compared with other transcripts. Thus, an ancient large-scale gene duplication event within the spider lineage was likely an important source of molecular novelty during the evolution of silk gland-specific expression. This duplication event may have provided genetic material for subsequent silk gland diversification in the true spiders (Araneomorphae). © The Author(s) 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  18. Promoting wellbeing in young unemployed adults: the importance of identifying meaningful patterns of time use.

    Science.gov (United States)

    Scanlan, Justin Newton; Bundy, Anita C; Matthews, Lynda R

    2011-04-01

    This study set out to explore the differences in time use between 'unemployed', 'unemployed but in education' and part-time and full-time employed 18- to 25-year-old Australians. Unemployed individuals generally experience poor health and this may be related to the way they use their time. Activity-based interventions may be one health-promoting strategy. This knowledge is important for all occupational therapists, as many service users are likely to be unemployed. Time use of unemployed 18- to 25-year-olds (measured using the Modified Occupational Questionnaire) was compared with the time use of part- and full-time employed 18- to 25-year-olds (from the 2006 Australian Time Use Survey). Individuals in the 'unemployed' groups spent significantly less time engaged in work-related activities than their employed peers. This time was reallocated mainly to recreation and leisure and household work (for both men and women) and child care and sleeping (women only). Recreation and leisure activities were generally passive, home-based activities such as watching television or 'doing nothing'. Individuals in the 'unemployed but in education' groups also spent less time in employment-related activities, but the majority of this time was reallocated to education activities. Individuals in the 'unemployed' groups spent large amounts of time engaged in potentially non-directed use of time (e.g. watching television or 'doing nothing'). Such patterns of time use have previously been associated with poor health. To support the health of unemployed individuals more effectively, occupational therapy interventions must focus on enhancing the quality of time use for this population. © 2010 The Authors. Australian Occupational Therapy Journal © 2010 Australian Association of Occupational Therapists.

  19. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  20. Use of amplified fragment length polymorphism analysis to identify medically important Candida spp., including C-dubliniensis

    NARCIS (Netherlands)

    Borst, A; Theelen, B; Reinders, E; Boekhout, T; Fluit, AC; Savelkoul, PHM

    Non-Candida albicans Candida species are increasingly being isolated. These species show differences in levels of resistance to antimycotic agents and mortality. Therefore, it is important to be able to correctly identify the causative organism to the species level. Identification of C. dubliniensis

  1. The hierarchy-by-interval approach to identifying important models that need improvement in severe-accident simulation codes

    International Nuclear Information System (INIS)

    Heames, T.J.; Khatib-Rahbar, M.; Kelly, J.E.

    1995-01-01

    The hierarchy-by-interval (HBI) methodology was developed to determine an appropriate phenomena identification and ranking table for an independent peer review of severe-accident computer codes. The methodology is described, and the results of a specific code review are presented. Use of this systematic and structured approach ensures that important code models that need improvement are identified and prioritized, which allows code sponsors to more effectively direct limited resources in future code development. In addition, critical phenomenological areas that need more fundamental work, such as experimentation, are identified.

  2. Imaging-Based Screen Identifies Laminin 411 as a Physiologically Relevant Niche Factor with Importance for i-Hep Applications

    Directory of Open Access Journals (Sweden)

    John Ong

    2018-03-01

    Full Text Available Summary: Use of hepatocytes derived from induced pluripotent stem cells (i-Heps) is limited by their functional differences in comparison with primary cells. Extracellular niche factors likely play a critical role in bridging this gap. Using image-based characterization (high content analysis; HCA) of freshly isolated hepatocytes from 17 human donors, we devised and validated an algorithm (Hepatocyte Likeness Index; HLI) for comparing the hepatic properties of cells against a physiological gold standard. The HLI was then applied in a targeted screen of extracellular niche factors to identify substrates driving i-Heps closer to the standard. Laminin 411, the top hit, was validated in two additional induced pluripotent stem cell (iPSC) lines, primary tissue, and an in vitro model of α1-antitrypsin deficiency. Cumulatively, these data provide a reference method to control and screen for i-Hep differentiation, identify Laminin 411 as a key niche protein, and underscore the importance of combining substrates, soluble factors, and HCA when developing iPSC applications. : Rashid and colleagues demonstrate the utility of a high-throughput imaging platform for identification of physiologically relevant extracellular niche factors to advance i-Heps closer to their primary tissue counterparts. The extracellular matrix (ECM) protein screen identified Laminin 411 as an important niche factor facilitating i-Hep-based disease modeling in vitro. Keywords: iPS hepatocytes, extracellular niche, image-based screening, disease modeling, laminin

  3. NRC Information No. 90-01: Importance of proper response to self-identified violations by licensees

    International Nuclear Information System (INIS)

    Cunningham, R.E.

    1992-01-01

    NRC expects a high standard of compliance by its licensees and requires that licensees provide NRC accurate and complete information and that required records will also be complete and accurate in all material respects. Licensees should be aware of the importance placed by NRC on licensee programs for self detection, correction and reporting of violations or errors related to regulatory requirements. The General Statement of Policy and Procedures for NRC Enforcement Actions in Appendix C to 10 CFR Part 2 underscores the importance of licensees responding promptly and properly to self-identified violations in two ways. It is suggested that when a licensee identifies a violation involving an NRC-required record, the licensee should make a dated notation indicating identification, either on the record itself or other appropriate documentation retrievable for NRC review. The record with the self-identified violation noted should not be altered in any way to mask the correction. The licensee should determine the cause of the violation, correct the root cause of the violation, and document such findings in an appropriate manner. Licensees should also assure that if a report of the violation is required, the report is submitted to NRC in a timely manner. These actions will be considered by NRC in making any enforcement decision, and generally lead to lesser or no civil penalty

  4. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 1: Review and comparison of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.
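
    A minimal sketch of the first three of these procedures on one synthetic input-output scatterplot (the toy relationship is an assumption, not the two-phase flow model): a nonmonotonic dependence is nearly invisible to the correlation tests (i) and (ii) but is caught by the Kruskal-Wallis test (iii) on binned data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# One input-output scatterplot from a hypothetical Monte Carlo sample:
# y depends on x nonmonotonically, plus noise.
x = rng.uniform(0.0, 1.0, 300)
y = (x - 0.5) ** 2 + rng.normal(0.0, 0.02, 300)

# (i) linear relationship: Pearson correlation coefficient
r_lin, _ = stats.pearsonr(x, y)
# (ii) monotonic relationship: Spearman rank correlation coefficient
r_rank, _ = stats.spearmanr(x, y)
# (iii) trend in central tendency: Kruskal-Wallis across five x-bins
bins = np.digitize(x, np.quantile(x, [0.2, 0.4, 0.6, 0.8]))
_, kw_p = stats.kruskal(*[y[bins == b] for b in range(5)])

print(f"pearson={r_lin:.2f}  spearman={r_rank:.2f}  kruskal-wallis p={kw_p:.1e}")
```

    Procedures (iv) and (v) follow the same binning pattern, comparing variances (or grid-cell counts against a chi-square statistic) across the x-bins instead of central tendency.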

  5. Treatment of Non-Small Cell Lung Cancer Patients With Proton Beam-Based Stereotactic Body Radiotherapy: Dosimetric Comparison With Photon Plans Highlights Importance of Range Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Seco, Joao, E-mail: jseco@partners.org [Department of Radiation Oncology, Harvard Medical School and Massachusetts General Hospital, Boston, MA (United States); Panahandeh, Hamid Reza [Department of Radiation Oncology, Harvard Medical School and Massachusetts General Hospital, Boston, MA (United States); Westover, Kenneth [Department of Radiation Oncology, Harvard Medical School and Massachusetts General Hospital, Boston, MA (United States); Harvard Radiation Oncology Program, Harvard Medical School, Boston, MA (United States); Adams, Judith; Willers, Henning [Department of Radiation Oncology, Harvard Medical School and Massachusetts General Hospital, Boston, MA (United States)

    2012-05-01

    Purpose: Proton beam radiotherapy has been proposed for use in stereotactic body radiotherapy (SBRT) for early-stage non-small-cell lung cancer. In the present study, we sought to analyze how the range uncertainties for protons might affect its therapeutic utility for SBRT. Methods and Materials: Ten patients with early-stage non-small-cell lung cancer received SBRT with two to three proton beams. The patients underwent repeat planning for photon SBRT, and the dose distributions to the normal and tumor tissues were compared with the proton plans. The dosimetric comparisons were performed within an operational definition of high- and low-dose regions representing volumes receiving >50% and <50% of the prescription dose, respectively. Results: In high-dose regions, the average volume receiving ≥95% of the prescription dose was larger for proton than for photon SBRT (i.e., 46.5 cm³ vs. 33.5 cm³; p = .009, respectively). The corresponding conformity indexes were 2.46 and 1.56. For tumors in close proximity to the chest wall, the chest wall volume receiving ≥30 Gy was 7 cm³ larger for protons than for photons (p = .06). In low-dose regions, the lung volume receiving ≥5 Gy and maximum esophagus dose were smaller for protons than for photons (p = .019 and p < .001, respectively). Conclusions: Protons generate larger high-dose regions than photons because of range uncertainties. This can result in nearby healthy organs (e.g., chest wall) receiving close to the prescription dose, at least when two to three beams are used, such as in our study. Therefore, future research should explore the benefit of using more than three beams to reduce the dose to nearby organs. Additionally, clinical subgroups should be identified that will benefit from proton SBRT.

  6. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 2: robustness of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analyses include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples.
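
    The stability check described here, comparing variable-importance rankings across independent Latin hypercube samples, can be sketched with a toy model; the model, dimensionality, and sample sizes below are illustrative assumptions, not the fluid-flow code:

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

def model(u):
    """Toy stand-in for the simulation model: input 0 dominates the output."""
    return 3.0 * u[:, 0] + 0.5 * u[:, 1] ** 2 + 0.1 * np.sin(8.0 * u[:, 2])

sampler = qmc.LatinHypercube(d=3, seed=3)
orders = []
for _ in range(2):                        # two independent Latin hypercube samples
    u = sampler.random(n=100)             # each call draws a fresh LHS design
    out = model(u)
    rho = [abs(stats.spearmanr(u[:, j], out)[0]) for j in range(3)]
    orders.append(np.argsort(rho)[::-1])  # variables sorted by rank-correlation importance

print("importance order, sample 1:", orders[0])
print("importance order, sample 2:", orders[1])
```

    Agreement between the two orderings, especially for the top-ranked variables, is the stability property the abstract reports; disagreement flags either an underpowered sample or a Type I artifact.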

  7. Crowd-sourced Ontology for Photoleukocoria: Identifying Common Internet Search Terms for a Potentially Important Pediatric Ophthalmic Sign.

    Science.gov (United States)

    Staffieri, Sandra E; Kearns, Lisa S; Sanfilippo, Paul G; Craig, Jamie E; Mackey, David A; Hewitt, Alex W

    2018-02-01

    Leukocoria is the most common presenting sign of pediatric eye disease, including retinoblastoma and cataract, with worse outcomes if diagnosis is delayed. We investigated whether individuals could identify leukocoria in photographs (photoleukocoria) and examined their subsequent Internet search behavior. In this cross-sectional study, we used a web-based questionnaire to invite adults aged over 18 years to view two photographs of a child with photoleukocoria and then search the Internet to determine a possible diagnosis and action plan. The most commonly used search terms and websites accessed were recorded. The questionnaire was completed by 1639 individuals; Facebook advertisement was the most effective recruitment strategy. The mean age of respondents was 38.95 ± 14.59 years (range, 18-83); 94% were female and 59.3% had children. An abnormality in the images presented was identified by 1613 (98.4%) participants. The most commonly used search terms were "white," "pupil," "photo," and "eye," reaching a variety of appropriate websites or links to print or social media articles. Different words or phrases were used to describe the same observation of photoleukocoria, leading to a range of websites. Variations in the description of observed signs and search words influenced the sites reached, the information obtained, and subsequent help-seeking intentions. Identifying the most commonly used search terms for photoleukocoria is an important step for search engine optimization. Being directed to the most appropriate websites, informing of the significance of photoleukocoria and the appropriate actions to take, could reduce delays in the diagnosis of important pediatric eye diseases such as retinoblastoma and cataract.

  8. Identifying and Characterizing Important Trembling Aspen Competitors with Juvenile Lodgepole Pine in Three South-Central British Columbia Ecosystems

    Directory of Open Access Journals (Sweden)

    Teresa A. Newsome

    2012-01-01

    Full Text Available Critical height ratios for predicting competition between trembling aspen and lodgepole pine were identified in six juvenile stands in three south-central British Columbia ecosystems. We used a series of regression analyses predicting pine stem diameter from the density of neighbouring aspen in successively shorter relative height classes to identify the aspen-pine height ratio that maximized R2. Critical height ratios varied widely among sites when stands were 8–12 years old but, by age 14–19, had converged at 1.25–1.5. Maximum R2 values at age 14–19 ranged from 13.4% to 69.8%, demonstrating that the importance of aspen competition varied widely across a relatively small geographic range. Logistic regression also indicated that the risk of poor pine vigour in the presence of aspen varied between sites. Generally, the degree of competition, risk to pine vigour, and size of individual aspen contributing to the models declined along a gradient of decreasing ecosystem productivity.

  9. Use of DNA sequences to identify forensically important fly species and their distribution in the coastal region of Central California.

    Science.gov (United States)

    Nakano, Angie; Honda, Jeff

    2015-08-01

    Forensic entomology has gained prominence in recent years, as improvements in DNA technology and molecular methods have allowed insect and other arthropod evidence to become increasingly useful in criminal and civil investigations. However, comprehensive faunal inventories are still needed, including cataloging local DNA sequences for forensically significant Diptera. This multi-year fly-trapping study built upon and expanded a previous survey of these flies in Santa Clara County, including the addition of genetic barcoding data from collected species of flies. Flies from the families Calliphoridae, Sarcophagidae, and Muscidae were trapped in meat-baited traps set in a variety of locations throughout the county. Flies were identified using morphological features and confirmed by molecular analysis. A total of 16 calliphorid species, 11 sarcophagid species, and four muscid species were collected and differentiated. This study found more species of flies than previous area surveys and established new county records for two calliphorid species: Cynomya cadaverina and Chrysomya rufifacies. Differences were found in fly fauna in different areas of the county, indicating the importance of microclimates in the distribution of these flies. Molecular analysis supported the use of DNA barcoding as an effective method of identifying cryptic fly species. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Combining Methods to Describe Important Marine Habitats for Top Predators: Application to Identify Biological Hotspots in Tropical Waters.

    Science.gov (United States)

    Thiers, Laurie; Louzao, Maite; Ridoux, Vincent; Le Corre, Matthieu; Jaquemet, Sébastien; Weimerskirch, Henri

    2014-01-01

    In tropical waters resources are usually scarce and patchy, and predatory species generally show specific adaptations for foraging. Tropical seabirds often forage in association with sub-surface predators that create feeding opportunities by bringing prey close to the surface, and the birds often aggregate in large multispecific flocks. Here we hypothesize that frigatebirds, a tropical seabird adapted to foraging with low energetic costs, could be a good predictor of the distribution of their associated predatory species, including other seabirds (e.g. boobies, terns) and subsurface predators (e.g., dolphins, tunas). To test this hypothesis, we compared distribution patterns of marine predators in the Mozambique Channel based on a long-term dataset of both vessel- and aerial surveys, as well as tracking data of frigatebirds. By developing species distribution models (SDMs), we identified key marine areas for tropical predators in relation to contemporaneous oceanographic features to investigate multi-species spatial overlap areas and identify predator hotspots in the Mozambique Channel. SDMs reasonably matched observed patterns and both static (e.g. bathymetry) and dynamic (e.g. Chlorophyll a concentration and sea surface temperature) factors were important in explaining predator distribution patterns. We found that the distribution of frigatebirds included the distributions of the associated species. The central part of the channel appeared to be the best habitat for the four groups of species considered in this study (frigatebirds, brown terns, boobies and sub-surface predators).

  11. Combining Methods to Describe Important Marine Habitats for Top Predators: Application to Identify Biological Hotspots in Tropical Waters.

    Directory of Open Access Journals (Sweden)

    Laurie Thiers

    Full Text Available In tropical waters resources are usually scarce and patchy, and predatory species generally show specific adaptations for foraging. Tropical seabirds often forage in association with sub-surface predators that create feeding opportunities by bringing prey close to the surface, and the birds often aggregate in large multispecific flocks. Here we hypothesize that frigatebirds, a tropical seabird adapted to foraging with low energetic costs, could be a good predictor of the distribution of their associated predatory species, including other seabirds (e.g. boobies, terns) and subsurface predators (e.g., dolphins, tunas). To test this hypothesis, we compared distribution patterns of marine predators in the Mozambique Channel based on a long-term dataset of both vessel- and aerial surveys, as well as tracking data of frigatebirds. By developing species distribution models (SDMs), we identified key marine areas for tropical predators in relation to contemporaneous oceanographic features to investigate multi-species spatial overlap areas and identify predator hotspots in the Mozambique Channel. SDMs reasonably matched observed patterns and both static (e.g. bathymetry) and dynamic (e.g. Chlorophyll a concentration and sea surface temperature) factors were important in explaining predator distribution patterns. We found that the distribution of frigatebirds included the distributions of the associated species. The central part of the channel appeared to be the best habitat for the four groups of species considered in this study (frigatebirds, brown terns, boobies and sub-surface predators).

  12. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, yielding estimates of some true value plus an uncertainty, and they are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from the calibration standards' matrix). Consequently, there are many sources of error in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations made for improving measurement uncertainties.
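    As a rough illustration of the kind of uncertainty budget this record discusses, the sketch below combines independent standard uncertainties by root-sum-square, as in the GUM; the component names and values are hypothetical, not from the paper.

    ```python
    import math

    # Hypothetical uncertainty budget for a single chemistry measurement:
    # each component is a standard uncertainty in percent of the measured value.
    budget = {
        "calibration standard": 0.20,   # uncertainty of the known reference
        "balance / volumetrics": 0.10,
        "instrument repeatability": 0.30,
        "matrix correction": 0.25,      # penalty when the unknown's matrix differs
    }

    # Combined standard uncertainty: root-sum-square of independent components.
    u_c = math.sqrt(sum(u**2 for u in budget.values()))

    # Expanded uncertainty with coverage factor k=2 (~95% confidence).
    U = 2.0 * u_c
    print(f"combined u_c = {u_c:.3f}%, expanded U (k=2) = {U:.3f}%")  # 0.450%, 0.900%
    ```

    Note how the combined value is dominated by the largest components (repeatability and matrix correction here), which is why identifying the important uncertainty sources matters more than refining the small ones.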

  13. Uncertainty of Modal Parameters Estimated by ARMA Models

    DEFF Research Database (Denmark)

    Jensen, Jakob Laigaard; Brincker, Rune; Rytter, Anders

    In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However, the uncertainty of the param...
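    Setting the uncertainty question aside, the mapping from estimated ARMA poles to eigenfrequencies and damping ratios can be illustrated with the standard discrete-to-continuous pole transformation; this is a textbook identity, not code from the paper, and the 5 Hz / 2% numbers are invented for the check.

    ```python
    import cmath, math

    # An ARMA pole z estimated from data sampled at interval T corresponds to a
    # continuous-time pole lambda = ln(z)/T = -zeta*omega_n + i*omega_n*sqrt(1-zeta^2).
    def modal_from_pole(z, T):
        lam = cmath.log(z) / T
        omega_n = abs(lam)                    # natural angular frequency (rad/s)
        zeta = -lam.real / omega_n            # damping ratio
        return omega_n / (2 * math.pi), zeta  # eigenfrequency (Hz), damping ratio

    # Synthetic check: a 5 Hz mode with 2% damping sampled at 100 Hz.
    T = 0.01
    omega_n = 2 * math.pi * 5.0
    zeta = 0.02
    lam = complex(-zeta * omega_n, omega_n * math.sqrt(1 - zeta**2))
    z = cmath.exp(lam * T)
    f_hat, zeta_hat = modal_from_pole(z, T)
    print(f_hat, zeta_hat)   # ~5.0 Hz, ~0.02
    ```

    In practice the pole z carries estimation error, and propagating that error through this mapping is exactly the uncertainty assessment the abstract refers to.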

  14. Shallow water table effects on water, sediment, and pesticide transport in vegetative filter strips - Part 2: model coupling, application, factor importance, and uncertainty

    Science.gov (United States)

    Lauvernet, Claire; Muñoz-Carpena, Rafael

    2018-01-01

    Vegetative filter strips (VFS) are often used for protecting surface waters from pollution transferred by surface runoff in agricultural watersheds. In Europe, they are often prescribed along the stream banks, where a seasonal shallow water table (WT) could decrease the buffer zone efficiency. In spite of this potentially important effect, there are no systematic experimental or theoretical studies on the effect of this soil boundary condition on the VFS efficiency. In the companion paper (Muñoz-Carpena et al., 2018), we developed a physically based numerical algorithm (SWINGO) that allows the representation of soil infiltration with a shallow water table. Here we present the dynamic coupling of SWINGO with VFSMOD, an overland flow and transport mathematical model, to study the WT influence on VFS efficiency in terms of reductions of overland flow, sediment, and pesticide transport. This new version of VFSMOD was applied to two contrasted benchmark field studies in France (sandy-loam soil in a Mediterranean semicontinental climate, and silty clay in a temperate oceanic climate), where limited testing of the model with field data on one of the sites showed promising results. The application showed that for the conditions of the studies, VFS efficiency decreases markedly when the water table is 0 to 1.5 m from the surface. In order to evaluate the relative importance of WT among other input factors controlling VFS efficiency, global sensitivity and uncertainty analysis (GSA) was applied to the benchmark studies. The most important factors found for VFS overland flow reduction were saturated hydraulic conductivity and WT depth, added to sediment characteristics and VFS dimensions for sediment and pesticide reductions. The relative importance of WT varied as a function of soil type (most important at the silty-clay soil) and hydraulic loading (rainfall + incoming runoff) at each site. The presence of WT introduced more complex responses dominated by strong interactions in
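    The variance-based GSA idea used here can be sketched with a toy stand-in for the filter-strip model (not VFSMOD; the response function and factor ranges below are invented). The first-order index of each factor Xi is Var(E[Y|Xi])/Var(Y), estimated by a brute-force fix-and-average loop.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for the VFS model: efficiency as a nonlinear function of
    # saturated conductivity Ks, water-table depth WT, and slope S (all invented).
    def efficiency(Ks, WT, S):
        return np.tanh(2.0 * Ks) * np.clip(WT / 1.5, 0, 1) * (1 - 0.3 * S)

    def sample(n):
        return dict(Ks=rng.uniform(0.1, 2.0, n),
                    WT=rng.uniform(0.0, 3.0, n),
                    S=rng.uniform(0.0, 0.5, n))

    # Crude first-order Sobol index: Var over outer points of the inner-loop
    # conditional mean, divided by the total output variance.
    def first_order_index(factor, n_outer=200, n_inner=200):
        cond_means = []
        for x in sample(n_outer)[factor]:
            inner = sample(n_inner)
            inner[factor] = np.full(n_inner, x)   # freeze the factor of interest
            cond_means.append(efficiency(**inner).mean())
        full = sample(n_outer * n_inner)
        return np.var(cond_means) / np.var(efficiency(**full))

    for f in ("Ks", "WT", "S"):
        print(f, round(first_order_index(f), 2))
    ```

    Production GSA codes use far more efficient sampling designs (e.g. Saltelli schemes) and also estimate total-order indices, which capture the strong interactions the abstract mentions; the double loop above is only the definition made literal.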

  15. Shallow water table effects on water, sediment, and pesticide transport in vegetative filter strips – Part 2: model coupling, application, factor importance, and uncertainty

    Directory of Open Access Journals (Sweden)

    C. Lauvernet

    2018-01-01

    Full Text Available Vegetative filter strips (VFS) are often used for protecting surface waters from pollution transferred by surface runoff in agricultural watersheds. In Europe, they are often prescribed along the stream banks, where a seasonal shallow water table (WT) could decrease the buffer zone efficiency. In spite of this potentially important effect, there are no systematic experimental or theoretical studies on the effect of this soil boundary condition on the VFS efficiency. In the companion paper (Muñoz-Carpena et al., 2018), we developed a physically based numerical algorithm (SWINGO) that allows the representation of soil infiltration with a shallow water table. Here we present the dynamic coupling of SWINGO with VFSMOD, an overland flow and transport mathematical model, to study the WT influence on VFS efficiency in terms of reductions of overland flow, sediment, and pesticide transport. This new version of VFSMOD was applied to two contrasted benchmark field studies in France (sandy-loam soil in a Mediterranean semicontinental climate, and silty clay in a temperate oceanic climate), where limited testing of the model with field data on one of the sites showed promising results. The application showed that for the conditions of the studies, VFS efficiency decreases markedly when the water table is 0 to 1.5 m from the surface. In order to evaluate the relative importance of WT among other input factors controlling VFS efficiency, global sensitivity and uncertainty analysis (GSA) was applied to the benchmark studies. The most important factors found for VFS overland flow reduction were saturated hydraulic conductivity and WT depth, added to sediment characteristics and VFS dimensions for sediment and pesticide reductions. The relative importance of WT varied as a function of soil type (most important at the silty-clay soil) and hydraulic loading (rainfall + incoming runoff) at each site. The presence of WT introduced more complex responses dominated by strong

  16. Molecular characterization of NRXN1 deletions from 19,263 clinical microarray cases identifies exons important for neurodevelopmental disease expression

    Science.gov (United States)

    Lowther, Chelsea; Speevak, Marsha; Armour, Christine M.; Goh, Elaine S.; Graham, Gail E.; Li, Chumei; Zeesman, Susan; Nowaczyk, Malgorzata J.M.; Schultz, Lee-Anne; Morra, Antonella; Nicolson, Rob; Bikangaga, Peter; Samdup, Dawa; Zaazou, Mostafa; Boyd, Kerry; Jung, Jack H.; Siu, Victoria; Rajguru, Manjulata; Goobie, Sharan; Tarnopolsky, Mark A.; Prasad, Chitra; Dick, Paul T.; Hussain, Asmaa S.; Walinga, Margreet; Reijenga, Renske G.; Gazzellone, Matthew; Lionel, Anath C.; Marshall, Christian R.; Scherer, Stephen W.; Stavropoulos, Dimitri J.; McCready, Elizabeth; Bassett, Anne S.

    2016-01-01

    Purpose The purpose of the current study was to assess the penetrance of NRXN1 deletions. Methods We compared the prevalence and genomic extent of NRXN1 deletions identified among 19,263 clinically referred cases to that of 15,264 controls. The burden of additional clinically relevant CNVs was used as a proxy to estimate the relative penetrance of NRXN1 deletions. Results We identified 41 (0.21%) previously unreported exonic NRXN1 deletions ascertained for developmental delay/intellectual disability, significantly greater than in controls [OR=8.14 (95% CI 2.91–22.72), p < 0.0001]. Ten (22.7%) of these had a second clinically relevant CNV. Subjects with a deletion near the 3′ end of NRXN1 were significantly more likely to have a second rare CNV than subjects with a 5′ NRXN1 deletion [OR=7.47 (95% CI 2.36–23.61), p=0.0006]. The prevalence of intronic NRXN1 deletions was not statistically different between cases and controls (p=0.618). The majority (63.2%) of intronic NRXN1 deletion cases had a second rare CNV, a two-fold greater prevalence than for exonic NRXN1 deletion cases (p=0.0035). Conclusions The results support the importance of exons near the 5′ end of NRXN1 in the expression of neurodevelopmental disorders. Intronic NRXN1 deletions do not appear to substantially increase the risk for clinical phenotypes. PMID:27195815
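    The reported odds ratio and confidence interval follow from the standard Woolf (logit) method on a 2x2 table. The abstract does not state the control-side deletion count; the value of 4 below is inferred because it reproduces the reported figures, so treat it as an assumption rather than a quoted number.

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio and Woolf (logit) 95% CI for a 2x2 table:
        a/b = cases with/without the deletion, c/d = controls with/without."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # 41 exonic deletions among 19,263 cases; an inferred 4 among 15,264 controls.
    or_, lo, hi = odds_ratio_ci(41, 19263 - 41, 4, 15264 - 4)
    print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")   # OR = 8.14, 95% CI 2.91-22.72
    ```

    The wide interval despite the large cohorts reflects the small count on the control side, which dominates the 1/c term in the standard error.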

  17. Gene Expression Profiling Identifies Important Genes Affected by R2 Compound Disrupting FAK and P53 Complex

    International Nuclear Information System (INIS)

    Golubovskaya, Vita M.; Ho, Baotran; Conroy, Jeffrey; Liu, Song; Wang, Dan; Cance, William G.

    2014-01-01

    Focal Adhesion Kinase (FAK) is a non-receptor kinase that plays an important role in many cellular processes: adhesion, proliferation, invasion, angiogenesis, metastasis and survival. Recently, we have shown that the Roslin 2 or R2 (1-benzyl-1,3,5,7-tetraazatricyclo[3.3.1.1(3,7)]decane) compound disrupts the FAK and p53 complex, activates p53 transcriptional activity, and blocks tumor growth. In this report we performed a microarray gene expression analysis of R2-treated HCT116 p53+/+ and p53−/− cells and detected 1484 genes that were significantly up- or down-regulated (p < 0.05) in HCT116 p53+/+ cells but not in p53−/− cells. Among up-regulated genes in HCT116 p53+/+ cells we detected critical p53 targets: Mdm-2, Noxa-1, and RIP1. Among down-regulated genes, Met, PLK2, KIF14, BIRC2 and other genes were identified. In addition, a combination of the R2 compound with the M13 compound that disrupts the FAK and Mdm-2 complex, or of R2 with Nutlin-1 that disrupts Mdm-2 and p53, decreased clonogenicity of HCT116 p53+/+ colon cancer cells more significantly than each agent alone in a p53-dependent manner. Thus, this report characterizes the gene expression profile in response to R2 treatment and demonstrates that the combination of drugs targeting FAK, Mdm-2, and p53 can be a novel therapy approach

  18. DNA-SIP identifies sulfate-reducing Clostridia as important toluene degraders in tar-oil-contaminated aquifer sediment

    Energy Technology Data Exchange (ETDEWEB)

    Winderl, C.; Penning, H.; von Netzer, F.; Meckenstock, R.U.; Lueders, T. [Helmholtz Zentrum Munchen, Neuherberg (Germany)

    2010-10-15

    Global groundwater resources are constantly challenged by a multitude of contaminants such as aromatic hydrocarbons. Especially in anaerobic habitats, a large diversity of unrecognized microbial populations may be responsible for their degradation. Still, our present understanding of the respective microbiota and their ecophysiology is almost exclusively based on a small number of cultured organisms, mostly within the Proteobacteria. Here, by DNA-based stable isotope probing (SIP), we directly identified the most active sulfate-reducing toluene degraders in a diverse sedimentary microbial community originating from a tar-oil-contaminated aquifer at a former coal gasification plant. On incubation of fresh sediments with 13C7-toluene, the production of both sulfide and 13CO2 was clearly coupled to the 13C-labeling of DNA of microbes related to Desulfosporosinus spp. within the Peptococcaceae (Clostridia). The screening of labeled DNA fractions also suggested a novel benzylsuccinate synthase alpha-subunit (bssA) sequence type, previously only detected in the environment, to be tentatively affiliated with these degraders. However, carbon flow from the contaminant into degrader DNA was only ~50%, pointing toward high ratios of heterotrophic CO2-fixation during assimilation of acetyl-CoA originating from the contaminant by these degraders. These findings demonstrate that the importance of non-proteobacterial populations in anaerobic aromatics degradation, as well as their specific ecophysiology in the subsurface, may still be largely ungrasped.

  19. Evaluation of bentonite alteration due to interactions with iron. Sensitivity analyses to identify the important factors for the bentonite alteration

    International Nuclear Information System (INIS)

    Sasamoto, Hiroshi; Wilson, James; Sato, Tsutomu

    2013-01-01

    Performance assessment of geological disposal systems for high-level radioactive waste requires a consideration of long-term systems behaviour. It is possible that the alteration of swelling clay present in bentonite buffers might have an impact on buffer functions. In the present study, iron (as a candidate overpack material)-bentonite (I-B) interactions were evaluated as the main buffer alteration scenario. Existing knowledge on alteration of bentonite during I-B interactions was first reviewed, then the evaluation methodology was developed considering modeling techniques previously used overseas. A conceptual model for smectite alteration during I-B interactions was produced. The following reactions and processes were selected: 1) release of Fe2+ due to overpack corrosion; 2) diffusion of Fe2+ in compacted bentonite; 3) sorption of Fe2+ on smectite edges and ion exchange in interlayers; 4) dissolution of primary phases and formation of alteration products. Sensitivity analyses were performed to identify the most important factors for the alteration of bentonite by I-B interactions. (author)

  20. Identifying important and feasible policies and actions for health at community sports clubs: a consensus-generating approach.

    Science.gov (United States)

    Kelly, Bridget; King, Lesley; Bauman, Adrian E; Baur, Louise A; Macniven, Rona; Chapman, Kathy; Smith, Ben J

    2014-01-01

    Children's high participation in organised sport in Australia makes sport an ideal setting for health promotion. This study aimed to generate consensus on priority health promotion objectives for community sports clubs, based on informed expert judgements. Delphi survey using three structured questionnaires. Forty-six health promotion, nutrition, physical activity and sport management/delivery professionals were approached to participate in the survey. Questionnaires used an iterative process to determine aspects of sports clubs deemed necessary for developing healthy sporting environments for children. Initially, participants were provided with a list of potential standards for a range of health promotion areas and asked to rate standards based on their importance and feasibility, and any barriers to implementation. Subsequently, participants were provided with information that summarised ratings for each standard to indicate convergence of the group, and asked to review and potentially revise their responses where they diverged. In a third round, participants ranked confirmed standards by priority. 26 professionals completed round 1, 21 completed round 2, and 18 completed round 3. The highest ranked standards related to responsible alcohol practices, availability of healthy food and drinks at sports canteens, smoke-free club facilities, restricting the sale and consumption of alcohol during junior sporting activities, and restricting unhealthy food and beverage company sponsorship. Identifying and prioritising health promotion areas that are relevant to children's sports clubs assists in focusing public health efforts and may guide future engagement of sports clubs. Approaches for providing informational and financial support to clubs to operationalise these standards are proposed. Copyright © 2013 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  1. Identifying organisational principles and management practices important to the quality of health care services for chronic conditions.

    Science.gov (United States)

    Frølich, Anne

    2012-02-01

    effect of financial incentives and public performance reporting on the behaviour of professionals and quality of care. Using secondary data, KP and the Danish health care system were compared in terms of six central dimensions: population, health care professionals, health care organisations, utilization patterns, quality measurements, and costs. Differences existed between the two systems on all dimensions, complicating the interpretation of findings. For instance, observed differences might be due to similar tendencies in the two health care systems that were observed at different times, rather than true structural differences. The expenses in the two health care systems were corrected for differences in the populations served and the purchasing power of currencies. However, no validated methods existed to correct for observed differences in case-mixes of chronic conditions. Data from a population of about half a million patients with diabetes in a large U.S. integrated health care delivery system affiliated with 41 medical centers employing 15 different CCM management practices was the basis for identifying effective management practices. Through the use of statistical modelling, the management practice of provider alerts was identified as most effective for promoting screening for hemoglobin A1c and lipid profile. The CCM was used as a framework for implementing four rehabilitation programs. The model promoted continuity of care and quality of health care services. New management practices were developed in the study, and known practices were further developed. However, the observational nature of the study limited the generalisability of the findings. In a structured literature survey focusing on the effect of financial incentives and public performance reporting on the quality of health care services, few studies documenting an effect were identified. The results varied, and important program aspects or contextual variables were often omitted. 
A model describing

  2. Identifying Important Gaps in Randomized Controlled Trials of Adult Cardiac Arrest Treatments: A Systematic Review of the Published Literature

    Science.gov (United States)

    Sinha, Shashank S.; Sukul, Devraj; Lazarus, John J.; Polavarapu, Vivek; Chan, Paul S.; Neumar, Robert W.; Nallamothu, Brahmajee K.

    2016-01-01

    Background Cardiac arrests are a major public health concern worldwide. The extent and types of randomized controlled trials (RCTs) – our most reliable source of clinical evidence – conducted in these high-risk patients over recent years are largely unknown. Methods and Results We performed a systematic review, identifying all RCTs published in PubMed, EMBASE, Scopus, Web of Science, and the Cochrane Library from 1995 to 2014 that focused on acute treatment of non-traumatic cardiac arrest in adults. We then extracted data on the setting of study populations, types and timing of interventions studied, risk of bias, outcomes reported and how these factors have changed over time. Over this twenty-year period, 92 RCTs were published containing 64,309 patients (median, 225.5 per trial). Of these, 81 RCTs (88.0%) involved out-of-hospital cardiac arrest whereas 4 (4.3%) involved in-hospital cardiac arrest and 7 (7.6%) included both. Eighteen RCTs (19.6%) were performed in the U.S., 68 (73.9%) were performed outside the U.S., and 6 (6.5%) were performed in both settings. Thirty-eight RCTs (41.3%) evaluated drug therapy, 39 (42.4%) evaluated device therapy, and 15 (16.3%) evaluated protocol improvements. Seventy-four RCTs (80.4%) examined interventions during the cardiac arrest, 15 (16.3%) examined post-cardiac arrest treatment, and 3 (3.3%) studied both. Overall, reporting of risk of bias was limited. The most common outcome reported was return of spontaneous circulation (ROSC): 86 (93.5%), with only 22 (23.9%) reporting survival beyond 6 months. Fifty-three RCTs (57.6%) reported global ordinal outcomes whereas 15 (16.3%) reported quality-of-life. RCTs in the last 5 years were more likely to be focused on protocol improvement and post-cardiac arrest care. Conclusions Important gaps in RCTs of cardiac arrest treatments exist, especially those examining in-hospital cardiac arrest, protocol improvement, post-cardiac arrest care, and long-term or quality-of-life outcomes. PMID:27756794

  3. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  4. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  5. A multi-reservoir based water-hydroenergy management model for identifying the risk horizon of regional resources-energy policy under uncertainties

    International Nuclear Information System (INIS)

    Zeng, X.T.; Zhang, S.J.; Feng, J.; Huang, G.H.; Li, Y.P.; Zhang, P.; Chen, J.P.; Li, K.L.

    2017-01-01

    Highlights: • A multi-reservoir system can handle water/energy deficits, flood and sediment damage. • An MWH model is developed for planning water allocation and energy generation. • A mixed fuzzy-stochastic risk analysis method (MFSR) can handle uncertainties in MWH. • A hybrid MWH model can plan human-resource-energy systems in a robust and effective manner. • Results can support adjusting water-energy policy to satisfy increasing demands. - Abstract: In this study, a multi-reservoir based water-hydroenergy management (MWH) model is developed for planning water allocation and hydroenergy generation (WAHG) under uncertainties. A mixed fuzzy-stochastic risk analysis method (MFSR) is introduced to handle objective and subjective uncertainties in the MWH model; it couples fuzzy credibility programming and risk management within a general two-stage context, with the aim of reflecting the infeasibility risks between expected targets and random second-stage recourse costs. The developed MWH model (embedded with the MFSR method) is applied to a practical WAHG study in the Jing River Basin (China), where human activity conflicts with resource and energy scarcity. A water-energy nexus (WEN) is constructed to integrate economic development with resource/energy conservation while simultaneously confronting natural and artificial stresses such as water deficits, electricity shortfalls, floodwater, and heavy sediment deposition. Meanwhile, the obtained results, with various credibility levels and target-violation risk levels, can support generating a robust plan associated with risk control for identifying the optimized water-allocation and hydroenergy-generation alternatives, as well as flood controls. Moreover, the results can help policymakers discern the optimal water/sediment release routes, reservoirs' storage variations (impacted by sediment deposition), electricity supply schedules and system benefit
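    The two-stage structure with recourse that the abstract describes, a target committed first and scenario-dependent penalties paid later, can be illustrated with a toy example. All numbers below are invented and the model is far simpler than the MWH formulation (no fuzzy credibility levels, one reservoir, one decision).

    ```python
    # First stage: commit to a water-allocation target. Second stage: under each
    # inflow scenario, pay a penalty on the shortfall between target and inflow.
    scenarios = [(0.3, 80.0), (0.5, 100.0), (0.2, 130.0)]  # (probability, inflow)
    benefit_per_unit, penalty_per_unit = 5.0, 12.0

    def expected_net_benefit(target):
        recourse = sum(p * penalty_per_unit * max(0.0, target - inflow)
                       for p, inflow in scenarios)
        return benefit_per_unit * target - recourse

    # Scan candidate targets; the optimum balances the promised benefit against
    # the expected cost of second-stage shortfalls.
    best_target = max(range(60, 141, 10), key=expected_net_benefit)
    print(best_target, expected_net_benefit(best_target))   # 100 428.0
    ```

    The optimum sits at 100 because beyond that point the expected marginal penalty (0.8 x 12 = 9.6 per unit) exceeds the marginal benefit (5 per unit); this target-versus-recourse trade-off is the "infeasibility risk" the MFSR method quantifies under uncertainty.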

  6. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Campos, E [Argonne National Lab. (ANL), Argonne, IL (United States); Sisterson, Douglas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-12-01

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. In addition, the ARM Facility provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements. This study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. 
As a result, this study will address the first steps towards reporting ARM measurement uncertainty

  7. Importance of evaluation of uncertainties on the measurement of natural gas and petroleum volumes; Importancia da avaliacao das incertezas na medicao dos volumes de petroleo e gas natural

    Energy Technology Data Exchange (ETDEWEB)

    Silva Filho, Jose Alberto Pinheiro da; Oliveira, Thiago Barra Vidal de; Mata, Josaphat Dias da [PETROBRAS, Rio de Janeiro, RJ (Brazil)], Emails: jose.pinheiro@petrobras.com.br, thiagovidal@petrobras.com.br, josaphat@petrobras.com.br; Val, Luiz Gustavo do [Instituto de Qualidade e Metrologia (IQM), Rio de Janeiro, RJ (Brazil)], E-mail: gdoval.iqm@petrobras.com.br

    2009-07-01

    Measurement is considered the 'cash register' of these enterprises, with accuracy requirements and scrutiny increasing at each step closer to the delivery points, where differences of 0.1 % are disputed. The work presents the approach used to evaluate measurement uncertainties in the volumes of petroleum and natural gas obtained in production processes in Brazil, as well as at the international level.

  8. The estimation of uncertainty of radioactivity measurement on gamma counters in radiopharmacy

    International Nuclear Information System (INIS)

    Jovanovic, M.S.; Orlic, M.; Vranjes, S.; Stamenkovic, Lj. (E-mail address of corresponding author: nikijov@vin.bg.ac.yu; Jovanovic, M.S.)

    2005-01-01

    In this paper, the estimation of the uncertainty of radioactivity measurements on a gamma counter in the Laboratory for Radioisotopes is presented. The uncertainty components that are important for these measurements are identified and taken into account in estimating the uncertainty of measurement. (author)

  9. Identifying organisational principles and management practices important to the quality of health care services for chronic conditions

    DEFF Research Database (Denmark)

    Frølich, Anne

    2012-01-01

    are limited, it is necessary to identify efficient methods to improve the quality of care. Comparing health care systems is a well-known method for identifying new knowledge regarding, for instance, organisational methods and principles. Kaiser Permanente (KP), an integrated health care delivery system...... in the U.S., is recognized as providing high-quality chronic care; to some extent, this is due to KP's implementation of the chronic care model (CCM). This model recommends a range of evidence-based management practices that support the implementation of evidence-based medicine. However, it is not clear...... which management practices in the CCM are most efficient and in what combinations. In addition, financial incentives and public reporting of performance are often considered effective at improving the quality of health care services, but this has not yet been definitively proved....

  10. CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-07-01

    Any measured or assessed quantity contains uncertainty. The quantitative estimation of such uncertainty is becoming increasingly important, especially in assuring that safety requirements are met in design, regulation, and operation of nuclear installations. The CEC/USDOE Workshop on Uncertainty Analysis, held in Santa Fe, New Mexico, on November 13 through 16, 1989, was organized jointly by the Commission of European Communities (CEC's) Radiation Protection Research program, dealing with uncertainties throughout the field of consequence assessment, and DOE's Atmospheric Studies in Complex Terrain (ASCOT) program, concerned with the particular uncertainties in time and space variant transport and dispersion. The workshop brought together US and European scientists who have been developing or applying uncertainty analysis methodologies, conducted in a variety of contexts, often with incomplete knowledge of the work of others in this area. Thus, it was timely to exchange views and experience, identify limitations of approaches to uncertainty and possible improvements, and enhance the interface between developers and users of uncertainty analysis methods. Furthermore, the workshop considered the extent to which consistent, rigorous methods could be used in various applications within consequence assessment. 3 refs

  11. Identifying diabetes-related important protein targets with few interacting partners with the PageRank algorithm.

    Science.gov (United States)

    Grolmusz, Vince I

    2015-04-01

    Diabetes is a growing concern for the developed nations worldwide. New genomic, metagenomic and gene-technologic approaches may yield considerable results in the next several years in its early diagnosis, or in advances in therapy and management. In this work, we highlight some human proteins that may serve as new targets in early diagnosis and therapy. With the help of a very successful mathematical tool for network analysis that formed the basis of the early successes of Google(TM), Inc., we analyse the human protein-protein interaction network obtained from the IntAct database. The novelty of our approach is that the suggested new protein targets do not have many interacting partners (so they are not hubs or super-hubs), and their inhibition or promotion therefore probably will not have serious side effects. We have identified numerous possible protein targets for diabetes therapy and/or management; some of these have been well known for a long time (these validate our method), some appeared in the literature in the last 12 months (these show the cutting edge of the algorithm), and the remainder have no previously known connection with diabetes, representing completely new hits of the method.
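    The network measure behind this approach can be illustrated with a minimal PageRank power iteration. The toy graph, the node names, and the rank-to-degree ratio used below to flag "important but low-degree" nodes are illustrative assumptions for the sketch, not the paper's actual IntAct data or scoring rule:

```python
# Toy undirected interaction graph; node names are illustrative, not IntAct data.
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("A", "E")]
neighbors = {}
for u, v in edges:
    neighbors.setdefault(u, []).append(v)
    neighbors.setdefault(v, []).append(u)

def pagerank(neighbors, damping=0.85, iters=100):
    """Plain power iteration on an undirected graph (every node here has
    at least one neighbour, so no dangling-node handling is needed)."""
    n = len(neighbors)
    rank = {v: 1.0 / n for v in neighbors}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in neighbors}
        for v, nbrs in neighbors.items():
            share = damping * rank[v] / len(nbrs)  # spread rank over partners
            for w in nbrs:
                new[w] += share
        rank = new
    return rank

rank = pagerank(neighbors)
degree = {v: len(nbrs) for v, nbrs in neighbors.items()}

# The paper's key idea: look for nodes whose rank is high *relative to their
# degree* -- influential in the network yet with few interacting partners.
by_rank_per_partner = sorted(rank, key=lambda v: rank[v] / degree[v], reverse=True)
print(by_rank_per_partner)
```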

  12. Gene expression profiling and candidate gene resequencing identifies pathways and mutations important for malignant transformation caused by leukemogenic fusion genes.

    Science.gov (United States)

    Novak, Rachel L; Harper, David P; Caudell, David; Slape, Christopher; Beachy, Sarah H; Aplan, Peter D

    2012-12-01

    NUP98-HOXD13 (NHD13) and CALM-AF10 (CA10) are oncogenic fusion proteins produced by recurrent chromosomal translocations in patients with acute myeloid leukemia (AML). Transgenic mice that express these fusions develop AML with a long latency and incomplete penetrance, suggesting that collaborating genetic events are required for leukemic transformation. We employed genetic techniques to identify both preleukemic abnormalities in healthy transgenic mice as well as collaborating events leading to leukemic transformation. Candidate gene resequencing revealed that 6 of 27 (22%) CA10 AMLs spontaneously acquired a Ras pathway mutation and 8 of 27 (30%) acquired an Flt3 mutation. Two CA10 AMLs acquired an Flt3 internal-tandem duplication, demonstrating that these mutations can be acquired in murine as well as human AML. Gene expression profiles revealed a marked upregulation of Hox genes, particularly Hoxa5, Hoxa9, and Hoxa10 in both NHD13 and CA10 mice. Furthermore, mir196b, which is embedded within the Hoxa locus, was overexpressed in both CA10 and NHD13 samples. In contrast, the Hox cofactors Meis1 and Pbx3 were differentially expressed; Meis1 was increased in CA10 AMLs but not NHD13 AMLs, whereas Pbx3 was consistently increased in NHD13 but not CA10 AMLs. Silencing of Pbx3 in NHD13 cells led to decreased proliferation, increased apoptosis, and decreased colony formation in vitro, suggesting a previously unexpected role for Pbx3 in leukemic transformation. Published by Elsevier Inc.

  13. Using the apparent diffusion coefficient to identify MGMT promoter methylation status early in glioblastoma: importance of analytical method

    Energy Technology Data Exchange (ETDEWEB)

    Rundle-Thiele, Dayle [Centre for Clinical Research, University of Queensland, Brisbane, Queensland (Australia); Day, Bryan; Stringer, Brett [Brain Cancer Research Unit, Queensland Institute of Medical Research, Brisbane, Queensland (Australia); Fay, Michael [Department of Radiation Oncology, Royal Brisbane and Women's Hospital, Brisbane, Queensland (Australia); Martin, Jennifer [Discipline of Clinical Pharmacology, School of Medicine and Public Health, University of Newcastle, Newcastle, New South Wales (Australia); Jeffree, Rosalind L [Department of Neurosurgery, Royal Brisbane and Women's Hospital, Brisbane, Queensland (Australia); Thomas, Paul [Queensland PET Service, Royal Brisbane and Women's Hospital, Brisbane, Queensland (Australia); Bell, Christopher [Centre for Clinical Research, University of Queensland, Brisbane, Queensland (Australia); Salvado, Olivier [CSIRO Digital Productivity Flagship, CSIRO, Herston, Queensland (Australia); Gal, Yaniv [Centre for Medical Diagnostic Technologies in Queensland, University of Queensland, Brisbane, Queensland (Australia); Coulthard, Alan [Discipline of Medical Imaging, University of Queensland, St Lucia, Queensland (Australia); Department of Medical Imaging, Royal Brisbane and Women's Hospital, Brisbane, Queensland (Australia); Crozier, Stuart [Centre for Medical Diagnostic Technologies in Queensland, University of Queensland, Brisbane, Queensland (Australia); Rose, Stephen, E-mail: stephen.rose@csiro.au [CSIRO Digital Productivity Flagship, CSIRO, Herston, Queensland (Australia); Centre for Clinical Research, University of Queensland, Brisbane, Queensland (Australia)]

    2015-06-15

    Accurate knowledge of O⁶-methylguanine methyltransferase (MGMT) gene promoter subtype in patients with glioblastoma (GBM) is important for treatment. However, this test is not always available. Pre-operative diffusion MRI (dMRI) can be used to probe tumour biology using the apparent diffusion coefficient (ADC); however, its ability to act as a surrogate to predict MGMT status has shown mixed results. We investigated whether this was due to variations in the method used to analyse ADC. We undertook a retrospective study of 32 patients with GBM who had MGMT status measured. Matching pre-operative MRI data were used to calculate the ADC within contrast enhancing regions of tumour. The relationship between ADC and MGMT was examined using two published ADC methods. A strong trend between a measure of ‘minimum ADC’ and methylation status was seen. An elevated minimum ADC was more likely in the methylated compared to the unmethylated MGMT group (U = 56, P = 0.0561). In contrast, utilising a two-mixture model histogram approach, a significant reduction in mean measure of the ‘low ADC’ component within the histogram was associated with an MGMT promoter methylation subtype (P < 0.0246). This study shows that within the same patient cohort, the method selected to analyse ADC measures has a significant bearing on the use of that metric as a surrogate marker of MGMT status. Thus for dMRI data to be clinically useful, consistent methods of data analysis need to be established prior to establishing any relationship with genetic or epigenetic profiling.

  14. Using the apparent diffusion coefficient to identify MGMT promoter methylation status early in glioblastoma: importance of analytical method

    International Nuclear Information System (INIS)

    Rundle-Thiele, Dayle; Day, Bryan; Stringer, Brett; Fay, Michael; Martin, Jennifer; Jeffree, Rosalind L; Thomas, Paul; Bell, Christopher; Salvado, Olivier; Gal, Yaniv; Coulthard, Alan; Crozier, Stuart; Rose, Stephen

    2015-01-01

    Accurate knowledge of O⁶-methylguanine methyltransferase (MGMT) gene promoter subtype in patients with glioblastoma (GBM) is important for treatment. However, this test is not always available. Pre-operative diffusion MRI (dMRI) can be used to probe tumour biology using the apparent diffusion coefficient (ADC); however, its ability to act as a surrogate to predict MGMT status has shown mixed results. We investigated whether this was due to variations in the method used to analyse ADC. We undertook a retrospective study of 32 patients with GBM who had MGMT status measured. Matching pre-operative MRI data were used to calculate the ADC within contrast enhancing regions of tumour. The relationship between ADC and MGMT was examined using two published ADC methods. A strong trend between a measure of ‘minimum ADC’ and methylation status was seen. An elevated minimum ADC was more likely in the methylated compared to the unmethylated MGMT group (U = 56, P = 0.0561). In contrast, utilising a two-mixture model histogram approach, a significant reduction in mean measure of the ‘low ADC’ component within the histogram was associated with an MGMT promoter methylation subtype (P < 0.0246). This study shows that within the same patient cohort, the method selected to analyse ADC measures has a significant bearing on the use of that metric as a surrogate marker of MGMT status. Thus for dMRI data to be clinically useful, consistent methods of data analysis need to be established prior to establishing any relationship with genetic or epigenetic profiling.

  15. Uncertainties affecting fund collection, management and final utilisation

    International Nuclear Information System (INIS)

    Soederberg, Olof

    2006-01-01

    The paper presents, on a general level, the major uncertainties in financing systems that aim to provide secure funding for future decommissioning costs. The perspective chosen is that of a fund collector/manager. The paper also describes how these uncertainties are dealt with within the Swedish financing system, particularly from the perspective of the Board of the Swedish Nuclear Waste Fund. It is concluded that the existing uncertainties are a good reason not to postpone decommissioning activities to a distant future. This aspect is important even for countries with financing systems that have been constructed to be robust against identified uncertainties. (author)

  16. Summary of existing uncertainty methods

    International Nuclear Information System (INIS)

    Glaeser, Horst

    2013-01-01

    A summary of the existing and most widely used uncertainty methods is presented, and their main features are compared. One of these methods is the order statistics method based on Wilks' formula. It is applied in safety research as well as in licensing. This method was first proposed by GRS for use in deterministic safety analysis, and is now used by many organisations world-wide. Its advantage is that the number of potentially uncertain input and output parameters is not limited to a small number. Such a limitation was necessary for the first demonstration of the Code Scaling, Applicability and Uncertainty (CSAU) method by the United States Nuclear Regulatory Commission (USNRC), which did not apply Wilks' formula in its statistical method for propagating input uncertainties to obtain the uncertainty of a single output variable, like peak cladding temperature. A Phenomena Identification and Ranking Table (PIRT) was set up in order to limit the number of uncertain input parameters and, consequently, the number of calculations to be performed. Another purpose of such a PIRT process is to identify the most important physical phenomena which a computer code should be able to calculate; the validation of the code should be focused on the identified phenomena. Response surfaces are used in some applications, replacing the computer code when a high number of calculations must be performed. The second well-known uncertainty method is the Uncertainty Methodology Based on Accuracy Extrapolation (UMAE) and its follow-up, the Code with the Capability of Internal Assessment of Uncertainty (CIAU), developed by the University of Pisa. Unlike the statistical approaches, the CIAU compares experimental data with calculation results and does not consider uncertain input parameters; it is therefore highly dependent on the experimental database. The accuracy gained from the comparison between experimental data and calculated results is extrapolated to obtain the uncertainty of the system code predictions.
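    The practical appeal of Wilks' formula is that the required number of code runs does not depend on how many input parameters are uncertain. A small sketch of the standard first-order sample-size computation (the coverage/confidence values are the classic 95 %/95 % licensing criterion):

```python
def wilks_n_one_sided(coverage=0.95, confidence=0.95):
    """Smallest number of code runs N such that the largest of N sampled
    outputs bounds the `coverage` quantile with probability `confidence`
    (first-order, one-sided Wilks formula: 1 - coverage**N >= confidence)."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

def wilks_n_two_sided(coverage=0.95, confidence=0.95):
    """Smallest N for a first-order two-sided tolerance interval
    (min and max of N runs): 1 - N*g**(N-1) + (N-1)*g**N >= confidence."""
    n = 2
    while 1.0 - n * coverage ** (n - 1) + (n - 1) * coverage ** n < confidence:
        n += 1
    return n

print(wilks_n_one_sided())  # 59 runs for the classic one-sided 95%/95% criterion
print(wilks_n_two_sided())  # 93 runs for the two-sided case
```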

  17. Confronting uncertainty in wildlife management: performance of grizzly bear management.

    Science.gov (United States)

    Artelle, Kyle A; Anderson, Sean C; Cooper, Andrew B; Paquet, Paul C; Reynolds, John D; Darimont, Chris T

    2013-01-01

    Scientific management of wildlife requires confronting the complexities of natural and social systems. Uncertainty poses a central problem. Whereas the importance of considering uncertainty has been widely discussed, studies of the effects of unaddressed uncertainty on real management systems have been rare. We examined the effects of outcome uncertainty and components of biological uncertainty on hunt management performance, illustrated with grizzly bears (Ursus arctos horribilis) in British Columbia, Canada. We found that both forms of uncertainty can have serious impacts on management performance. Outcome uncertainty alone--discrepancy between expected and realized mortality levels--led to excess mortality in 19% of cases (population-years) examined. Accounting for uncertainty around estimated biological parameters (i.e., biological uncertainty) revealed that excess mortality might have occurred in up to 70% of cases. We offer a general method for identifying targets for exploited species that incorporates uncertainty and maintains the probability of exceeding mortality limits below specified thresholds. Setting targets in our focal system using this method at thresholds of 25% and 5% probability of overmortality would require average target mortality reductions of 47% and 81%, respectively. Application of our transparent and generalizable framework to this or other systems could improve management performance in the presence of uncertainty.
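    The target-setting idea can be sketched with a toy Monte Carlo: draw an uncertain population size (biological uncertainty) and a realized kill scattered around the target (outcome uncertainty), then keep the largest target whose estimated probability of exceeding the mortality limit stays below the chosen threshold. All numeric values and distributional choices below are illustrative assumptions, not the paper's grizzly bear data:

```python
import random

def max_allowable_target(pop_mean, pop_cv, allowable_rate, outcome_cv,
                         threshold=0.05, trials=5000, seed=1):
    """Largest integer mortality target such that the estimated probability of
    realized mortality exceeding the biological limit stays below `threshold`."""
    random.seed(seed)

    def p_over(target):
        over = 0
        for _ in range(trials):
            # biological uncertainty: the true population size is imprecisely known
            pop = max(random.gauss(pop_mean, pop_cv * pop_mean), 1.0)
            # outcome uncertainty: realized kill scatters around the target
            kill = max(random.gauss(target, outcome_cv * target), 0.0)
            if kill > allowable_rate * pop:
                over += 1
        return over / trials

    target = 0
    while p_over(target + 1) <= threshold:
        target += 1
    return target

# A tighter risk tolerance forces a lower target, mirroring the paper's
# comparison of 25% vs 5% probability-of-overmortality thresholds.
print(max_allowable_target(500, 0.2, 0.06, 0.3, threshold=0.25))
print(max_allowable_target(500, 0.2, 0.06, 0.3, threshold=0.05))
```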

  18. Confronting uncertainty in wildlife management: performance of grizzly bear management.

    Directory of Open Access Journals (Sweden)

    Kyle A Artelle

    Full Text Available Scientific management of wildlife requires confronting the complexities of natural and social systems. Uncertainty poses a central problem. Whereas the importance of considering uncertainty has been widely discussed, studies of the effects of unaddressed uncertainty on real management systems have been rare. We examined the effects of outcome uncertainty and components of biological uncertainty on hunt management performance, illustrated with grizzly bears (Ursus arctos horribilis) in British Columbia, Canada. We found that both forms of uncertainty can have serious impacts on management performance. Outcome uncertainty alone--discrepancy between expected and realized mortality levels--led to excess mortality in 19% of cases (population-years) examined. Accounting for uncertainty around estimated biological parameters (i.e., biological uncertainty) revealed that excess mortality might have occurred in up to 70% of cases. We offer a general method for identifying targets for exploited species that incorporates uncertainty and maintains the probability of exceeding mortality limits below specified thresholds. Setting targets in our focal system using this method at thresholds of 25% and 5% probability of overmortality would require average target mortality reductions of 47% and 81%, respectively. Application of our transparent and generalizable framework to this or other systems could improve management performance in the presence of uncertainty.

  19. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  20. BEPU methods and combining of uncertainties

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2004-01-01

    After approval of the revised rule on the acceptance of emergency core cooling system (ECCS) performance in 1988, there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. The Code Scaling, Applicability and Uncertainty (CSAU) evaluation method was developed and demonstrated for a large-break (LB) LOCA in a pressurized water reactor. Later, several new best estimate plus uncertainty (BEPU) methods were developed around the world. The purpose of the paper is to identify and compare the statistical approaches of BEPU methods and present their important plant and licensing applications. The study showed that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted approach. The existing BEPU methods seem mature enough, while future research may focus on codes with internal assessment of uncertainty. (author)

  1. Uncertainty and global climate change research

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E. [Oak Ridge National Lab., TN (United States); Weiher, R. [National Oceanic and Atmospheric Administration, Boulder, CO (United States)

    1994-06-01

    The Workshop on Uncertainty and Global Climate Change Research was held March 22--23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics, and decision making. The magnitude and complexity of the uncertainty surrounding global climate change have made it quite difficult to answer even the most simple and important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change, or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessment using decision analytic techniques as a foundation is key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously, since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified and prioritized, and their uncertainty characterized, to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value-of-information methodologies are best suited for this task. In terms of the timeliness and relevance of developing and applying decision analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

  2. Uncertainty estimation of ultrasonic thickness measurement

    International Nuclear Information System (INIS)

    Yassir Yassen, Abdul Razak Daud; Mohammad Pauzi Ismail; Abdul Aziz Jemain

    2009-01-01

    The most important factor to consider when selecting an ultrasonic thickness measurement technique is its reliability. Only when the uncertainty of a measurement result is known can it be judged whether the result is adequate for its intended purpose. The objectives of this study are to model the ultrasonic thickness measurement function, to identify the most significant input uncertainty components, and to estimate the uncertainty of the ultrasonic thickness measurement results. We assumed that five error sources contribute significantly to the final error: calibration velocity, transit time, zero offset, measurement repeatability, and resolution. By applying the law of propagation of uncertainty to the model function, a combined uncertainty of the ultrasonic thickness measurement was obtained. In this study the modeling function of ultrasonic thickness measurement was derived. Using this model, the estimation of the uncertainty of the final output result was found to be reliable. It was also found that the most important contributing input uncertainty components are calibration velocity, transit time linearity, and zero offset. (author)
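    A sketch of the combined-uncertainty calculation for a thickness model d = v·(t − t0)/2, using the law of propagation of uncertainty (root sum of squares of the sensitivity-weighted components); all numeric values are illustrative, not the paper's:

```python
import math

# Illustrative values only: velocity in mm/us, times in us.
v, u_v = 5.92, 0.03           # calibration velocity and its standard uncertainty
t, u_t = 3.40, 0.005          # transit time and its standard uncertainty
t0, u_t0 = 0.10, 0.004        # zero offset and its standard uncertainty
u_rep = 0.010                 # repeatability, expressed directly in mm
u_res = 0.01 / math.sqrt(12)  # 0.01 mm display resolution -> standard uncertainty

d = v * (t - t0) / 2.0        # model function: thickness from round-trip time

# Law of propagation of uncertainty: each term is (partial derivative) * u_i.
contributions = [
    ((t - t0) / 2.0) * u_v,   # velocity term, d(d)/dv = (t - t0)/2
    (v / 2.0) * u_t,          # transit-time term, d(d)/dt = v/2
    (-v / 2.0) * u_t0,        # zero-offset term, d(d)/dt0 = -v/2
    u_rep,                    # already in thickness units
    u_res,
]
u_c = math.sqrt(sum(c * c for c in contributions))
print(f"d = {d:.3f} mm, combined standard uncertainty u_c = {u_c:.3f} mm")
```

With these illustrative numbers the velocity term dominates, which is consistent with the abstract's ranking of calibration velocity as a leading contributor.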

  3. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
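    A minimal illustration of the sampling-and-importance idea: a pure-Python Latin hypercube sampler and a binned variance-ratio indicator (variance of the conditional mean relative to total output variance). The toy model is an assumption for demonstration only; the paper's replicated-sampling and independent-validation machinery is not reproduced:

```python
import random
random.seed(0)

def latin_hypercube(n, k):
    """n samples in k dimensions on [0, 1): each dimension is split into n
    equal strata and exactly one point is drawn per stratum, then shuffled."""
    cols = []
    for _ in range(k):
        col = [(i + random.random()) / n for i in range(n)]
        random.shuffle(col)
        cols.append(col)
    return [tuple(row) for row in zip(*cols)]

def model(x1, x2, x3):
    # Hypothetical model in which x1 dominates the output variance.
    return 4.0 * x1 + 1.0 * x2 + 0.1 * x3

n = 2000
samples = latin_hypercube(n, 3)
y = [model(*s) for s in samples]
mu_y = sum(y) / n
var_y = sum((v - mu_y) ** 2 for v in y) / n

def importance(dim, bins=20):
    """Variance-ratio indicator: Var(E[Y | X_dim]) / Var(Y), estimated by
    binning X_dim -- roughly the fraction of output variance input dim explains."""
    groups = [[] for _ in range(bins)]
    for s, v in zip(samples, y):
        groups[min(int(s[dim] * bins), bins - 1)].append(v)
    num = sum(len(g) * (sum(g) / len(g) - mu_y) ** 2 for g in groups if g) / n
    return num / var_y

print([round(importance(d), 2) for d in range(3)])  # x1 dominates
```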

  4. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principles and the practical difficulties of uncertainty propagation methodologies, the author discussed how to propagate input uncertainties. He noted that there are two kinds of input uncertainty: variability, i.e. uncertainty due to heterogeneity, and lack of knowledge, i.e. uncertainty due to ignorance. It is therefore necessary to use two different propagation methods. He demonstrated this in a simple example, which he then generalised, treating the variability uncertainty with probability theory and the lack-of-knowledge uncertainty with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably and illegitimately precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability from lack of knowledge increases as the problem becomes more complex in terms of the number of parameters or time steps, and that it is necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory.
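    The two-method propagation can be sketched by treating variability probabilistically and lack of knowledge as an interval (the support of a fuzzy number, i.e. its alpha = 0 cut); the model and all numbers are illustrative assumptions:

```python
import random
random.seed(0)

# Toy model y = a * b: `a` is subject to variability (a defensible distribution
# exists), while for `b` only bounds are defensible (lack of knowledge).
# Forcing a uniform distribution onto `b` would manufacture precision, so `b`
# is propagated as an interval instead of being sampled.

N = 10000
a_samples = [random.gauss(10.0, 1.0) for _ in range(N)]  # variability: Monte Carlo
b_lo, b_hi = 1.8, 2.4                                    # ignorance: bounds only

# For each random draw of `a`, y is only known to lie inside an interval,
# so any statistic of y comes out interval-valued rather than as one number.
y_lo = sorted(a * b_lo for a in a_samples)
y_hi = sorted(a * b_hi for a in a_samples)

p = int(0.95 * N)
print(f"95th percentile of y lies in [{y_lo[p]:.1f}, {y_hi[p]:.1f}]")
```

The width of the resulting interval is exactly the honest imprint of the lack-of-knowledge input, which a purely probabilistic treatment would hide inside a single precise percentile.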

  5. Triangulating Principal Effectiveness: How Perspectives of Parents, Teachers, and Assistant Principals Identify the Central Importance of Managerial Skills. Working Paper 35

    Science.gov (United States)

    Grissom, Jason A.; Loeb, Susanna

    2009-01-01

    While the importance of effective principals is undisputed, few studies have addressed what specific skills principals need to promote school success. This study draws on unique data combining survey responses from principals, assistant principals, teachers and parents with rich administrative data to identify which principal skills matter most…

  6. Mind Your Product-Market Strategy on Selecting Marketing Inputs: An Uncertainty Approach in Indian Context

    OpenAIRE

    Susmita Ghosh; Bhaskar Bhowmick

    2015-01-01

    Market is an important factor for start-ups to look into during decision-making in product development and related areas. Emerging country markets are more uncertain in terms of information availability and institutional supports. The literature review of market uncertainty reveals the need for identifying factors representing the market uncertainty. This paper identifies factors for market uncertainty using Exploratory Factor Analysis (EFA) and confirmed the number of fa...

  7. Uncertainty of Modal Parameters Estimated by ARMA Models

    DEFF Research Database (Denmark)

    Jensen, Jacob Laigaard; Brincker, Rune; Rytter, Anders

    1990-01-01

    In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However the uncertainty of the parameters...... by simulation study of a lightly damped single degree of freedom system. Identification by ARMA models has been chosen as the system identification method. It is concluded that both the sampling interval and the number of sampled points may play a significant role with respect to the statistical errors. Furthermore......, it is shown that the model errors may also contribute significantly to the uncertainty....

  8. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  9. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...

  10. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    Limperopoulos, G.J.

    1995-01-01

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk, but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two additional explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs
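    The Black-Scholes formula itself is standard; a sketch of how the option to defer an irreversible investment adds value beyond the static NPV, with purely illustrative project numbers (not the report's data):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function (no external libraries)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """European call value. In the real-options reading: S is the present
    value of the project's cash flows, K the investment cost, sigma the
    volatility of project value, T how long the decision can be deferred."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma * sigma) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Illustrative project: static NPV ignores the flexibility to wait.
npv = 100.0 - 95.0
option = black_scholes_call(S=100.0, K=95.0, r=0.05, sigma=0.35, T=2.0)
print(npv, option)  # the deferral option is worth considerably more than the NPV
```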

  11. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models....... This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands...... in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...

  12. Genome-Wide Association Study Identifying Candidate Genes Influencing Important Agronomic Traits of Flax (Linum usitatissimum L.) Using SLAF-seq.

    Science.gov (United States)

    Xie, Dongwei; Dai, Zhigang; Yang, Zemao; Sun, Jian; Zhao, Debao; Yang, Xue; Zhang, Liguo; Tang, Qing; Su, Jianguang

    2017-01-01

    Flax ( Linum usitatissimum L.) is an important cash crop, and its agronomic traits directly affect yield and quality. Molecular studies on flax remain inadequate because relatively few flax genes have been associated with agronomic traits or have been identified as having potential applications. To identify markers and candidate genes that can potentially be used for genetic improvement of crucial agronomic traits, we examined 224 specimens of core flax germplasm; specifically, phenotypic data for key traits, including plant height, technical length, number of branches, number of fruits, and 1000-grain weight were investigated under three environmental conditions before specific-locus amplified fragment sequencing (SLAF-seq) was employed to perform a genome-wide association study (GWAS) for these five agronomic traits. Subsequently, the results were used to screen single nucleotide polymorphism (SNP) loci and candidate genes that exhibited a significant correlation with the important agronomic traits. Our analyses identified a total of 42 SNP loci that showed significant correlations with the five important agronomic flax traits. Next, candidate genes were screened in the 10 kb zone of each of the 42 SNP loci. These SNP loci were then analyzed by a more stringent screening via co-identification using both a general linear model (GLM) and a mixed linear model (MLM) as well as co-occurrences in at least two of the three environments, whereby 15 final candidate genes were obtained. Based on these results, we determined that UGT and PL are candidate genes for plant height, GRAS and XTH are candidate genes for the number of branches, Contig1437 and LU0019C12 are candidate genes for the number of fruits, and PHO1 is a candidate gene for the 1000-seed weight. We propose that the identified SNP loci and corresponding candidate genes might serve as a biological basis for improving crucial agronomic flax traits.
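The GLM step of such a GWAS amounts to a per-SNP regression scan of trait on genotype. The sketch below keeps only the panel size (224 accessions) from the record; the genotypes, trait, causal SNP index, and significance threshold are all simulated assumptions, not the authors' data or pipeline.

```python
import numpy as np

# Simulated panel: 224 accessions, 1000 SNPs coded as minor-allele counts
rng = np.random.default_rng(0)
n_plants, n_snps = 224, 1000
geno = rng.integers(0, 3, size=(n_plants, n_snps)).astype(float)
causal = 42                                             # hypothetical causal SNP
height = 60.0 + 5.0 * geno[:, causal] + rng.normal(0.0, 2.0, n_plants)

def glm_tstats(geno, trait):
    """Per-SNP simple linear regression (a GLM without covariates):
    t-statistic of the slope of trait ~ genotype."""
    n = len(trait)
    x = geno - geno.mean(axis=0)
    y = trait - trait.mean()
    sxx = (x ** 2).sum(axis=0)
    beta = x.T @ y / sxx
    resid = y[:, None] - beta * x
    se = np.sqrt((resid ** 2).sum(axis=0) / (n - 2) / sxx)
    return beta / se

t = glm_tstats(geno, height)
hits = np.flatnonzero(np.abs(t) > 5.0)   # roughly a Bonferroni-level cut here
print(hits)                              # recovers the simulated causal SNP
```

A real analysis would add population-structure covariates (the MLM of the paper) and map each hit to genes within a window, as the authors do with their 10 kb zone.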

  13. Genome-Wide Association Study Identifying Candidate Genes Influencing Important Agronomic Traits of Flax (Linum usitatissimum L.) Using SLAF-seq

    Directory of Open Access Journals (Sweden)

    Dongwei Xie

    2018-01-01

    Full Text Available Flax (Linum usitatissimum L.) is an important cash crop, and its agronomic traits directly affect yield and quality. Molecular studies on flax remain inadequate because relatively few flax genes have been associated with agronomic traits or have been identified as having potential applications. To identify markers and candidate genes that can potentially be used for genetic improvement of crucial agronomic traits, we examined 224 specimens of core flax germplasm; specifically, phenotypic data for key traits, including plant height, technical length, number of branches, number of fruits, and 1000-grain weight were investigated under three environmental conditions before specific-locus amplified fragment sequencing (SLAF-seq) was employed to perform a genome-wide association study (GWAS) for these five agronomic traits. Subsequently, the results were used to screen single nucleotide polymorphism (SNP) loci and candidate genes that exhibited a significant correlation with the important agronomic traits. Our analyses identified a total of 42 SNP loci that showed significant correlations with the five important agronomic flax traits. Next, candidate genes were screened in the 10 kb zone of each of the 42 SNP loci. These SNP loci were then analyzed by a more stringent screening via co-identification using both a general linear model (GLM) and a mixed linear model (MLM) as well as co-occurrences in at least two of the three environments, whereby 15 final candidate genes were obtained. Based on these results, we determined that UGT and PL are candidate genes for plant height, GRAS and XTH are candidate genes for the number of branches, Contig1437 and LU0019C12 are candidate genes for the number of fruits, and PHO1 is a candidate gene for the 1000-seed weight. We propose that the identified SNP loci and corresponding candidate genes might serve as a biological basis for improving crucial agronomic flax traits.

  14. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  15. Addressing uncertainty in adaptation planning for agriculture.

    Science.gov (United States)

    Vermeulen, Sonja J; Challinor, Andrew J; Thornton, Philip K; Campbell, Bruce M; Eriyagama, Nishadi; Vervoort, Joost M; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J; Hawkins, Ed; Smith, Daniel R

    2013-05-21

    We present a framework for prioritizing adaptation approaches across a range of timeframes. The framework is illustrated by four case studies from developing countries, each with an associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differs with the level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.

  16. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model of Bennu (candidate models include Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present an analysis of the error in the photometric corrections. Based on our test data sets, we find: (1) the model uncertainties are correct only when computed with the covariance matrix, because the parameters are highly correlated; (2) no single parameter dominates in any model; (3) model error and data error contribute comparably to the final correction error; (4) tests of the uncertainty module on simulated and real data sets show that model performance depends on data coverage and data quality, giving a better understanding of how the different models behave in different cases; (5) the Lommel-Seeliger model is more reliable than the others, perhaps because the simulated data are based on it, although the test on real data (SPDIF) also shows a slight advantage for Lommel-Seeliger, whereas ROLO is not reliable for calculating Bond albedo, the uncertainty of the McEwen model is large in most cases, and Akimov behaves unphysically on the SOPIE 1 data; (6) Lommel-Seeliger is therefore the preferred default choice, based mainly on our tests on the SOPIE and IPDIF data.
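A minimal sketch of a Lommel-Seeliger-style photometric correction of the kind discussed above. The albedo value, the reference geometry, and the omission of the phase-angle term are simplifying assumptions for illustration; the mission's actual models carry additional phase-dependent factors.

```python
import numpy as np

def lommel_seeliger(inc_deg, emi_deg, albedo=0.04):
    """Lommel-Seeliger disk function r = (w/4) * mu0 / (mu0 + mu);
    the phase-angle term is omitted and the albedo is an assumed value."""
    mu0 = np.cos(np.radians(inc_deg))
    mu = np.cos(np.radians(emi_deg))
    return (albedo / 4.0) * mu0 / (mu0 + mu)

def photometric_correction(r_measured, inc_deg, emi_deg,
                           inc_ref=30.0, emi_ref=0.0):
    """Normalize a measured radiance factor to a reference viewing geometry."""
    return (r_measured * lommel_seeliger(inc_ref, emi_ref)
            / lommel_seeliger(inc_deg, emi_deg))

print(photometric_correction(0.020, 60.0, 20.0))   # corrected to (30 deg, 0 deg)
```

The parameter-correlation point in the abstract matters here: the uncertainty of such a correction must be propagated with the full covariance of the fitted model parameters, not their individual variances.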

  17. The strategic importance of identifying knowledge-based and intangible assets for generating value, competitiveness and innovation in sub-Saharan Africa

    Directory of Open Access Journals (Sweden)

    Nicoline Ondari-Okemwa

    2011-01-01

    Full Text Available This article discusses the strategic importance of identifying intangible assets for creating value and enhancing competitiveness and innovation in science and technology in a knowledge economy, with particular reference to the sub-Saharan Africa region. It has always been difficult to gather the prerequisite information to manage such assets and create value from them. The paper discusses the nature of intangible assets, the characteristics of a knowledge economy and the role of knowledge workers in a knowledge economy. The paper also discusses the importance of identifying intangible assets in relation to capturing the value of such assets, the transfer of intangible assets to other owners and the challenges of managing organizational intangible assets. Objectives of the article include: underscoring the strategic importance of identifying intangible assets in sub-Saharan Africa; examining the performance of intangible assets in a knowledge economy; examining how intangible assets may generate competitiveness, economic growth and innovation; and assessing how knowledge workers are becoming a dominant factor in the knowledge economy. An extensive literature review was employed to collect data for this article. The article concludes that organizations and governments in sub-Saharan Africa should treat knowledge-based assets as strategic resources, even though traditional accounting systems may still have problems in determining the exact book value of such assets. It is recommended that organizations and government departments in sub-Saharan Africa implement a system for reporting the value of intangible organizational assets just like the reporting of the value of tangible assets; and that organizations in sub-Saharan Africa use knowledge to produce "smart products and services" which command premium prices.

  18. Analytic uncertainty and sensitivity analysis of models with input correlations

    Science.gov (United States)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
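The effect of input correlations on output uncertainty can be illustrated with first-order (delta-method) propagation, where the input covariance matrix enters directly. The model y = x1*x2 and all numbers below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def propagate(grad, cov):
    """First-order (delta-method) propagation: Var(y) ~ g^T Sigma g,
    where g is the model gradient at the nominal point and Sigma the
    input covariance matrix (off-diagonals carry the correlations)."""
    g = np.asarray(grad, dtype=float)
    return float(g @ np.asarray(cov, dtype=float) @ g)

# Illustrative model y = x1 * x2 at x = (2, 3), with sigmas (0.1, 0.2)
grad = np.array([3.0, 2.0])           # (dy/dx1, dy/dx2) = (x2, x1)
s1, s2 = 0.1, 0.2
for rho in (0.0, 0.8):
    cov = np.array([[s1 * s1, rho * s1 * s2],
                    [rho * s1 * s2, s2 * s2]])
    print(rho, propagate(grad, cov))  # 0.25 independent, 0.442 correlated
```

Comparing the two runs shows exactly the decision the paper addresses: ignoring a correlation of 0.8 here understates the output variance by more than 40%.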

  19. Estimating the measurement uncertainty in forensic blood alcohol analysis.

    Science.gov (United States)

    Gullberg, Rod G

    2012-04-01

    For many reasons, forensic toxicologists are being asked to determine and report their measurement uncertainty in blood alcohol analysis. While understood conceptually, the elements and computations involved in determining measurement uncertainty are generally foreign to most forensic toxicologists. Several established and well-documented methods are available to determine and report the uncertainty in blood alcohol measurement. A straightforward bottom-up approach is presented that includes: (1) specifying the measurand, (2) identifying the major components of uncertainty, (3) quantifying the components, (4) statistically combining the components and (5) reporting the results. A hypothetical example is presented that employs reasonable estimates for forensic blood alcohol analysis assuming headspace gas chromatography. These computations are easily employed in spreadsheet programs as well. Determining and reporting measurement uncertainty is an important element in establishing fitness-for-purpose. Indeed, the demand for such computations and information from the forensic toxicologist will continue to increase.
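The bottom-up approach described above can be sketched as a quadrature combination of relative standard uncertainties followed by expansion with a coverage factor, in the GUM style. The component values below are hypothetical, not forensic data.

```python
from math import sqrt

def combined_uncertainty(mean_bac, rel_components, k=2.0):
    """GUM-style bottom-up budget: relative standard uncertainties combined
    in quadrature, then expanded with coverage factor k (~95% for k=2)."""
    u_rel = sqrt(sum(u * u for u in rel_components))
    u_c = mean_bac * u_rel          # combined standard uncertainty
    return u_c, k * u_c             # (standard, expanded)

# Hypothetical relative components: calibration, repeatability, sampling bias
uc, U = combined_uncertainty(0.082, [0.010, 0.015, 0.008])
print(f"BAC = 0.082 +/- {U:.4f} g/100 mL (k=2)")
```

As the abstract notes, the same arithmetic is trivially done in a spreadsheet; the hard work is identifying and quantifying the components, not combining them.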

  20. Including model uncertainty in risk-informed decision making

    International Nuclear Information System (INIS)

    Reinert, Joshua M.; Apostolakis, George E.

    2006-01-01

    Model uncertainties can have a significant impact on decisions regarding licensing basis changes. We present a methodology to identify basic events in the risk assessment that have the potential to change the decision and are known to have significant model uncertainties. Because we work with basic event probabilities, this methodology is not appropriate for analyzing uncertainties that cause a structural change to the model, such as success criteria. We use the risk achievement worth (RAW) importance measure with respect to both the core damage frequency (CDF) and the change in core damage frequency (ΔCDF) to identify potentially important basic events. We cross-check these with generically important model uncertainties. Then, sensitivity analysis is performed on the basic event probabilities, which are used as a proxy for the model parameters, to determine how much error in these probabilities would need to be present in order to impact the decision. A previously submitted licensing basis change is used as a case study. Analysis using the SAPHIRE program identifies 20 basic events as important, four of which have model uncertainties that have been identified in the literature as generally important. The decision is fairly insensitive to uncertainties in these basic events. In three of these cases, one would need to show that model uncertainties would lead to basic event probabilities that would be between two and four orders of magnitude larger than modeled in the risk assessment before they would become important to the decision. More detailed analysis would be required to determine whether these higher probabilities are reasonable. Methods to perform this analysis from the literature are reviewed and an example is demonstrated using the case study
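The RAW importance measure used above can be illustrated on a toy model with two minimal cutsets; all basic-event probabilities are hypothetical and the model is far simpler than a real PRA.

```python
def raw(cdf_model, probs, event):
    """Risk Achievement Worth: the CDF evaluated with the basic event
    assumed failed (probability 1), divided by the baseline CDF."""
    base = cdf_model(probs)
    bumped = dict(probs, **{event: 1.0})
    return cdf_model(bumped) / base

# Toy fault-tree model with two minimal cutsets: {a, b} and {c}
def cdf(p):
    return p["a"] * p["b"] + p["c"]

probs = {"a": 1e-2, "b": 1e-3, "c": 1e-5}
for e in probs:
    print(e, raw(cdf, probs, e))   # the single-event cutset 'c' has the largest RAW
```

Events with high RAW are the screening candidates; the methodology in the record then asks whether known model uncertainties in those events' probabilities are large enough to change the decision.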

  1. A systematic framework for effective uncertainty assessment of severe accident calculations; Hybrid qualitative and quantitative methodology

    International Nuclear Information System (INIS)

    Hoseyni, Seyed Mohsen; Pourgol-Mohammad, Mohammad; Tehranifard, Ali Abbaspour; Yousefpour, Faramarz

    2014-01-01

    This paper describes a systematic framework for characterizing important phenomena and quantifying the degree of contribution of each parameter to the output in severe accident uncertainty assessment. The proposed methodology comprises qualitative as well as quantitative phases. The qualitative part, the so-called Modified PIRT, a more robust PIRT process for more precise quantification of uncertainties, is a two-step process of identification and ranking based on uncertainty importance in severe accident phenomena. In this process, identified severe accident phenomena are ranked according to their effect on the figure of merit and their level of knowledge. The Analytic Hierarchy Process (AHP) serves here as a systematic approach to severe accident phenomena ranking. A formal uncertainty importance technique is used to estimate the degree of credibility of the severe accident model(s) used to represent the important phenomena. For this step the methodology uses subjective justification based on evaluating available information and data from experiments and code predictions. The quantitative part utilizes uncertainty importance measures to quantify the effect of each input parameter on the output uncertainty. A response surface fitting approach is proposed for estimating the associated uncertainties at lower computational cost. The quantitative results are used to plan the reduction of epistemic uncertainty in the output variable(s). The application of the proposed methodology is demonstrated for the ACRR MP-2 severe accident test facility. - Highlights: • A two-stage framework for severe accident uncertainty analysis is proposed. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • Uncertainty importance measures quantitatively calculate the effect of each uncertainty source. • The methodology is applied successfully to the ACRR MP-2 severe accident test facility
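The AHP ranking step can be sketched as the principal-eigenvector computation on a pairwise-comparison matrix (Saaty's method). The matrix below is a hypothetical three-phenomena example, not the study's actual expert judgements.

```python
import numpy as np

def ahp_priorities(pairwise):
    """AHP ranking weights: the normalized principal eigenvector of the
    reciprocal pairwise-comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    principal = vecs[:, np.argmax(vals.real)].real
    w = np.abs(principal)
    return w / w.sum()

# Hypothetical comparison of three phenomena on Saaty's 1-9 scale:
# phenomenon 0 is moderately more important than 1, strongly more than 2.
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
weights = ahp_priorities(A)
print(weights)   # ranking: phenomenon 0 > 1 > 2
```

In a full Modified PIRT application one would also check the consistency ratio of the comparison matrix before trusting the resulting ranking.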

  2. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
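The Latin Hypercube Sampling plus rank-transform workflow described above can be sketched as follows. The three-input toy model and the choice of rank correlation (rather than full stepwise regression) are illustrative simplifications.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Latin Hypercube Sampling on [0,1)^d: each of the n equal-probability
    strata in every dimension receives exactly one point; strata are
    paired across dimensions by independent random permutations."""
    strata = np.tile(np.arange(n), (d, 1))        # (d, n) stratum indices
    strata = rng.permuted(strata, axis=1).T       # shuffle each dimension
    return (strata + rng.random((n, d))) / n      # jitter inside each stratum

rng = np.random.default_rng(1)
n = 200
x = latin_hypercube(n, 3, rng)
y = 5.0 * x[:, 0] + 0.5 * x[:, 1] + rng.normal(0.0, 0.1, n)   # x[:, 2] is inert

def ranks(v):
    return np.argsort(np.argsort(v))

# Rank transformation before correlation, as in the report's workflow
rcc = [float(np.corrcoef(ranks(x[:, j]), ranks(y))[0, 1]) for j in range(3)]
print(rcc)   # strong for x0, weak for x1, near zero for x2
```

The rank step is what lets LHS results yield per-parameter uncertainty information, which is the supplementary machinery the abstract says plain LHS output estimation lacks.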

  3. Uncertainties as Barriers for Knowledge Sharing with Enterprise Social Media

    DEFF Research Database (Denmark)

    Trier, Matthias; Fung, Magdalene; Hansen, Abigail

    2017-01-01

    become a barrier for the participants’ adoption. There is only limited existing research studying the types of uncertainties that employees perceive and their impact on knowledge transfer via social media. To address this gap, this article presents a qualitative interview-based study of the adoption...... of the Enterprise Social Media tool Yammer for knowledge sharing in a large global organization. We identify and categorize nine uncertainties that were perceived as barriers by the respondents. The study revealed that the uncertainty types play an important role in affecting employees’ participation...

  4. Peer review of HEDR uncertainty and sensitivity analyses plan

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, F.O.

    1993-06-01

    This report consists of a detailed documentation of the writings and deliberations of the peer review panel that met on May 24-25, 1993 in Richland, Washington to evaluate your draft report "Uncertainty/Sensitivity Analysis Plan" (PNWD-2124 HEDR). The fact that uncertainties are being considered in temporally and spatially varying parameters through the use of alternative time histories and spatial patterns deserves special commendation. It is important to identify early those model components and parameters that will have the most influence on the magnitude and uncertainty of the dose estimates. These are the items that should be investigated most intensively prior to committing to a final set of results.

  5. Neural Correlates of Intolerance of Uncertainty in Clinical Disorders.

    Science.gov (United States)

    Wever, Mirjam; Smeets, Paul; Sternheim, Lot

    2015-01-01

    Intolerance of uncertainty is a key contributor to anxiety-related disorders. Recent studies highlight its importance in other clinical disorders. The link between its clinical presentation and the underlying neural correlates remains unclear. This review summarizes the emerging literature on the neural correlates of intolerance of uncertainty. In conclusion, studies focusing on the neural correlates of this construct are sparse, and findings are inconsistent across disorders. Future research should identify neural correlates of intolerance of uncertainty in more detail. This may unravel the neurobiology of a wide variety of clinical disorders and pave the way for novel therapeutic targets.

  6. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  7. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes...... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main...... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  8. Identifying obstacles and ranking common biological control research priorities for Europe to manage most economically important pests in arable, vegetable and perennial crops.

    Science.gov (United States)

    Lamichhane, Jay Ram; Bischoff-Schaefer, Monika; Bluemel, Sylvia; Dachbrodt-Saaydeh, Silke; Dreux, Laure; Jansen, Jean-Pierre; Kiss, Jozsef; Köhl, Jürgen; Kudsk, Per; Malausa, Thibaut; Messéan, Antoine; Nicot, Philippe C; Ricci, Pierre; Thibierge, Jérôme; Villeneuve, François

    2017-01-01

    EU agriculture is currently in transition from conventional crop protection to integrated pest management (IPM). Because biocontrol is a key component of IPM, many European countries recently have intensified their national efforts on biocontrol research and innovation (R&I), although such initiatives are often fragmented. The operational outputs of national efforts would benefit from closer collaboration among stakeholders via transnationally coordinated approaches, as most economically important pests are similar across Europe. This paper proposes a common European framework on biocontrol R&I. It identifies generic R&I bottlenecks and needs as well as priorities for three crop types (arable, vegetable and perennial crops). The existing gap between the market offers of biocontrol solutions and the demand of growers, the lengthy and expensive registration process for biocontrol solutions and their varying effectiveness due to variable climatic conditions and site-specific factors across Europe are key obstacles hindering the development and adoption of biocontrol solutions in Europe. Considering arable, vegetable and perennial crops, a dozen common target pests are identified for each type of crop and ranked by order of importance at European level. Such a ranked list indicates numerous topics on which future joint transnational efforts would be justified. © 2016 Society of Chemical Industry.

  9. Uncertainty budget in internal monostandard NAA for small and large size samples analysis

    International Nuclear Information System (INIS)

    Dasari, K.B.; Acharya, R.

    2014-01-01

    Evaluation of the total uncertainty budget on a determined concentration value is important under a quality assurance programme. Concentration calculation in NAA is carried out by relative NAA or by the k0-based internal monostandard NAA (IM-NAA) method. The IM-NAA method has been used for small and large sample analysis of clay potteries. An attempt was made to identify the uncertainty components in IM-NAA, and the uncertainty budget for La in both small and large size samples has been evaluated and compared. (author)

  10. Uncertainty and sensitivity studies supporting the interpretation of the results of TVO I/II PRA

    International Nuclear Information System (INIS)

    Holmberg, J.

    1992-01-01

    A comprehensive Level 1 probabilistic risk assessment (PRA) has been performed for the TVO I/II nuclear power units. As a part of the PRA project, uncertainties of risk models and methods were systematically studied in order to describe them and to demonstrate their impact by way of results. The uncertainty study was divided into two phases: a qualitative and a quantitative study. The qualitative study contained identification of uncertainties and qualitative assessments of their importance. The PRA was introduced, and identified assumptions and uncertainties behind the models were documented. The most significant uncertainties were selected by importance measures or other judgements for further quantitative studies. The quantitative study included sensitivity studies and propagation of uncertainty ranges. In the sensitivity studies uncertain assumptions or parameters were varied in order to illustrate the sensitivity of the models. The propagation of the uncertainty ranges demonstrated the impact of the statistical uncertainties of the parameter values. The Monte Carlo method was used as a propagation method. The most significant uncertainties were those involved in modelling human interactions, dependences and common cause failures (CCFs), loss of coolant accident (LOCA) frequencies and pressure suppression. The qualitative mapping out of the uncertainty factors turned out to be useful in planning quantitative studies. It also served as internal review of the assumptions made in the PRA. The sensitivity studies were perhaps the most advantageous part of the quantitative study because they allowed individual analyses of the significance of uncertainty sources identified. The uncertainty study was found reasonable in systematically and critically assessing uncertainties in a risk analysis. The usefulness of this study depends on the decision maker (power company) since uncertainty studies are primarily carried out to support decision making when uncertainties are
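The Monte Carlo propagation of parameter uncertainty described in the record can be sketched as follows. The basic events, their medians, and the lognormal error factors are hypothetical, and the two-cutset model is a toy stand-in for a real PRA.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical basic-event probabilities: (median, error factor EF = p95/p50)
events = {"ccf_pumps": (1e-4, 5.0), "loca": (3e-4, 10.0), "operator_error": (1e-2, 3.0)}

def draw(median, ef, size):
    """Lognormal parameter uncertainty; sigma = ln(EF)/1.645 so that the
    95th percentile of each distribution equals median * EF."""
    return rng.lognormal(np.log(median), np.log(ef) / 1.645, size)

samples = {name: draw(m, ef, n) for name, (m, ef) in events.items()}

# Toy two-cutset model: CDF = P(LOCA) * P(operator error) + P(CCF of pumps)
cdf = samples["loca"] * samples["operator_error"] + samples["ccf_pumps"]
p5, p50, p95 = np.percentile(cdf, [5, 50, 95])
print(p5, p50, p95)   # propagated uncertainty band on the core damage frequency
```

The percentile band is the kind of output the TVO study used to show decision makers how statistical parameter uncertainty flows through to the overall risk estimate.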

  11. Linking the Salt Transcriptome with Physiological Responses of a Salt-Resistant Populus Species as a Strategy to Identify Genes Important for Stress Acclimation

    Science.gov (United States)

    Brinker, Monika; Brosché, Mikael; Vinocur, Basia; Abo-Ogiala, Atef; Fayyaz, Payam; Janz, Dennis; Ottow, Eric A.; Cullmann, Andreas D.; Saborowski, Joachim; Kangasjärvi, Jaakko; Altman, Arie; Polle, Andrea

    2010-01-01

    To investigate early salt acclimation mechanisms in a salt-tolerant poplar species (Populus euphratica), the kinetics of molecular, metabolic, and physiological changes during a 24-h salt exposure were measured. Three distinct phases of salt stress were identified by analyses of the osmotic pressure and the shoot water potential: dehydration, salt accumulation, and osmotic restoration associated with ionic stress. The duration and intensity of these phases differed between leaves and roots. Transcriptome analysis using P. euphratica-specific microarrays revealed clusters of coexpressed genes in these phases, with only 3% overlapping salt-responsive genes in leaves and roots. Acclimation of cellular metabolism to high salt concentrations involved remodeling of amino acid and protein biosynthesis and increased expression of molecular chaperones (dehydrins, osmotin). Leaves suffered initially from dehydration, which resulted in changes in transcript levels of mitochondrial and photosynthetic genes, indicating adjustment of energy metabolism. Initially, decreases in stress-related genes were found, whereas increases occurred only when leaves had restored the osmotic balance by salt accumulation. Comparative in silico analysis of the poplar stress regulon with Arabidopsis (Arabidopsis thaliana) orthologs was used as a strategy to reduce the number of candidate genes for functional analysis. Analysis of Arabidopsis knockout lines identified a lipocalin-like gene (AtTIL) and a gene encoding a protein with previously unknown functions (AtSIS) to play roles in salt tolerance. In conclusion, by dissecting the stress transcriptome of tolerant species, novel genes important for salt endurance can be identified. PMID:20959419

  12. Linking the salt transcriptome with physiological responses of a salt-resistant Populus species as a strategy to identify genes important for stress acclimation.

    Science.gov (United States)

    Brinker, Monika; Brosché, Mikael; Vinocur, Basia; Abo-Ogiala, Atef; Fayyaz, Payam; Janz, Dennis; Ottow, Eric A; Cullmann, Andreas D; Saborowski, Joachim; Kangasjärvi, Jaakko; Altman, Arie; Polle, Andrea

    2010-12-01

    To investigate early salt acclimation mechanisms in a salt-tolerant poplar species (Populus euphratica), the kinetics of molecular, metabolic, and physiological changes during a 24-h salt exposure were measured. Three distinct phases of salt stress were identified by analyses of the osmotic pressure and the shoot water potential: dehydration, salt accumulation, and osmotic restoration associated with ionic stress. The duration and intensity of these phases differed between leaves and roots. Transcriptome analysis using P. euphratica-specific microarrays revealed clusters of coexpressed genes in these phases, with only 3% overlapping salt-responsive genes in leaves and roots. Acclimation of cellular metabolism to high salt concentrations involved remodeling of amino acid and protein biosynthesis and increased expression of molecular chaperones (dehydrins, osmotin). Leaves suffered initially from dehydration, which resulted in changes in transcript levels of mitochondrial and photosynthetic genes, indicating adjustment of energy metabolism. Initially, decreases in stress-related genes were found, whereas increases occurred only when leaves had restored the osmotic balance by salt accumulation. Comparative in silico analysis of the poplar stress regulon with Arabidopsis (Arabidopsis thaliana) orthologs was used as a strategy to reduce the number of candidate genes for functional analysis. Analysis of Arabidopsis knockout lines identified a lipocalin-like gene (AtTIL) and a gene encoding a protein with previously unknown functions (AtSIS) to play roles in salt tolerance. In conclusion, by dissecting the stress transcriptome of tolerant species, novel genes important for salt endurance can be identified.

  13. Climate policy uncertainty and investment risk

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-06-21

    Our climate is changing. This is certain. Less certain, however, are the timing and magnitude of climate change and the cost of the transition to a low-carbon world. Therefore, many policies and programmes are still at a formative stage, and policy uncertainty is very high. This book identifies how climate change policy uncertainty may affect investment behaviour in the power sector. For power companies, whose capital stock is intensive and long-lived, these risks rank among the biggest and can create an incentive to delay investment. Our analysis shows that the risk premium of climate change uncertainty can add 40% to the construction costs of a plant for power investors, and a 10% price surcharge for electricity end-users. This publication describes what can be done in policy design to reduce these costs. Incorporating the results of quantitative analysis, it also shows the sensitivity of different power sector investment decisions to different risks. It compares the effects of climate policy uncertainty with energy market uncertainty, showing the relative importance of these sources of risk for different technologies in different market types. Drawing on extensive consultation with power companies and financial investors, it also assesses the implications for policy makers, allowing the key messages to be transferred into policy designs. This book is a useful tool for governments seeking to improve climate policy mechanisms and create more certainty for power investors.

  14. Uncertainty analysis of energy consumption in dwellings

    Energy Technology Data Exchange (ETDEWEB)

    Pettersen, Trine Dyrstad

    1997-12-31

    This thesis presents a comprehensive study of an energy estimation model that can be used to examine the uncertainty of predicted energy consumption in a dwelling. The variation and uncertainty of input parameters due to the outdoor climate, the building construction, and the inhabitants are studied as a basis for further energy evaluations. Variations in energy consumption among nominally similar dwellings are also investigated to verify the simulated energy consumption. The main topics are (1) a study of expected variations and uncertainties both in the input parameters used in energy consumption calculations and in the energy consumption of the dwelling, (2) the development and evaluation of a simplified energy calculation model that accounts for uncertainties in the input parameters, (3) an evaluation of the influence of the uncertain parameters on the total variation so that the most important parameters can be identified, and (4) the recommendation of a simplified procedure for treating uncertainties or possible deviations from average conditions. 90 refs., 182 figs., 73 tabs.

  15. Experimental assessment of the importance of amino acid positions identified by an entropy-based correlation analysis of multiple-sequence alignments.

    Science.gov (United States)

    Dietrich, Susanne; Borst, Nadine; Schlee, Sandra; Schneider, Daniel; Janda, Jan-Oliver; Sterner, Reinhard; Merkl, Rainer

    2012-07-17

    The analysis of a multiple-sequence alignment (MSA) with correlation methods identifies pairs of residue positions whose occupation with amino acids changes in a concerted manner. It is plausible to assume that positions that are part of many such correlation pairs are important for protein function or stability. We have used the algorithm H2r to identify positions k in the MSAs of the enzymes anthranilate phosphoribosyl transferase (AnPRT) and indole-3-glycerol phosphate synthase (IGPS) that show a high conn(k) value, i.e., a large number of significant correlations in which k is involved. The importance of the identified residues was experimentally validated by performing mutagenesis studies with sAnPRT and sIGPS from the archaeon Sulfolobus solfataricus. For sAnPRT, five H2r mutant proteins were generated by replacing nonconserved residues with alanine or the prevalent residue of the MSA. As a control, five residues with conn(k) values of zero were chosen randomly and replaced with alanine. The catalytic activities and conformational stabilities of the H2r and control mutant proteins were analyzed by steady-state enzyme kinetics and thermal unfolding studies. Compared to wild-type sAnPRT, the catalytic efficiencies (k(cat)/K(M)) were largely unaltered. In contrast, the apparent thermal unfolding temperature (T(M)(app)) was lowered in most proteins. Remarkably, the strongest observed destabilization (ΔT(M)(app) = 14 °C) was caused by the V284A exchange, which pertains to the position with the highest correlation signal [conn(k) = 11]. For sIGPS, six H2r mutant and four control proteins with alanine exchanges were generated and characterized. The k(cat)/K(M) values of four H2r mutant proteins were reduced between 13- and 120-fold, and their T(M)(app) values were decreased by up to 5 °C. For the sIGPS control proteins, the observed activity and stability decreases were much less severe. Our findings demonstrate that positions with high conn(k) values have an
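    The conn(k) statistic above counts, for each alignment column k, the number of significant pairwise correlations in which that column participates. A minimal sketch of that counting step, using mutual information between columns of an invented toy alignment (H2r's actual correlation measure and significance test differ; the alignment and the 0.5-bit threshold here are assumptions for illustration):

```python
from collections import Counter
from itertools import combinations
from math import log2

# Toy multiple-sequence alignment, one string per sequence (invented data).
msa = [
    "AKLVG",
    "AKLVG",
    "GRLIG",
    "GRLIG",
    "AKIVG",
    "GRIIG",
]

def column(msa, i):
    return [seq[i] for seq in msa]

def mutual_information(xs, ys):
    """MI in bits between two alignment columns."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum(c / n * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

ncols = len(msa[0])
threshold = 0.5  # significance cutoff in bits (assumed, for illustration)
conn = [0] * ncols
for i, j in combinations(range(ncols), 2):
    if mutual_information(column(msa, i), column(msa, j)) > threshold:
        conn[i] += 1  # column i participates in one more significant pair
        conn[j] += 1
```

On this toy alignment, columns 0, 1, and 3 co-vary perfectly and each ends up with conn = 2, while the uncorrelated column 2 and the invariant column 4 score 0; positions with high conn(k) are the candidates for mutagenesis.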

  16. Advanced LOCA code uncertainty assessment

    International Nuclear Information System (INIS)

    Wickett, A.J.; Neill, A.P.

    1990-11-01

    This report describes a pilot study that identified, quantified and combined uncertainties for the LOBI BL-02 3% small break test. A "dials" version of TRAC-PF1/MOD1, called TRAC-F, was used. (author)

  17. High throughput phenotypic selection of Mycobacterium tuberculosis mutants with impaired resistance to reactive oxygen species identifies genes important for intracellular growth.

    Directory of Open Access Journals (Sweden)

    Olga Mestre

    Mycobacterium tuberculosis has the remarkable capacity to survive within the hostile environment of the macrophage, and to resist potent antibacterial molecules such as reactive oxygen species (ROS). Thus, understanding mycobacterial resistance mechanisms against ROS may contribute to the development of new anti-tuberculosis therapies. Here we identified genes involved in such mechanisms by screening a high-density transposon mutant library, and we show that several of them are involved in the intracellular lifestyle of the pathogen. Many of these genes were found to play a part in cell envelope functions, further strengthening the important role of the mycobacterial cell envelope in protection against aggressions such as the ones caused by ROS inside host cells.

  18. Modified Phenomena Identification and Ranking Table (PIRT) for Uncertainty Analysis

    International Nuclear Information System (INIS)

    Gol-Mohamad, Mohammad P.; Modarres, Mohammad; Mosleh, Ali

    2006-01-01

    This paper describes a methodology for characterizing important phenomena, which is part of a broader research effort by the authors called 'Modified PIRT'. The methodology provides a robust process of phenomena identification and ranking for more precise quantification of uncertainty. It is a two-step identification and ranking methodology based on thermal-hydraulic (TH) importance as well as uncertainty importance. The Analytic Hierarchy Process (AHP) is used as a formal approach for TH identification and ranking. A formal uncertainty importance technique is used to estimate the degree of credibility of the TH model(s) used to represent the important phenomena. This part relies on subjective justification, evaluating available information and data from experiments and code predictions. The proposed methodology was demonstrated by developing a PIRT for a large break loss of coolant accident (LBLOCA) for the LOFT integral facility with the highest core power (test LB-1). (authors)
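    The AHP ranking step described above can be sketched as extracting the principal eigenvector of a pairwise-comparison matrix. The 3x3 matrix below (entries on Saaty's 1-9 judgment scale) and the phenomena it compares are invented for illustration; power iteration is one standard way to obtain the weight vector:

```python
# Pairwise-comparison matrix: A[i][j] says how much more important
# phenomenon i is judged than phenomenon j (invented judgments).
A = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]

def ahp_weights(A, iters=50):
    """Principal eigenvector of A via power iteration, normalized to sum 1."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

weights = ahp_weights(A)
# Phenomenon 0 dominates the pairwise judgments, so it receives the
# largest weight; the ranking of weights is the PIRT importance ranking.
```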

  19. Uncertainty, probability and information-gaps

    International Nuclear Information System (INIS)

    Ben-Haim, Yakov

    2004-01-01

    This paper discusses two main ideas. First, we focus on info-gap uncertainty, as distinct from probability. Info-gap theory is especially suited for modelling and managing uncertainty in system models: we invest all our knowledge in formulating the best possible model; this leaves the modeller with very faulty and fragmentary information about the variation of reality around that optimal model. Second, we examine the interdependence between uncertainty modelling and decision-making. Good uncertainty modelling requires contact with the end use, namely the decision-making application of the uncertainty model. The most important avenue of uncertainty propagation is from initial data and model uncertainties into uncertainty in the decision domain. Two questions arise. Is the decision robust to the initial uncertainties? Is the decision prone to opportune windfall success? We apply info-gap robustness and opportunity functions to the analysis of representation and propagation of uncertainty in several of the Sandia Challenge Problems.
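    The info-gap robustness function can be illustrated with a deliberately tiny example: a scalar reward model R(q, u) = q*u, a fractional-error info-gap model around the nominal u, and a critical reward r_crit that must be guaranteed. The model form and all numbers are invented for illustration, not taken from the paper:

```python
def robustness(q, u_nom=1.0, r_crit=0.5):
    """Largest uncertainty horizon h for which decision q still guarantees
    reward >= r_crit, with info-gap set U(h) = {u : |u - u_nom| <= h * u_nom}.
    For q > 0 the worst case at horizon h is u = u_nom * (1 - h), so we
    solve q * u_nom * (1 - h) = r_crit for h."""
    h = 1.0 - r_crit / (q * u_nom)
    # A demand that cannot be met even at the nominal model has zero robustness.
    return max(h, 0.0)

h_modest = robustness(1.0, r_crit=0.5)   # modest demand: large robustness
h_greedy = robustness(1.0, r_crit=0.9)   # ambitious demand: small robustness
```

Raising r_crit (demanding more performance) shrinks the robustness, which is the basic robustness-performance trade-off of info-gap theory.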

  20. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    Science.gov (United States)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of the transition from laminar to turbulent flow and in the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading, which causes them to deform. Uncertainty associated with the deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model, which accounts for dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as to the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the

  1. Decision Making Under Uncertainty

    Science.gov (United States)

    2010-11-01

    A sound approach to rational decision making requires a decision maker to establish decision objectives, identify alternatives, and evaluate those...often violate the axioms of rationality when making decisions under uncertainty. The systematic description of such observations may lead to the...which leads to “anchoring” on the initial value. The fact that individuals have been shown to deviate from rationality when making decisions

  2. An introductory guide to uncertainty analysis in environmental and health risk assessment. Environmental Restoration Program

    International Nuclear Information System (INIS)

    Hammonds, J.S.; Hoffman, F.O.; Bartell, S.M.

    1994-12-01

    This report presents guidelines for evaluating uncertainty in mathematical equations and computer models applied to assess human health and environmental risk. Uncertainty analyses involve the propagation of uncertainty in model parameters and model structure to obtain confidence statements for the estimate of risk and to identify the model components of dominant importance. Uncertainty analyses are required when there is no a priori knowledge about uncertainty in the risk estimate and when there is a chance that the failure to assess uncertainty may affect the selection of wrong options for risk reduction. Uncertainty analyses are effective when they are conducted in an iterative mode. When the uncertainty in the risk estimate is intolerable for decision-making, additional data are acquired for the dominant model components that contribute most to uncertainty. This process is repeated until the level of residual uncertainty can be tolerated. Analytical and numerical methods for error propagation are presented along with methods for identifying the most important contributors to uncertainty. Monte Carlo simulation with either Simple Random Sampling (SRS) or Latin Hypercube Sampling (LHS) is proposed as the most robust method for propagating uncertainty through either simple or complex models. A distinction is made between simulating a stochastically varying assessment endpoint (i.e., the distribution of individual risks in an exposed population) and quantifying uncertainty due to lack of knowledge about a fixed but unknown quantity (e.g., a specific individual, the maximally exposed individual, or the mean, median, or 95%-tile of the distribution of exposed individuals). Emphasis is placed on the need for subjective judgement to quantify uncertainty when relevant data are absent or incomplete.
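    The Monte Carlo propagation with Latin Hypercube Sampling described above can be sketched as follows. The two-parameter risk model and its parameter ranges are invented for illustration; only the stratified-sampling mechanics reflect the method:

```python
import random

random.seed(1)

def latin_hypercube(n, n_params):
    """One stratified uniform sample per equal-probability stratum, per parameter."""
    columns = []
    for _ in range(n_params):
        col = [(k + random.random()) / n for k in range(n)]  # one point per stratum
        random.shuffle(col)                                  # decouple parameters
        columns.append(col)
    return list(zip(*columns))  # n rows of n_params uniforms on (0, 1)

# Hypothetical risk model: risk = intake * dose_coefficient,
# with invented uniform parameter ranges.
def risk(u_intake, u_dcf):
    intake = 1.0 + 4.0 * u_intake   # uniform on [1, 5]
    dcf = 1e-3 + 9e-3 * u_dcf       # uniform on [1e-3, 1e-2]
    return intake * dcf

results = sorted(risk(u1, u2) for u1, u2 in latin_hypercube(1000, 2))
median = results[len(results) // 2]
p95 = results[int(0.95 * len(results))]
# The spread between the median and the 95th percentile summarizes the
# propagated uncertainty in the risk estimate.
```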

  3. Uncertainty and sensitivity analysis in nuclear accident consequence assessment

    International Nuclear Information System (INIS)

    Karlberg, Olof.

    1989-01-01

    This report contains the results of a four-year project under research contracts with the Nordic Cooperation in Nuclear Safety and the National Institute for Radiation Protection. An uncertainty/sensitivity analysis methodology consisting of Latin Hypercube sampling and regression analysis was applied to an accident consequence model. A number of input parameters were selected, and the uncertainties related to these parameters were estimated within a Nordic group of experts. Individual doses, collective dose, health effects and their related uncertainties were then calculated for three release scenarios and for a representative sample of meteorological situations. Two of the scenarios simulated the acute phase after an accident, and one the long-term consequences. The most significant parameters were identified. The outer limits of the calculated uncertainty distributions are large and grow to several orders of magnitude for the low-probability consequences. The uncertainty in the expectation values is typically a factor of 2-5 (1 sigma). The variation in the model responses due to the variation of the weather parameters is roughly equal to the variation induced by parameter uncertainty. The most important parameters turned out to be different for each pathway of exposure, as could be expected. However, the overall most important parameters are the wet deposition coefficient and the shielding factors. A general discussion of the usefulness of uncertainty analysis in consequence analysis is also given. (au)
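    The Latin Hypercube sampling plus regression step can be sketched as ranking inputs by the strength of their association with the model output. The toy consequence model below merely echoes the study's conclusion that the wet deposition coefficient dominates; the model form and distributions are invented, and the absolute Pearson correlation stands in for the report's regression analysis:

```python
import random

random.seed(2)

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented toy consequence model: dose depends strongly on the wet
# deposition coefficient and weakly on the shielding factor.
n = 500
wet_dep = [random.lognormvariate(0.0, 0.8) for _ in range(n)]
shield = [random.uniform(0.2, 0.4) for _ in range(n)]
dose = [w * (1.0 - s) for w, s in zip(wet_dep, shield)]

ranking = sorted(
    [("wet_deposition", abs(pearson(wet_dep, dose))),
     ("shielding", abs(pearson(shield, dose)))],
    key=lambda t: t[1], reverse=True,
)
# ranking[0] identifies the parameter contributing most to the output variation.
```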

  4. Systemic analysis of different colorectal cancer cell lines and TCGA datasets identified IGF-1R/EGFR-PPAR-CASPASE axis as important indicator for radiotherapy sensitivity.

    Science.gov (United States)

    Chen, Lin; Zhu, Zhe; Gao, Wei; Jiang, Qixin; Yu, Jiangming; Fu, Chuangang

    2017-09-05

    Insulin-like growth factor 1 receptor (IGF-1R) has been shown to contribute to the development of many types of cancer, but little is known about its role in the radio-resistance of colorectal cancer (CRC). Here, we demonstrated that low IGF-1R expression was associated with better radiotherapy sensitivity of CRC. In addition, through quantitative real-time PCR (qRT-PCR), elevated expression of epidermal growth factor receptor (EGFR) was observed in CRC cell lines with high radio-sensitivity (HT29, RKO) compared with those with low sensitivity (SW480, LOVO). The irradiation-induced apoptosis rates of wild-type and EGFR agonist (EGF) or IGF-1R inhibitor (NVP-ADW742) treated HT29 and SW480 cells were quantified by flow cytometry. The apoptosis rate of EGF- and NVP-ADW742-treated HT29 cells was significantly higher than that of the wild-type cells, indicating that high EGFR and low IGF-1R expression in CRC is associated with high sensitivity to radiotherapy. We next conducted a systemic bioinformatics analysis of genome-wide expression profiles of CRC samples from the Cancer Genome Atlas (TCGA). Differential expression analysis between IGF-1R/EGFR-abnormal CRC samples, i.e., samples with higher IGF-1R and lower EGFR expression relative to the median expression values, and the remaining CRC samples identified potential genes contributing to radiotherapy sensitivity. Functional enrichment analysis of these differentially expressed genes (DEGs) in the Database for Annotation, Visualization and Integrated Discovery (DAVID) indicated the PPAR signaling pathway as an important pathway for the radio-resistance of CRC. Our study identified potential biomarkers for the rational selection of radiotherapy for CRC patients. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Whole-exome sequencing of muscle-invasive bladder cancer identifies recurrent mutations of UNC5C and prognostic importance of DNA repair gene mutations on survival.

    Science.gov (United States)

    Yap, Kai Lee; Kiyotani, Kazuma; Tamura, Kenji; Antic, Tatjana; Jang, Miran; Montoya, Magdeline; Campanile, Alexa; Yew, Poh Yin; Ganshert, Cory; Fujioka, Tomoaki; Steinberg, Gary D; O'Donnell, Peter H; Nakamura, Yusuke

    2014-12-15

    Because of suboptimal outcomes in muscle-invasive bladder cancer even with multimodality therapy, determination of potential genetic drivers offers the possibility of improving therapeutic approaches and discovering novel prognostic indicators. Using pTN staging, we case-matched 81 patients with resected ≥pT2 bladder cancers for whom perioperative chemotherapy use and disease recurrence status were known. Whole-exome sequencing was conducted in 43 cases to identify recurrent somatic mutations and targeted sequencing of 10 genes selected from the initial screening in an additional 38 cases was completed. Mutational profiles along with clinicopathologic information were correlated with recurrence-free survival (RFS) in the patients. We identified recurrent novel somatic mutations in the gene UNC5C (9.9%), in addition to TP53 (40.7%), KDM6A (21.0%), and TSC1 (12.3%). Patients who were carriers of somatic mutations in DNA repair genes (one or more of ATM, ERCC2, FANCD2, PALB2, BRCA1, or BRCA2) had a higher overall number of somatic mutations (P = 0.011). Importantly, after a median follow-up of 40.4 months, carriers of somatic mutations (n = 25) in any of these six DNA repair genes had significantly enhanced RFS compared with noncarriers [median, 32.4 vs. 14.8 months; hazard ratio of 0.46, 95% confidence interval (CI), 0.22-0.98; P = 0.0435], after adjustment for pathologic pTN staging and independent of adjuvant chemotherapy usage. Better prognostic outcomes of individuals carrying somatic mutations in DNA repair genes suggest these mutations as favorable prognostic events in muscle-invasive bladder cancer. Additional mechanistic investigation into the previously undiscovered role of UNC5C in bladder cancer is warranted. ©2014 American Association for Cancer Research.

  6. A review of the uncertainties in the assessment of radiological consequences of spent nuclear fuel disposal

    International Nuclear Information System (INIS)

    Wiborgh, M.; Elert, M.; Hoeglund, L.O.; Jones, C.; Grundfelt, B.; Skagius, K.; Bengtsson, A.

    1992-06-01

    Radioactive waste disposal systems for spent nuclear fuel are designed to isolate the radioactive waste from the human environment for long periods of time. The isolation is provided by a combination of engineered and natural barriers. Safety assessments are performed to describe and quantify the performance of the individual barriers and the disposal system over long time periods. These assessments will always be associated with uncertainties. Uncertainties can originate from the variability of natural systems and are also introduced in the predictive modelling performed to quantitatively evaluate the behaviour of the disposal system, as a consequence of incomplete knowledge about the governing processes. Uncertainties in safety assessments can partly be reduced by additional measurements and research. The aim of this study has been to identify uncertainties in assessments of radiological consequences from the disposal of spent nuclear fuel based on the Swedish KBS-3 concept. The identified uncertainties have been classified with respect to their origin, i.e. into conceptual, modelling and data uncertainties. The possibilities to reduce the uncertainties are also commented upon. In assessments it is important to reduce those uncertainties which are of major importance for the performance of the disposal system. These can to some extent be identified by uncertainty analysis. However, conceptual uncertainties and some types of model uncertainty are difficult to evaluate. To be able to reduce uncertainties in conceptual models, it is essential that the processes describing and influencing the radionuclide transport in the engineered and natural barriers are sufficiently understood. In this study a qualitative approach has been used. The importance of different barriers and processes is indicated by their influence on the release of some representative radionuclides. (122 refs.) (au)

  7. Uncertainty in artificial intelligence

    CERN Document Server

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

    This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty. A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language understanding.

  8. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and the uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  9. Integrated physiological, biochemical and molecular analysis identifies important traits and mechanisms associated with differential response of rice genotypes to elevated temperature

    Directory of Open Access Journals (Sweden)

    Boghireddy eSailaja

    2015-11-01

    In a changing climate, heat stress caused by high temperature poses a serious threat to rice cultivation. A multi-level analysis at the physiological, biochemical, and molecular scales is required to fully understand the impact of elevated temperature in rice. This study aimed at deciphering the elevated temperature response in eleven popular and mega rice cultivars widely grown in India. Physiological and biochemical traits, specifically membrane thermostability (MTS), antioxidants, and photosynthesis, were studied at the vegetative and reproductive phases and used to establish a correlation with grain yield under stress. Several useful traits in different genotypes were identified, which will be an important resource for developing high-temperature-tolerant rice cultivars. Interestingly, Nagina22 emerged as the best performer in terms of yield as well as expression of physiological and biochemical traits at elevated temperature. It showed less relative injury, less reduction in chlorophyll content, increased superoxide dismutase, catalase, and peroxidase activity, less reduction in net photosynthetic rate (PN), a high transpiration rate (E), and other favourable photosynthetic/fluorescence parameters, contributing to the least reduction in spikelet fertility and grain yield at elevated temperature. Further, the expression of 14 genes, including heat shock transcription factors and heat shock proteins, was analyzed in Nagina22 (tolerant) and Vandana (susceptible) at the flowering phase, confirming that N22 also performs better at the molecular level at elevated temperature. This study shows that the elevated temperature response is complex and involves multiple biological processes that need to be characterized to address the challenges of future climate extremes.

  10. Machine Learning Analysis Identifies Drosophila Grunge/Atrophin as an Important Learning and Memory Gene Required for Memory Retention and Social Learning.

    Science.gov (United States)

    Kacsoh, Balint Z; Greene, Casey S; Bosco, Giovanni

    2017-11-06

    High-throughput experiments are becoming increasingly common, and scientists must balance hypothesis-driven experiments with genome-wide data acquisition. We sought to predict novel genes involved in Drosophila learning and long-term memory from existing public high-throughput data. We performed an analysis using PILGRM, which analyzes public gene expression compendia using machine learning. We evaluated the top prediction alongside genes involved in learning and memory in IMP, an interface for functional relationship networks. We identified Grunge/Atrophin (Gug/Atro), a transcriptional repressor and histone deacetylase, as our top candidate. We find, through multiple, distinct assays, that Gug has an active role as a modulator of memory retention in the fly and that its function is required in the adult mushroom body. Depletion of Gug specifically in neurons of the adult mushroom body, after cell division and neuronal development are complete, suggests that Gug function is important for memory retention through regulation of neuronal activity, not by altering neurodevelopment. Our study provides a previously uncharacterized role for Gug as a possible regulator of neuronal plasticity at the interface of memory retention and memory extinction. Copyright © 2017 Kacsoh et al.

  11. Uncertainty contributions to low flow projections in Austria

    Science.gov (United States)

    Parajka, J.; Blaschke, A. P.; Blöschl, G.; Haslinger, K.; Hepp, G.; Laaha, G.; Schöner, W.; Trautvetter, H.; Viglione, A.; Zessner, M.

    2015-11-01

    The main objective of this paper is to understand the contributions to the uncertainty in low flow projections resulting from hydrological model uncertainty and climate projection uncertainty. Model uncertainty is quantified by different parameterizations of a conceptual semi-distributed hydrologic model (TUWmodel) using 11 objective functions in three different decades (1976-1986, 1987-1997, 1998-2008), which allows disentangling the effect of modeling uncertainty and temporal stability of model parameters. Climate projection uncertainty is quantified by four future climate scenarios (ECHAM5-A1B, A2, B1 and HADCM3-A1B) using a delta change approach. The approach is tested for 262 basins in Austria. The results indicate that the seasonality of the low flow regime is an important factor affecting the performance of model calibration in the reference period and the uncertainty of Q95 low flow projections in the future period. In Austria, the calibration uncertainty in terms of Q95 is larger in basins with a summer low flow regime than in basins with a winter low flow regime. Using different calibration periods may result in a range of up to 60% in simulated Q95 low flows. The low flow projections show an increase of low flows in the Alps, typically in the range of 10-30%, and a decrease in the south-eastern part of Austria, mostly in the range of -5 to -20%, for the period 2021-2050 relative to the reference period 1976-2008. The change in seasonality varies between scenarios, but there is a tendency for earlier low flows in the Northern Alps and later low flows in Eastern Austria. In 85% of the basins, the uncertainty in Q95 from model calibration is larger than the uncertainty from different climate scenarios. The total uncertainty of the Q95 projections is largest in basins with a winter low flow regime and, in some basins, exceeds 60%. In basins with summer low flows, the total uncertainty is mostly less than 20%. While the calibration uncertainty dominates over climate
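    The delta change approach mentioned above perturbs an observed climate series by the change signal between a climate model's future and reference periods. A minimal sketch with invented monthly precipitation values (multiplicative deltas, as commonly used for precipitation):

```python
# Invented monthly precipitation sums in mm.
observed_precip = {"Jan": 60.0, "Jul": 110.0}   # observations, reference period
model_ref = {"Jan": 55.0, "Jul": 100.0}         # climate model, reference period
model_future = {"Jan": 66.0, "Jul": 85.0}       # climate model, future period

def delta_change(observed, ref, fut):
    """Scale each observed monthly value by the model's future/reference ratio.
    Multiplicative deltas are typical for precipitation; temperature usually
    uses additive deltas (observed + (fut - ref)) instead."""
    return {m: observed[m] * (fut[m] / ref[m]) for m in observed}

scenario = delta_change(observed_precip, model_ref, model_future)
# Jan is scaled up by 20% and Jul down by 15%; the perturbed series then
# drives the hydrologic model to project future low flows such as Q95.
```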

  12. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying the sensitivities and uncertainties of its important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
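    The propagation step in adjoint-based uncertainty quantification is the first-order "sandwich rule": the response variance is the sensitivity vector contracted twice with the nuclear-data covariance matrix. The adjoint machinery computes the sensitivities efficiently; the 2x2 numbers below are invented for illustration:

```python
# Relative sensitivities of the response to two nuclear-data parameters
# (d(response)/response per d(parameter)/parameter), invented values.
S = [0.8, -0.3]
# Relative covariance matrix of those parameters, invented values.
C = [[1e-4, 2e-5],
     [2e-5, 4e-4]]

def sandwich(S, C):
    """First-order uncertainty propagation: var(response) = S^T C S."""
    n = len(S)
    return sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))

rel_std = sandwich(S, C) ** 0.5  # relative standard deviation of the response
# For these invented numbers the propagated relative uncertainty is
# below 1%, i.e. in the small (< 2%) range reported above.
```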

  13. Uncertainty in adaptive capacity

    International Nuclear Information System (INIS)

    Neil Adger, W.; Vincent, K.

    2005-01-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. (authors)

  14. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report: Updated in 2016

    Energy Technology Data Exchange (ETDEWEB)

    Sisterson, Douglas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-01-15

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. ARM currently provides data and supporting metadata (information about the data or data quality) to its users through several sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, ARM relies on Instrument Mentors and the ARM Data Quality Office to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. This report is a continuation of the work presented by Campos and Sisterson (2015) and provides additional uncertainty information from instruments not available in their report. As before, a total measurement uncertainty has been calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). This study will not expand on methods for computing these uncertainties. As before, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available to the ARM community through the ARM Instrument Mentors and their ARM instrument handbooks. This study continues the first steps towards reporting ARM measurement uncertainty as: (1) identifying how the uncertainty of individual ARM measurements is currently expressed, (2) identifying a consistent approach to measurement uncertainty, and then (3) reclassifying ARM instrument measurement uncertainties in a common framework.

  15. Summary from the epistemic uncertainty workshop: consensus amid diversity

    International Nuclear Information System (INIS)

    Ferson, Scott; Joslyn, Cliff A.; Helton, Jon C.; Oberkampf, William L.; Sentz, Kari

    2004-01-01

    The 'Epistemic Uncertainty Workshop' sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6-7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster-Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of

  16. Entropic uncertainty relations-a survey

    International Nuclear Information System (INIS)

    Wehner, Stephanie; Winter, Andreas

    2010-01-01

    Uncertainty relations play a central role in quantum mechanics. Entropic uncertainty relations in particular have gained significant importance within quantum information, providing the foundation for the security of many quantum cryptographic protocols. Yet, little is known about entropic uncertainty relations with more than two measurement settings. In the present survey, we review known results and open questions.

  17. UNCERTAINTY IN THE PROCESS INTEGRATION FOR THE BIOREFINERIES DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Meilyn González Cortés

    2015-07-01

    Full Text Available This paper presents how design approaches with a high level of flexibility can reduce the additional costs of strategies that apply overdesign factors to account for uncertain parameters affecting the economic feasibility of a project. The elements with associated uncertainties that are important in configuring process integration under a biorefinery scheme are: the raw material, the conversion technologies for the raw material, and the variety of products that can be obtained. The analysis shows that the raw materials and products with potential in a biorefinery scheme are subject to external uncertainties such as availability, demand and market prices. The impact of these external uncertainties on the biorefinery can be bounded: minimum and maximum limits on product prices define intervals that should be considered in the economic evaluation of the project and in the sensitivity analysis under varied conditions.

  18. Transcriptional profiling of Medicago truncatula under salt stress identified a novel CBF transcription factor MtCBF4 that plays an important role in abiotic stress responses

    Directory of Open Access Journals (Sweden)

    Su Zhen

    2011-07-01

    Full Text Available Abstract Background Salt stress hinders the growth of plants and reduces crop production worldwide. However, different plant species might possess different adaptive mechanisms to mitigate salt stress. We conducted a detailed pathway analysis of transcriptional dynamics in the roots of Medicago truncatula seedlings under salt stress and selected a transcription factor gene, MtCBF4, for experimental validation. Results A microarray experiment was conducted using root samples collected 6, 24, and 48 h after application of 180 mM NaCl. Analysis of 11 statistically significant expression profiles revealed different behaviors between primary and secondary metabolism pathways in response to external stress. Secondary metabolism that helps to maintain osmotic balance was induced. One of the highly induced transcription factor genes was successfully cloned, and was named MtCBF4. Phylogenetic analysis revealed that MtCBF4, which belongs to the AP2-EREBP transcription factor family, is a novel member of the CBF transcription factor family in M. truncatula. MtCBF4 is shown to be a nuclear-localized protein. Expression of MtCBF4 in M. truncatula was induced by most of the abiotic stresses, including salt, drought, cold, and abscisic acid, suggesting crosstalk between these abiotic stresses. Transgenic Arabidopsis over-expressing MtCBF4 enhanced tolerance to drought and salt stress, and activated expression of downstream genes that contain DRE elements. Over-expression of MtCBF4 in M. truncatula also enhanced salt tolerance and induced the expression of the corresponding downstream genes. Conclusion Comprehensive transcriptomic analysis revealed that complex mechanisms exist in plants in response to salt stress. The novel transcription factor gene MtCBF4 identified here played an important role in response to abiotic stresses, indicating that it might be a good candidate gene for genetic improvement to produce stress-tolerant plants.

  19. A method to identify important dynamical states in Boolean models of regulatory networks: application to regulation of stomata closure by ABA in A. thaliana.

    Science.gov (United States)

    Bugs, Cristhian A; Librelotto, Giovani R; Mombach, José C M

    2011-12-22

    We introduce a method to analyze the states of regulatory Boolean models that identifies important network states and their biological influence on the global network dynamics. It consists in (1) finding the states of the network that are most frequently visited and (2) identifying the variable and frozen nodes of the network. The method, along with a simulation that includes random features, is applied to the study of stomata closure by abscisic acid (ABA) in A. thaliana proposed by Albert and coworkers. We find, for the case studied, that the dynamics of the wild-type and mutant networks have just two highly visited states in their space of states, and that about a third of all nodes of the wild-type network are variable while the rest remain frozen in True or False states. This high number of frozen elements explains the low cardinality of the space of states of the wild-type network. Similar results are observed in the mutant networks. The application of the method allowed us to explain how the wild-type and mutant networks behave dynamically in the space of states and determined an essential feature of the activation of the closure node (representing stomata closure), i.e. its synchronization with the AnionEm node (representing anion efflux at the plasma membrane). The dynamics of this synchronization explains the efficiency reached by the wild-type and each of the mutant networks. For the biological problem analyzed, our method allows determining how the wild-type and mutant networks differ 'phenotypically'. It shows that the different efficiencies of stomata closure reached among the simulated wild-type and mutant networks follow from the dynamical behavior of two nodes that are always synchronized. Additionally, we predict that the involvement of the anion efflux at the plasma membrane is crucial for the plant response to ABA. The algorithm used in the simulations is available upon request.
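The two steps of the method, counting highly visited states and classifying frozen versus variable nodes, can be sketched on a toy synchronous Boolean network. The four update rules below are invented for illustration; they are not the ABA signaling network of Albert and coworkers.

```python
import random
from collections import Counter

def step(s):
    # Toy 4-node synchronous Boolean network (hypothetical update rules)
    a, b, c, d = s
    return (b and not d, a or c, not a, b)

def sample_states(n_runs=500, n_transient=30, seed=1):
    # (1) Estimate which states are most frequently visited after a transient
    random.seed(seed)
    counts = Counter()
    for _ in range(n_runs):
        s = tuple(random.random() < 0.5 for _ in range(4))
        for _ in range(n_transient):
            s = step(s)
        counts[s] += 1
    return counts

counts = sample_states()
total = sum(counts.values())
top = [s for s, n in counts.items() if n > 0.05 * total]  # highly visited states
# (2) Frozen nodes take the same value in every highly visited state
frozen = [i for i in range(4) if len({s[i] for s in top}) == 1]
variable = [i for i in range(4) if i not in frozen]
```

In the paper's setting the same bookkeeping, applied to the wild-type and mutant networks, is what yields the "two highly visited states" and the roughly one-third variable nodes reported above.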

  20. Transcriptional profiling of Medicago truncatula under salt stress identified a novel CBF transcription factor MtCBF4 that plays an important role in abiotic stress responses

    Science.gov (United States)

    2011-01-01

    Background Salt stress hinders the growth of plants and reduces crop production worldwide. However, different plant species might possess different adaptive mechanisms to mitigate salt stress. We conducted a detailed pathway analysis of transcriptional dynamics in the roots of Medicago truncatula seedlings under salt stress and selected a transcription factor gene, MtCBF4, for experimental validation. Results A microarray experiment was conducted using root samples collected 6, 24, and 48 h after application of 180 mM NaCl. Analysis of 11 statistically significant expression profiles revealed different behaviors between primary and secondary metabolism pathways in response to external stress. Secondary metabolism that helps to maintain osmotic balance was induced. One of the highly induced transcription factor genes was successfully cloned, and was named MtCBF4. Phylogenetic analysis revealed that MtCBF4, which belongs to the AP2-EREBP transcription factor family, is a novel member of the CBF transcription factor family in M. truncatula. MtCBF4 is shown to be a nuclear-localized protein. Expression of MtCBF4 in M. truncatula was induced by most of the abiotic stresses, including salt, drought, cold, and abscisic acid, suggesting crosstalk between these abiotic stresses. Transgenic Arabidopsis over-expressing MtCBF4 enhanced tolerance to drought and salt stress, and activated expression of downstream genes that contain DRE elements. Over-expression of MtCBF4 in M. truncatula also enhanced salt tolerance and induced the expression of the corresponding downstream genes. Conclusion Comprehensive transcriptomic analysis revealed that complex mechanisms exist in plants in response to salt stress. The novel transcription factor gene MtCBF4 identified here played an important role in response to abiotic stresses, indicating that it might be a good candidate gene for genetic improvement to produce stress-tolerant plants. PMID:21718548

  1. Uncertainties about climate

    International Nuclear Information System (INIS)

    Laval, Katia; Laval, Guy

    2013-01-01

    Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by the opponents of the global warming theory to call into question the estimations of its future consequences. Is it legitimate to predict the future using past climate data (well documented up to 100000 years BP) or the climates of other planets, taking into account the imprecision of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model such a huge and interwoven system, for which any exact description has become impossible? Why do water and precipitation play such an important role in local and global forecasts, and how should they be treated? This book, written by two physicists, answers these delicate questions in plain terms, so that anyone can form his own opinion about global warming and the need to act rapidly

  2. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  3. Quantification of Safety-Critical Software Test Uncertainty

    International Nuclear Information System (INIS)

    Khalaquzzaman, M.; Cho, Jaehyun; Lee, Seung Jun; Jung, Wondea

    2015-01-01

    The method conservatively assumes that the failure probability of a software for the untested inputs is 1, and that the failure probability becomes 0 after successful testing of all test cases. However, in reality the chance of failure exists due to test uncertainty. Some studies have been carried out to identify the test attributes that affect the test quality. Cao discussed the testing effort, testing coverage, and testing environment. Management of the test uncertainties was discussed in. In this study, the test uncertainty has been considered in estimating the software failure probability, because the software testing process is inherently uncertain. A reliability estimation of software is very important for a probabilistic safety analysis of a digital safety-critical system of NPPs. This study focused on the estimation of the probability of a software failure that considers the uncertainty in software testing. In our study, a BBN has been employed as an example model for software test uncertainty quantification. Although it can be argued that direct expert elicitation of test uncertainty is much simpler than BBN estimation, the BBN approach provides more insights and a basis for uncertainty estimation

  4. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
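The core idea, numerical routines that report their own uncertainty, can be illustrated with the simplest such method: Monte Carlo integration, whose estimate comes with a standard error. This is a sketch of the spirit of the paper, not one of its algorithms (those use richer probabilistic models such as Gaussian processes).

```python
import numpy as np

def mc_integrate(f, a, b, n=10_000, seed=0):
    """Estimate the integral of f over [a, b]; return (estimate, standard_error).

    The standard error quantifies the uncertainty induced by using a finite
    sample -- a numerical method that returns its own uncertainty.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(a, b, n)
    y = (b - a) * f(x)
    return y.mean(), y.std(ddof=1) / np.sqrt(n)

estimate, err = mc_integrate(np.sin, 0.0, np.pi)   # true value is 2
```

A downstream computation can then propagate `err` instead of silently treating `estimate` as exact, which is precisely the diagnosis-and-control of error sources the abstract describes.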

  5. The importance of Foxp3 antibody and fixation/permeabilization buffer combinations in identifying CD4+CD25+Foxp3+ regulatory T cells.

    Science.gov (United States)

    Law, Jacqueline P; Hirschkorn, Dale F; Owen, Rachel E; Biswas, Hope H; Norris, Philip J; Lanteri, Marion C

    2009-12-01

    Foxp3 is a key marker for CD4(+) regulatory T cells (T(regs)) and was used in developing a multiparameter flow cytometric panel to identify T(regs). Achieving reproducible staining and analysis first required optimization of Foxp3 staining. We present a comparative study of PCH101, 236A/E7, 3G3, 206D, 150D, and 259D/C7 clones of anti-human-Foxp3 antibodies used in combination with five different fixation/permeabilization buffers. Staining for CD25, CD152, and CD127 was also compared between fixation/permeabilization treatments. Promising antibody/buffer combinations were tested in a panel of peripheral blood mononuclear cells from 10 individuals, and then on fresh versus frozen cells from four individuals. Finally, different fluorochromes coupled to two representative antibodies were compared to optimize separation of Foxp3(+) from Foxp3(-) events. Foxp3 gates were set using two gating strategies based on CD127(+)CD25(-) "non-T(regs)" or based on isotype controls. For Foxp3 staining, the best conditions for fixation/permeabilization were obtained using the eBioscience Foxp3, Imgenex, BioLegend, and BD Foxp3 buffers. Comparing results from 10 subjects, 259D/C7, PCH101, 236A/E7, and 206D antibodies yielded statistically higher levels of Foxp3 cells than those by 150D and 3G3 antibodies (mean = 6.9, 5.1, 4.7, and 3.7% compared with 1.7, and 0.3% of CD25(+)Foxp3(+) events within CD4(+) cells, respectively). Importantly, the "nonspecificity" of some antibodies observed with a Foxp3 gate based on isotype controls could be eliminated by setting the Foxp3 gate on "non-T(regs)". Better separation of Foxp3(+) and Foxp3(-) populations was observed using the PCH101 clone coupled to Alexa647 compared with FITC or the 259D/C7 clone coupled to PE compared with Alexa488 fluorochrome. Foxp3 staining can be highly variable and depends on the choice of antibody/buffer pair and the fluorochrome used. Selecting the correct population for setting the Foxp3 gate is critical to avoid

  6. A new importance measure for sensitivity analysis

    International Nuclear Information System (INIS)

    Liu, Qiao; Homma, Toshimitsu

    2010-01-01

    Uncertainty is an integral part of risk assessment of complex engineering systems, such as nuclear power plants and spacecraft. The aim of sensitivity analysis is to identify the contribution of the uncertainty in model inputs to the uncertainty in the model output. In this study, a new importance measure that characterizes the influence of the entire input distribution on the entire output distribution was proposed. It represents the expected deviation of the cumulative distribution function (CDF) of the model output that would be obtained if one input parameter of interest were known. The applicability of this importance measure was tested with two models, a nonlinear nonmonotonic mathematical model and a risk model. In addition, a comparison of this new importance measure with several other importance measures was carried out and the differences between these measures were explained. (author)
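The proposed measure, the expected deviation of the output CDF when one input is fixed, lends itself to a brute-force Monte Carlo sketch. The model and all settings below are hypothetical, and the estimator is a simplified reading of the measure, not the authors' own implementation.

```python
import numpy as np

def model(x):
    # Hypothetical nonlinear, nonmonotonic test model with two inputs
    return np.sin(np.pi * x[:, 0]) + 0.3 * x[:, 1] ** 2

def cdf_deviation_importance(model, dim=2, n_outer=50, n_inner=2000, seed=0):
    """For each input i, average over fixed values of x_i the mean absolute
    deviation between the conditional and unconditional output CDFs."""
    rng = np.random.default_rng(seed)
    base = np.sort(model(rng.uniform(0, 1, (n_inner, dim))))
    F = np.arange(1, n_inner + 1) / n_inner        # unconditional empirical CDF
    importances = []
    for i in range(dim):
        devs = []
        for _ in range(n_outer):
            x = rng.uniform(0, 1, (n_inner, dim))
            x[:, i] = rng.uniform(0, 1)            # fix input i at one value
            y = np.sort(model(x))
            # conditional CDF evaluated at the unconditional sample points
            F_cond = np.searchsorted(y, base, side="right") / n_inner
            devs.append(np.mean(np.abs(F_cond - F)))
        importances.append(float(np.mean(devs)))
    return importances

imp = cdf_deviation_importance(model)
```

An input whose fixing barely moves the output CDF scores near zero; the larger the expected CDF shift, the more important the input.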

  7. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated with a limited number of TH code executions. • Methodology is applied successfully on the LOFT-LB1 test facility. - Abstract: In thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and in risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that considers the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying a variance-based uncertainty importance method. Important parameters are first identified qualitatively by the modified PIRT approach, and their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practical point of view for TH applications: it is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.
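The abstract's key practical point, ranking parameter importance with only a handful of code executions, can be sketched with a normalized local derivative index: one nominal run plus one perturbed run per parameter (N+1 runs in total). The surrogate function below stands in for an expensive TH code and is purely hypothetical, as is the index's exact normalization.

```python
import numpy as np

def local_derivative_index(f, x0, rel_step=0.01):
    # Normalized local sensitivity |(dy/y) / (dx/x)| via forward differences;
    # costs one model run at the nominal point plus one per parameter
    x0 = np.asarray(x0, dtype=float)
    y0 = f(x0)
    indices = []
    for i in range(len(x0)):
        x = x0.copy()
        x[i] *= 1.0 + rel_step
        indices.append(abs((f(x) - y0) / y0) / rel_step)
    return np.array(indices)

# Hypothetical cheap surrogate standing in for a TH system code response
surrogate = lambda p: p[0] ** 2 * np.exp(0.1 * p[1]) / (1.0 + p[2])
idx = local_derivative_index(surrogate, [2.0, 1.0, 0.5])
ranking = np.argsort(-idx)          # most important parameter first
```

Because the cost grows only linearly in the number of parameters, such an index remains affordable where variance-based measures, which need many model evaluations per parameter, do not.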

  8. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated with a limited number of TH code executions. • Methodology is applied successfully on the LOFT-LB1 test facility. - Abstract: In thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and in risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that considers the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying a variance-based uncertainty importance method. Important parameters are first identified qualitatively by the modified PIRT approach, and their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practical point of view for TH applications: it is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.

  9. Uncertainties in Nuclear Proliferation Modeling

    International Nuclear Information System (INIS)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok

    2015-01-01

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the possibility of providing warning for the international community to prevent nuclear proliferation activities. However, there is still considerable debate about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling work. This paper analyzes the uncertainties in past approaches and suggests future work in view of proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models. Fundamental problems in modeling will remain even if more advanced modeling methods are developed. Before starting to develop fancy models based on hypotheses about time-dependent proliferation determinants, graph theory, etc., it is important to analyze the uncertainty of current models to solve the fundamental problems of nuclear proliferation modeling. The uncertainty from different codings of proliferation history is small. Serious problems stem from limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause huge uncertainties when using the same dataset, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested in qualitative nuclear proliferation studies

  10. Measurement uncertainty: Friend or foe?

    Science.gov (United States)

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratory) may know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  11. The Uncertainties of Risk Management

    DEFF Research Database (Denmark)

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged ... for expanding risk management. More generally, such uncertainties relate to the professional identities and responsibilities of operational managers as defined by the framing devices. Originality/value – The paper offers three contributions to the extant literature: first, it shows how risk management itself ...

  12. Assessment of Risks and Uncertainties in Poultry Farming in Kwara ...

    African Journals Online (AJOL)

    … identifies the risks and uncertainties encountered by the farmers, determines the level of severity of the risks and uncertainties, and identifies the coping strategies employed by the farmers. Primary data obtained from 99 registered poultry ...

  13. Commonplaces and social uncertainty

    DEFF Research Database (Denmark)

    Lassen, Inger

    2008-01-01

    This article explores the concept of uncertainty in four focus group discussions about genetically modified food. In the discussions, members of the general public interact with food biotechnology scientists while negotiating their attitudes towards genetic engineering. Their discussions offer an example of risk discourse in which the use of commonplaces seems to be a central feature (Myers 2004: 81). My analyses support earlier findings that commonplaces serve important interactional purposes (Barton 1999) and that they are used for mitigating disagreement, for closing topics and for facilitating...

  14. A novel screening method for cell wall mutants in Aspergillus niger identifies UDP-galactopyranose mutase as an important protein in fungal cell wall biosynthesis

    NARCIS (Netherlands)

    Damveld, R.A.; Franken, A.; Arentshorst, M.; Punt, P.J.; Klis, F.M.; van den Hondel, C.A.M.J.J.; Ram, A.F.J.

    2008-01-01

    To identify cell wall biosynthetic genes in filamentous fungi and thus potential targets for the discovery of new antifungals, we developed a novel screening method for cell wall mutants. It is based on our earlier observation that the Aspergillus niger agsA gene, which encodes a putative...

  15. Gene expression profiling identifies FYN as an important molecule in tamoxifen resistance and a predictor of early recurrence in patients treated with endocrine therapy

    DEFF Research Database (Denmark)

    Elias, D; (Hansen) Vever, Henriette; Lænkholm, A-V

    2015-01-01

    To elucidate the molecular mechanisms of tamoxifen resistance in breast cancer, we performed gene array analyses and identified 366 genes with altered expression in four unique tamoxifen-resistant (TamR) cell lines vs the parental tamoxifen-sensitive MCF-7/S0.5 cell line. Most of these genes were...

  16. A novel screening method for cell wall mutants in Aspergillus niger identifies UDP-galactopyranose mutase as an important protein in fungal cell wall biosynthesis

    NARCIS (Netherlands)

    Damveld, R.A.; Franken, A.; Arentshorst, M.; Punt, P.J.; Klis, F.M.; Hondel, C.A.M.J.J. van den; Ram, A.F.J.

    2008-01-01

    To identify cell wall biosynthetic genes in filamentous fungi and thus potential targets for the discovery of new antifungals, we developed a novel screening method for cell wall mutants. It is based on our earlier observation that the Aspergillus niger agsA gene, which encodes a putative α-glucan...

  17. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  18. Uncertainty related to Environmental Data and Estimated Extreme Events

    DEFF Research Database (Denmark)

    Burcharth, H. F.

    The design loads on rubble mound breakwaters are almost entirely determined by the environmental conditions, i.e. sea state, water levels, sea bed characteristics, etc. It is the objective of sub-group B to identify the most important environmental parameters and evaluate the related uncertainties... including those corresponding to extreme estimates typically used for design purposes. Basically a design condition is made up of a set of parameter values stemming from several environmental parameters. To be able to evaluate the uncertainty related to design states one must know the corresponding joint... Consequently this report deals mainly with each parameter separately. Multi-parameter problems are briefly discussed in section 9. It is important to notice that the quantified uncertainties reported in section 7.7 represent what might be regarded as typical figures to be used only when no more qualified...

  19. Parameter Uncertainty for Repository Thermal Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Greenberg, Harris [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dupont, Mark [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-01

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).
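The "analytical approach to determine which parameters are most important" described in this record can be sketched as a one-at-a-time (OAT) sensitivity screen: vary each input across its range with the others held at nominal, and rank inputs by the output swing. The surrogate temperature model, parameter names, and ranges below are hypothetical illustrations, not values from the report:

```python
import math

def oat_sensitivity(model, nominal, ranges):
    """Rank inputs by the output swing produced when each parameter
    is varied across its range one at a time (others held at nominal)."""
    swings = {}
    for name, (lo, hi) in ranges.items():
        outputs = []
        for value in (lo, hi):
            params = dict(nominal)
            params[name] = value
            outputs.append(model(**params))
        swings[name] = max(outputs) - min(outputs)
    # Largest swing first = most important parameter under this screen
    return sorted(swings.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical peak-temperature surrogate (illustrative only):
# T_peak = T_ambient + Q / (4 * pi * k), a point-source conduction scaling.
def t_peak(k, q, t_ambient):
    return t_ambient + q / (4 * math.pi * k)

nominal = {"k": 2.0, "q": 500.0, "t_ambient": 25.0}  # W/m-K, W, deg C (made up)
ranges = {"k": (1.0, 4.0), "q": (400.0, 600.0), "t_ambient": (15.0, 35.0)}
ranking = oat_sensitivity(t_peak, nominal, ranges)
```

With these made-up ranges the host-medium conductivity `k` dominates the ranking, which mirrors the report's emphasis on buffer thermal conductivity as a key uncertain parameter.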

  20. Estimation of uncertainty in pKa values determined by potentiometric titration.

    Science.gov (United States)

    Koort, Eve; Herodes, Koit; Pihl, Viljar; Leito, Ivo

    2004-06-01

    A procedure is presented for estimation of uncertainty in measurement of the pKa of a weak acid by potentiometric titration. The procedure is based on the ISO GUM. The core of the procedure is a mathematical model that involves 40 input parameters. A novel approach is used for taking into account the purity of the acid: the impurities are not treated merely as inert compounds; their possible acidic dissociation is also taken into account. Application to an example of practical pKa determination is presented. Altogether 67 different sources of uncertainty are identified and quantified within the example. The relative importance of different uncertainty sources is discussed. The most important source of uncertainty (with the experimental set-up of the example) is the uncertainty of pH measurement, followed by the accuracy of the burette and the uncertainty of weighing. The procedure gives uncertainty separately for each point of the titration curve. The uncertainty depends on the amount of titrant added, being lowest in the central part of the titration curve. The possibilities of reducing the uncertainty and interpreting the drift of the pKa values obtained from the same curve are discussed.
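A GUM-style propagation like the one in this record can also be approximated numerically by Monte Carlo sampling (the approach of GUM Supplement 1). The sketch below propagates two hypothetical inputs, the measured pH and the acid/base concentration ratio, through the Henderson-Hasselbalch relation pKa = pH - log10([A-]/[HA]); the input values and uncertainties are illustrative, not the paper's 40-parameter model:

```python
import math
import random
import statistics

def pka_uncertainty_mc(ph, u_ph, ratio, u_ratio, n=50000, seed=0):
    """Monte Carlo propagation of input uncertainties to
    pKa = pH - log10([A-]/[HA])."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        ph_i = rng.gauss(ph, u_ph)
        ratio_i = rng.gauss(ratio, u_ratio)
        if ratio_i > 0:  # guard against log of a non-positive draw
            samples.append(ph_i - math.log10(ratio_i))
    return statistics.mean(samples), statistics.stdev(samples)

# Hypothetical half-neutralisation point: ratio = 1, so pKa is close to pH.
mean_pka, u_pka = pka_uncertainty_mc(ph=4.76, u_ph=0.02, ratio=1.0, u_ratio=0.05)
```

As in the record, the result exposes the relative weight of each input: here the ratio term contributes about 0.022 to the combined standard uncertainty versus 0.02 from the pH measurement.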

  1. Sources of uncertainty in individual monitoring for photographic,TL and OSL dosimetry techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Max S.; Silva, Everton R.; Mauricio, Claudia L.P., E-mail: max.das.ferreira@gmail.com, E-mail: everton@ird.gov.br, E-mail: claudia@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    The identification of uncertainty sources and their quantification is essential to the quality of any dosimetric result. Even when uncertainties are not stated for the dose measurements informed in the monthly dose report to the monitored radiation facilities, they need to be known. This study aims to analyze the influence of different sources of uncertainty associated with the photographic, TL and OSL dosimetric techniques, considering the evaluation of occupational doses from whole-body exposure to photons. To identify the sources of uncertainty, a bibliographic review was conducted of documents that deal with the operational aspects of each technique and the uncertainties associated with each of them. In addition, technical visits to individual monitoring services were conducted to assist in this identification. The sources of uncertainty were categorized and their contributions were expressed qualitatively. The processes of calibration and traceability are the most important sources of uncertainty, regardless of the technique used. For photographic dosimetry, the remaining important uncertainty sources are due to: energy and angular dependence; linearity of response; and variations in film processing. For TL and OSL, the key process for good performance is the reproducibility of the thermal and optical cycles, respectively. For all three techniques, every procedure of the measurement process must be standardized, controlled and reproducible. Further studies can be performed to quantify the contribution of each source of uncertainty. (author)

  2. Prevalence and Clinical Import of Thoracic Injury Identified by Chest Computed Tomography but Not Chest Radiography in Blunt Trauma: Multicenter Prospective Cohort Study.

    Science.gov (United States)

    Langdorf, Mark I; Medak, Anthony J; Hendey, Gregory W; Nishijima, Daniel K; Mower, William R; Raja, Ali S; Baumann, Brigitte M; Anglin, Deirdre R; Anderson, Craig L; Lotfipour, Shahram; Reed, Karin E; Zuabi, Nadia; Khan, Nooreen A; Bithell, Chelsey A; Rowther, Armaan A; Villar, Julian; Rodriguez, Robert M

    2015-12-01

    Chest computed tomography (CT) diagnoses more injuries than chest radiography, so-called occult injuries. Wide availability of chest CT has driven substantial increase in emergency department use, although the incidence and clinical significance of chest CT findings have not been fully described. We determine the frequency, severity, and clinical import of occult injury, as determined by changes in management. These data will better inform clinical decisions, need for chest CT, and odds of intervention. Our sample included prospective data (2009 to 2013) on 5,912 patients at 10 Level I trauma center EDs with both chest radiography and chest CT at physician discretion. These patients were 40.6% of 14,553 enrolled in the parent study who had either chest radiography or chest CT. Occult injuries were pneumothorax, hemothorax, sternal or greater than 2 rib fractures, pulmonary contusion, thoracic spine or scapula fracture, and diaphragm or great vessel injury found on chest CT but not on preceding chest radiography. A priori, we categorized thoracic injuries as major (having invasive procedures), minor (observation or inpatient pain control >24 hours), or of no clinical significance. Primary outcome was prevalence and proportion of occult injury with major interventions of chest tube, mechanical ventilation, or surgery. Secondary outcome was minor interventions of admission rate or observation hours because of occult injury. Two thousand forty-eight patients (34.6%) had chest injury on chest radiography or chest CT, whereas 1,454 of these patients (71.0%, 24.6% of all patients) had occult injury. Of these, in 954 patients (46.6% of injured, 16.1% of total), chest CT found injuries not observed on immediately preceding chest radiography. In 500 more patients (24.4% of injured patients, 8.5% of all patients), chest radiography found some injury, but chest CT found occult injury. Chest radiography found all injuries in only 29.0% of injured patients. Two hundred and two...

  3. Uncertainty budget for k0-NAA

    International Nuclear Information System (INIS)

    Robouch, P.; Arana, G.; Eguskiza, M.; Etxebarria, N.

    2000-01-01

    The concepts of the Guide to the Expression of Uncertainty in Measurement (GUM) and the recommendations of the Eurachem document 'Quantifying Uncertainty in Analytical Measurement' are applied to set up the uncertainty budget for k0-NAA. The 'universally applicable spreadsheet technique' described by Kragten is applied to the basic k0-NAA equations for the computation of uncertainties. The variance components - individual standard uncertainties - highlight the contribution and the importance of the different parameters to be taken into account. (author)
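Kragten's spreadsheet technique, which this record applies to the k0-NAA equations, generalizes to any measurement equation: perturb each input by its standard uncertainty, square the resulting output change to approximate that input's variance component, and combine the components in quadrature. The measurement function and numbers below are a toy stand-in, not the actual k0-NAA equation:

```python
import math

def kragten(f, values, uncertainties):
    """Kragten's spreadsheet technique: one perturbed evaluation per
    input yields its variance component; components add in quadrature."""
    y0 = f(**values)
    components = {}
    for name, u in uncertainties.items():
        shifted = dict(values)
        shifted[name] += u
        components[name] = (f(**shifted) - y0) ** 2
    combined = math.sqrt(sum(components.values()))
    return y0, combined, components

# Toy measurement equation (illustrative only): y = a * b / c
y0, u_y, parts = kragten(lambda a, b, c: a * b / c,
                         {"a": 10.0, "b": 2.0, "c": 4.0},
                         {"a": 0.1, "b": 0.05, "c": 0.08})
# `parts` plays the role of the variance components in the record:
# it shows which input dominates the combined standard uncertainty.
```

The appeal of the method, as the record notes, is that the variance components fall out directly, ranking the inputs by importance without any symbolic differentiation.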

  4. Risk uncertainty analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives

  5. What Does It Take to Change an Editor's Mind? Identifying Minimally Important Difference Thresholds for Peer Reviewer Rating Scores of Scientific Articles.

    Science.gov (United States)

    Callaham, Michael; John, Leslie K

    2018-01-05

    We define a minimally important difference for the Likert-type scores frequently used in scientific peer review (similar to existing minimally important differences for scores in clinical medicine). The magnitude of score change required to change editorial decisions has not been studied, to our knowledge. Experienced editors at a journal in the top 6% by impact factor were asked how large a change of rating in "overall desirability for publication" was required to trigger a change in their initial decision on an article. Minimally important differences were assessed twice for each editor: once assessing the rating change required to shift the editor away from an initial decision to accept, and the other assessing the magnitude required to shift away from an initial rejection decision. Forty-one editors completed the survey (89% response rate). In the acceptance frame, the median minimally important difference was 0.4 points on a scale of 1 to 5. Editors required a greater rating change to shift from an initial rejection decision; in the rejection frame, the median minimally important difference was 1.2 points. Within each frame, there was considerable heterogeneity: in the acceptance frame, 38% of editors did not change their decision within the maximum available range; in the rejection frame, 51% did not. To our knowledge, this is the first study to determine the minimally important difference for Likert-type ratings of research article quality, or in fact any nonclinical scientific assessment variable. Our findings may be useful for future research assessing whether changes to the peer review process produce clinically meaningful differences in editorial decisionmaking. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  6. Uncertainty in techno-economic estimates of cellulosic ethanol production due to experimental measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vicari Kristin J

    2012-04-01

    Background: Cost-effective production of lignocellulosic biofuels remains a major financial and technical challenge at the industrial scale. A critical tool in biofuels process development is the techno-economic (TE) model, which calculates biofuel production costs using a process model and an economic model. The process model solves mass and energy balances for each unit, and the economic model estimates capital and operating costs from the process model based on economic assumptions. The process model inputs include experimental data on the feedstock composition and intermediate product yields for each unit. These experimental yield data are calculated from primary measurements. Uncertainty in these primary measurements is propagated to the calculated yields, to the process model, and ultimately to the economic model. Thus, outputs of the TE model have a minimum uncertainty associated with the uncertainty in the primary measurements. Results: We calculate the uncertainty in the Minimum Ethanol Selling Price (MESP) estimate for lignocellulosic ethanol production via a biochemical conversion process: dilute sulfuric acid pretreatment of corn stover followed by enzymatic hydrolysis and co-fermentation of the resulting sugars to ethanol. We perform a sensitivity analysis on the TE model and identify the feedstock composition and conversion yields from three unit operations (xylose from pretreatment, glucose from enzymatic hydrolysis, and ethanol from fermentation) as the most important variables. The uncertainty in the pretreatment xylose yield arises from multiple measurements, whereas the glucose and ethanol yields from enzymatic hydrolysis and fermentation, respectively, are dominated by a single measurement: the fraction of insoluble solids (fIS) in the biomass slurries. Conclusions: We calculate a $0.15/gal uncertainty in MESP from the TE model due to uncertainties in primary measurements. This result sets a lower bound on the error bars of...
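The propagation described here, from measured conversion yields through a techno-economic model to a price uncertainty, can be sketched with Monte Carlo sampling. The cost function below is a hypothetical stand-in for the full TE model, and the yield means and standard deviations are illustrative, not the paper's values:

```python
import random
import statistics

def mesp_toy(xylose_yield, glucose_yield, ethanol_yield):
    """Hypothetical stand-in for the techno-economic model: selling
    price scales inversely with the product of conversion yields."""
    return 2.0 / (xylose_yield * glucose_yield * ethanol_yield)  # $/gal, toy

rng = random.Random(1)
draws = [mesp_toy(rng.gauss(0.85, 0.02),   # pretreatment xylose yield +/- u
                  rng.gauss(0.90, 0.02),   # enzymatic glucose yield +/- u
                  rng.gauss(0.92, 0.01))   # fermentation ethanol yield +/- u
         for _ in range(20000)]
mesp_mean = statistics.mean(draws)
mesp_sd = statistics.stdev(draws)  # price spread from measurement uncertainty alone
```

`mesp_sd` is the analogue of the paper's $0.15/gal figure: the floor on the MESP error bar set purely by uncertainty in the primary yield measurements.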

  7. Correlated uncertainties in integral data

    International Nuclear Information System (INIS)

    McCracken, A.K.

    1978-01-01

    The use of correlated uncertainties in calculational data is shown in cases investigated to lead to a reduction in the uncertainty of calculated quantities of importance to reactor design. It is stressed, however, that such reductions are likely to be important in a minority of cases of practical interest. The effect of uncertainties in detector cross-sections is considered and is seen to be, in some cases, of equal importance to that in the data used in calculations. Numerical investigations have been limited by the sparse information available on data correlations; some comparisons made of these data reveal quite large inconsistencies for both detector cross-sections and cross-sections of interest for reactor calculations.

  8. Needs of the CSAU uncertainty method

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2000-01-01

    The use of best estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods were proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to answer the needs. The specific procedural steps were combined from other methods for uncertainty evaluation and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach to the small-break loss-of-coolant accident (SB LOCA) show that the adapted CSAU can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, it was indicated that there are still some limitations in the CSAU approach that need to be resolved. (author)

  9. Uncertainty in geological and hydrogeological data

    Directory of Open Access Journals (Sweden)

    B. Nilsson

    2007-09-01

    Uncertainty in conceptual model structure and in environmental data is of essential interest when dealing with uncertainty in water resources management. To make quantification of uncertainty possible, it is necessary to identify and characterise the uncertainty in geological and hydrogeological data. This paper discusses a range of available techniques to describe the uncertainty related to geological model structure and scale of support. Literature examples of uncertainty in hydrogeological variables such as saturated hydraulic conductivity, specific yield, specific storage, effective porosity and dispersivity are given. Field data usually have a spatial and temporal scale of support that is different from the one on which numerical models for water resources management operate. Uncertainty in hydrogeological data variables is characterised and assessed within the methodological framework of the HarmoniRiB classification.

  10. Comparison of Spot Urine Protein to Creatinine Ratio to 24-Hour Proteinuria to Identify Important Change Over Time in Proteinuria in Lupus.

    Science.gov (United States)

    Medina-Rosas, Jorge; Su, Jiandong; Cook, Richard J; Sabapathy, Arthy; Touma, Zahi

    2017-09-01

    The aim of this study was to determine whether spot urine protein-to-creatinine ratio (PCR) accurately measures the change in proteinuria compared with 24-hour proteinuria (24H-P). This was a retrospective analysis on patients' paired visits and paired urine samples for PCR and 24H-P. Patients with both abnormal 24H-P (>0.5 g/d) and PCR (>0.05 g/mmol) or both normal 24H-P (≤0.5 g/d) and PCR (≤0.05 g/mmol) at baseline visit were identified. The first follow-up visit with partial recovery (50% decrease in proteinuria) or complete recovery (≤0.5 g/d) was identified for those with abnormal baseline 24H-P, and new proteinuria (>0.5 g/d) was identified for those with normal 24H-P. Twenty-four-hour urine collection and PCR end-point frequencies were compared. Twenty-four-hour urine collection results were converted to 24H-PCR. Twenty-four-hour PCR and PCR were utilized to measure the magnitude of change (by standardized response mean [SRM]) in patients who achieved the end points. Of 230 patients, at baseline, 95 patients had abnormal and 109 had normal 24H-P and PCR. On follow-up, 57 achieved partial recovery, and 53 achieved complete recovery by 24H-P. Standardized response mean was -1.03 and -1.10 for 24H-PCR and PCR, respectively. By PCR, 53 patients had partial recovery, and 27 had complete recovery. Standardized response mean was -1.25 and -0.86 by 24H-PCR and PCR, respectively. For new proteinuria, 28 patients were identified by 24H-P and 21 by PCR. Twenty-four-hour PCR SRM was 0.80, and PCR SRM was 0.68. Protein-to-creatinine ratio does not have sufficient accuracy compared with 24H-P for improvement and worsening to be used in lieu of 24H-P.
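The standardized response mean (SRM) used throughout this record is simply the mean of the paired change scores divided by the standard deviation of those changes. A minimal sketch, with hypothetical proteinuria values rather than the study's data:

```python
import statistics

def standardized_response_mean(baseline, follow_up):
    """SRM: mean paired change divided by the standard deviation of
    the changes; a common index of responsiveness to change."""
    changes = [f - b for b, f in zip(baseline, follow_up)]
    return statistics.mean(changes) / statistics.stdev(changes)

# Hypothetical paired proteinuria values (g/day), baseline vs follow-up
before = [2.1, 3.4, 1.8, 2.9, 4.0, 2.5]
after = [0.9, 1.6, 0.8, 1.1, 2.2, 1.0]
srm = standardized_response_mean(before, after)
# Negative SRM: proteinuria fell; |SRM| >= 0.8 is conventionally "large".
```

Comparing the SRM computed from two instruments on the same patients, as the study does with 24H-PCR versus spot PCR, shows which measure registers the same clinical change more strongly.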

  11. Evacuation decision-making: process and uncertainty

    International Nuclear Information System (INIS)

    Mileti, D.; Sorensen, J.; Bogard, W.

    1985-09-01

    The purpose was to describe the processes of evacuation decision-making, identify and document uncertainties in that process and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are that all levels of government, including federal agencies, experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided the grounds for liability although few legal actions have ensued. Finally it is concluded that if liability for evacuations is assumed by the federal government, the concept of a "precautionary" evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs

  12. Identifying most important skills for PhD students in Food Science and Technology: a comparison between industry and academic stakeholders

    Directory of Open Access Journals (Sweden)

    Chelo González-Martínez

    2015-10-01

    Nowadays, there is an increasing need for new skills among PhD students to face future labour market prospects. PhD graduates must have qualities that are attractive not only in academia but also outside it: in both manufacturing and service-oriented enterprises, in small innovative companies, and in the civil service and public administration, among others. Knowing the needs of these future employees is of great importance for improving their personal and academic formation. The aim of this work was, in the framework of the EC-funded ISEKI_Food 4 network, to evaluate the most desirable specific and soft skills that PhD students should acquire by the end of their doctoral studies. To this aim, several surveys were conducted and sent to the different stakeholders (academia and food industry partners) in order to collect the information needed. Results showed that competences related to research skills and techniques, research management, personal effectiveness and communication skills were considered to be the most valuable skills to be acquired by our PhD students to meet the future needs of the labour market. The importance of these skills was appreciated differently depending on the stakeholder. To sum up, some recommendations to integrate such valuable skills into the curricula of PhD students are given.

  13. Uncertainty Communication. Issues and good practice

    International Nuclear Information System (INIS)

    Kloprogge, P.; Van der Sluijs, J.; Wardekker, A.

    2007-12-01

    In 2003 the Netherlands Environmental Assessment Agency (MNP) published the RIVM/MNP Guidance for Uncertainty Assessment and Communication. The Guidance assists in dealing with uncertainty in environmental assessments. Dealing with uncertainty is essential because assessment results regarding complex environmental issues are of limited value if the uncertainties have not been taken into account adequately. A careful analysis of uncertainties in an environmental assessment is required, but even more important is the effective communication of these uncertainties in the presentation of assessment results. The Guidance yields rich and differentiated insights in uncertainty, but the relevance of this uncertainty information may vary across audiences and uses of assessment results. Therefore, the reporting of uncertainties is one of the six key issues that is addressed in the Guidance. In practice, users of the Guidance felt a need for more practical assistance in the reporting of uncertainty information. This report explores the issue of uncertainty communication in more detail, and contains more detailed guidance on the communication of uncertainty. In order to make this a 'stand-alone' document, several questions that are mentioned in the detailed Guidance have been repeated here. This document thus has some overlap with the detailed Guidance. Part 1 gives a general introduction to the issue of communicating uncertainty information. It offers guidelines for (fine)tuning the communication to the intended audiences and context of a report, discusses how readers of a report tend to handle uncertainty information, and ends with a list of criteria that uncertainty communication needs to meet to increase its effectiveness. Part 2 helps writers to analyze the context in which communication takes place, and helps to map the audiences and their information needs. It further helps to reflect upon anticipated uses and possible impacts of the uncertainty information on the...

  14. Post-hoc principal component analysis on a largely illiterate elderly population from North-west India to identify important elements of mini-mental state examination

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Raina

    2016-01-01

    Background: Mini-mental state examination (MMSE) scale measures cognition using specific elements that can be isolated, defined, and subsequently measured. This study was conducted with the aim to analyze the factorial structure of MMSE in a largely illiterate, elderly population in India and to reduce the number of variables to a few meaningful and interpretable combinations. Methodology: Principal component analysis (PCA) was performed post-hoc on the data generated by a research project conducted to estimate the prevalence of dementia in four geographically defined habitations in Himachal Pradesh state of India. Results: Questions on orientation and registration account for a high percentage of cumulative variance in comparison to other questions. Discussion: The PCA conducted on the data derived from a largely illiterate population reveals that the most important components to consider for the estimation of cognitive impairment in illiterate Indian population are temporal orientation, spatial orientation, and immediate memory.
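At its core, the PCA used in this record asks how much of the total variance the leading component explains. For two variables the leading eigenvalue of the covariance matrix has a closed form, which keeps the sketch dependency-free. The per-subject scores below are hypothetical, not the study's data:

```python
import math
import statistics

def leading_component_2d(xs, ys):
    """Largest eigenvalue of the 2x2 covariance matrix (closed form)
    and the share of total variance it explains."""
    n = len(xs)
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cxx = statistics.variance(xs)
    cyy = statistics.variance(ys)
    cxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    tr, det = cxx + cyy, cxx * cyy - cxy ** 2
    lam1 = tr / 2 + math.sqrt(tr ** 2 / 4 - det)  # largest eigenvalue
    return lam1, lam1 / tr                        # eigenvalue, variance share

# Hypothetical per-subject scores on two MMSE elements
orientation = [9, 7, 10, 5, 8, 6, 9, 4]
registration = [3, 2, 3, 1, 3, 2, 3, 1]
lam1, explained = leading_component_2d(orientation, registration)
```

When the two elements move together, as here, one component captures nearly all the variance, which is the sense in which the study's orientation and registration items "account for a high percentage of cumulative variance".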

  15. Post-hoc principal component analysis on a largely illiterate elderly population from North-west India to identify important elements of mini-mental state examination.

    Science.gov (United States)

    Raina, Sunil Kumar; Chander, Vishav; Raina, Sujeet; Grover, Ashoo

    2016-01-01

    Mini-mental state examination (MMSE) scale measures cognition using specific elements that can be isolated, defined, and subsequently measured. This study was conducted with the aim to analyze the factorial structure of MMSE in a largely illiterate, elderly population in India and to reduce the number of variables to a few meaningful and interpretable combinations. Principal component analysis (PCA) was performed post-hoc on the data generated by a research project conducted to estimate the prevalence of dementia in four geographically defined habitations in Himachal Pradesh state of India. Questions on orientation and registration account for a high percentage of cumulative variance in comparison to other questions. The PCA conducted on the data derived from a largely illiterate population reveals that the most important components to consider for the estimation of cognitive impairment in illiterate Indian population are temporal orientation, spatial orientation, and immediate memory.

  16. Bipolar disorder: The importance of clinical assessment in identifying prognostic factors - An Audit. Part 1: An analysis of potential prognostic factors.

    Science.gov (United States)

    Verdolini, Norma; Dean, Jonathon; Elisei, Sandro; Quartesan, Roberto; Zaman, Rashid; Agius, Mark

    2014-11-01

    Prognostic factors of bipolar disorder must be identified to assist in staging and treatment, and this may be done primarily during the initial psychiatric assessment. In fact, most of the prognostic factors, which determine disease outcome, could be detected from simple but often-unrecorded questions asked during the psychiatric clinic visit. We collected data from the clinical notes of 70 bipolar outpatients seen at the initial psychiatric assessment clinic about socio-demographic and clinical factors to determine whether various factors had relevance to prevalence, prognosis, or outcome. The sample comprised 16 bipolar I (22.9%) and 54 bipolar II (77.1%) outpatients; a psychiatric comorbidity was noted in 26 patients (37.1%). 60.9% (42 patients) reported anxiety features and 12 patients (17.6%) were noted to have obsessive-compulsive characteristics. Percentages reported in our results are of the sample for which the data was available. Anhedonia is a depressive feature that was present in most of the population where this data was available (92.2%, 59 patients) and 81.8% (54 patients) reported suicidal thoughts during a depressive episode. 74.6% (47 patients) had a family history of bipolar disorder, depression, suicide or psychosis. 27 patients (39.7%) reported current alcohol use and 14 patients (22.6%) current illicit drug use. A comparison of 10 prognostic factors found that only the correlations between current illicit drug use/previous illicit drug use (χ² = 11.471), current alcohol use/previous alcohol use (χ² = 31.510), ... alcohol use (χ² = 5.071, P = 0.023) and previous alcohol use/family history (χ² = 4.309, P = 0.037) were statistically significant. 17 patients (24.3%) of the 70 bipolar patients were assigned to a care coordinator; we have evaluated the possible differences between the patients with or without a care coordinator on the basis of the presence of 10 possible prognostic factors and found no statistically significant differences between...

  17. Inventories and sales uncertainty

    OpenAIRE

    Caglayan, M.; Maioli, S.; Mateut, S.

    2011-01-01

    We investigate the empirical linkages between sales uncertainty and firms' inventory investment behavior while controlling for firms' financial strength. Using large panels of manufacturing firms from several European countries, we find that higher sales uncertainty leads to larger stocks of inventories. We also identify an indirect effect of sales uncertainty on inventory accumulation through the financial strength of firms. Our results provide evidence that financial strength mitigates the a...

  18. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    Science.gov (United States)

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory of recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective of uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  19. Improving a full-text search engine: the importance of negation detection and family history context to identify cases in a biomedical data warehouse.

    Science.gov (United States)

    Garcelon, Nicolas; Neuraz, Antoine; Benoit, Vincent; Salomon, Rémi; Burgun, Anita

    2017-05-01

    The repurposing of electronic health records (EHRs) can improve clinical and genetic research for rare diseases. However, significant information in rare disease EHRs is embedded in the narrative reports, which contain many negated clinical signs and family medical history. This paper presents a method to detect family history and negation in narrative reports and evaluates its impact on selecting populations from a clinical data warehouse (CDW). We developed a pipeline to process 1.6 million reports from multiple sources. This pipeline is part of the load process of the Necker Hospital CDW. We identified patients with "Lupus and diarrhea," "Crohn's and diabetes," and "NPHP1" from the CDW. The overall precision, recall, specificity, and F-measure were 0.85, 0.98, 0.93, and 0.91, respectively. The proposed method generates a highly accurate identification of cases from a CDW of rare disease EHRs. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
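
    The negation and family-history scoping that such a pipeline performs can be sketched in a NegEx-like fashion. The triggers, context window, and function below are illustrative stand-ins, not the authors' actual lexicon or code:

```python
import re

# Minimal sketch of negation and family-history scoping, loosely in the
# spirit of NegEx; trigger lists and window size are illustrative only.
NEGATION_TRIGGERS = re.compile(r"\b(no|denies|without|absence of)\b", re.I)
FAMILY_TRIGGERS = re.compile(r"\b(mother|father|brother|sister|family history)\b", re.I)

def classify_mention(sentence: str, concept: str) -> str:
    """Label a concept mention as 'negated', 'family', 'affirmed', or 'absent'."""
    idx = sentence.lower().find(concept.lower())
    if idx < 0:
        return "absent"
    window = sentence[max(0, idx - 40):idx]  # text preceding the mention
    if NEGATION_TRIGGERS.search(window):
        return "negated"
    if FAMILY_TRIGGERS.search(sentence):
        return "family"
    return "affirmed"

print(classify_mention("Patient denies diarrhea.", "diarrhea"))      # negated
print(classify_mention("Family history of lupus.", "lupus"))         # family
print(classify_mention("Chronic diarrhea since March.", "diarrhea")) # affirmed
```

    Filtering out "negated" and "family" mentions before case selection is what lifts precision when querying a CDW for phenotypes like "Lupus and diarrhea".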

  20. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and deficiencies in phenomena understanding. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty.

  1. Proteome analysis identifies the Dpr protein of Streptococcus mutans as an important factor in the presence of early streptococcal colonizers of tooth surfaces.

    Directory of Open Access Journals (Sweden)

    Akihiro Yoshida

    Full Text Available Oral streptococci are primary colonizers of tooth surfaces and Streptococcus mutans is the principal causative agent of dental caries in humans. A number of proteins are involved in the formation of monospecies biofilms by S. mutans. This study analyzed the protein expression profiles of S. mutans biofilms formed in the presence or absence of S. gordonii, a pioneer colonizer of the tooth surface, by two-dimensional gel electrophoresis (2-DE). After identifying S. mutans proteins by mass spectrometric analysis, their expression in the presence of S. gordonii was analyzed. S. mutans was inoculated with or without S. gordonii DL1. The two species were compartmentalized using 0.2-μm Anopore membranes. The biofilms on polystyrene plates were harvested, and the solubilized proteins were separated by 2-DE. When S. mutans biofilms were formed in the presence of S. gordonii, the peroxide resistance protein Dpr of the former showed 4.3-fold increased expression compared to biofilms that developed in the absence of the pioneer colonizer. In addition, we performed a competition assay using S. mutans antioxidant protein mutants together with S. gordonii and other initial colonizers. Growth of the dpr-knockout S. mutans mutant was significantly inhibited by S. gordonii, as well as by S. sanguinis. Furthermore, a cell viability assay revealed that the viability of the dpr-defective mutant was significantly attenuated compared to the wild-type strain when co-cultured with S. gordonii. Therefore, these results suggest that Dpr might be one of the essential proteins for S. mutans survival on teeth in the presence of early colonizing oral streptococci.

  2. Proteome Analysis Identifies the Dpr Protein of Streptococcus mutans as an Important Factor in the Presence of Early Streptococcal Colonizers of Tooth Surfaces

    Science.gov (United States)

    Yoshida, Akihiro; Niki, Mamiko; Yamamoto, Yuji; Yasunaga, Ai; Ansai, Toshihiro

    2015-01-01

    Oral streptococci are primary colonizers of tooth surfaces and Streptococcus mutans is the principal causative agent of dental caries in humans. A number of proteins are involved in the formation of monospecies biofilms by S. mutans. This study analyzed the protein expression profiles of S. mutans biofilms formed in the presence or absence of S. gordonii, a pioneer colonizer of the tooth surface, by two-dimensional gel electrophoresis (2-DE). After identifying S. mutans proteins by mass spectrometric analysis, their expression in the presence of S. gordonii was analyzed. S. mutans was inoculated with or without S. gordonii DL1. The two species were compartmentalized using 0.2-μm Anopore membranes. The biofilms on polystyrene plates were harvested, and the solubilized proteins were separated by 2-DE. When S. mutans biofilms were formed in the presence of S. gordonii, the peroxide resistance protein Dpr of the former showed 4.3-fold increased expression compared to biofilms that developed in the absence of the pioneer colonizer. In addition, we performed a competition assay using S. mutans antioxidant protein mutants together with S. gordonii and other initial colonizers. Growth of the dpr-knockout S. mutans mutant was significantly inhibited by S. gordonii, as well as by S. sanguinis. Furthermore, a cell viability assay revealed that the viability of the dpr-defective mutant was significantly attenuated compared to the wild-type strain when co-cultured with S. gordonii. Therefore, these results suggest that Dpr might be one of the essential proteins for S. mutans survival on teeth in the presence of early colonizing oral streptococci. PMID:25816242

  3. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    Full Text Available The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
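
    The information-theoretic measures the paper applies can be estimated directly from model realisations. A minimal sketch, using the identity I(X;Y) = H(X) + H(Y) − H(X,Y) on made-up binary "subregion" samples (the data below are illustrative, not the paper's):

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of samples."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Two 'subregions' whose discretised layer depths agree on most realisations
x = [0, 0, 1, 1, 0, 1, 0, 1]
y = [0, 0, 1, 1, 0, 1, 1, 0]
print(f"H(X)   = {entropy(x):.3f} bits")
print(f"I(X;Y) = {mutual_information(x, y):.3f} bits")
```

    A positive I(X;Y) means observing one subregion would reduce the uncertainty of the other; in the paper's setting the samples are drawn from multiple simulated model realisations rather than hand-written lists.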

  4. Foxtail millet NF-Y families: genome-wide survey and evolution analyses identified two functional genes important in abiotic stresses

    Directory of Open Access Journals (Sweden)

    Zhi-Juan Feng

    2015-12-01

    Full Text Available It was reported that Nuclear Factor Y (NF-Y) genes were involved in abiotic stress in plants. Foxtail millet (Setaria italica), an elite stress-tolerant crop, provided an impetus for the investigation of the NF-Y families in abiotic responses. In the present study, a total of 39 NF-Y genes were identified in foxtail millet. Synteny analyses suggested that foxtail millet NF-Y genes had experienced rapid expansion and strong purifying selection during the process of plant evolution. De novo transcriptome assembly of foxtail millet revealed 11 drought up-regulated NF-Y genes. SiNF-YA1 and SiNF-YB8 were highly activated in leaves and/or roots by drought and salt stresses. Abscisic acid (ABA) and H2O2 played positive roles in the induction of SiNF-YA1 and SiNF-YB8 under stress treatments. Transient luciferase (LUC) expression assays revealed that SiNF-YA1 and SiNF-YB8 could activate the LUC gene driven by the tobacco (Nicotiana tabacum) NtERD10, NtLEA5, NtCAT, NtSOD or NtPOD promoter under normal or stress conditions. Overexpression of SiNF-YA1 enhanced drought and salt tolerance by activating the stress-related genes NtERD10 and NtCAT1 and by maintaining relatively stable relative water content (RWC) and contents of chlorophyll, superoxide dismutase (SOD), peroxidase (POD), catalase (CAT) and malondialdehyde (MDA) in transgenic lines under stresses. SiNF-YB8 regulated expression of NtSOD, NtPOD, NtLEA5 and NtERD10 and conferred relatively high RWC and chlorophyll contents and low MDA content, resulting in drought and osmotic tolerance in transgenic lines under stresses. Therefore, SiNF-YA1 and SiNF-YB8 could activate stress-related genes and improve physiological traits, resulting in tolerance to abiotic stresses in plants. All these results will facilitate functional characterization of foxtail millet NF-Ys in future studies.

  5. Sensitivity and uncertainty analyses in aging risk-based prioritizations

    International Nuclear Information System (INIS)

    Hassan, M.; Uryas'ev, S.; Vesely, W.E.

    1993-01-01

    Aging risk evaluations of nuclear power plants using Probabilistic Risk Analyses (PRAs) involve assessments of the impact of aging structures, systems, and components (SSCs) on plant core damage frequency (CDF). These assessments can be used to prioritize the contributors to aging risk, reflecting the relative risk potential of the SSCs. Aging prioritizations are important for identifying the SSCs contributing most to plant risk and can provide a systematic basis on which aging risk control and management strategies for a plant can be developed. However, these prioritizations are subject to variabilities arising from uncertainties in data and/or from various modeling assumptions. The objective of this paper is to present an evaluation of the sensitivity of aging prioritizations of active components to uncertainties in aging risk quantifications. Approaches for robust prioritization of SSCs that are less susceptible to these uncertainties are also presented.

  6. Quantifying chemical uncertainties in simulations of the ISM

    Science.gov (United States)

    Glover, Simon

    2018-06-01

    The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically structured, turbulent ISM, as well as enabling us to generate detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions where improved chemical data are most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.

  7. Uncertainty covariances in robotics applications

    International Nuclear Information System (INIS)

    Smith, D.L.

    1984-01-01

    The application of uncertainty covariance matrices in the analysis of robot trajectory errors is explored. First, relevant statistical concepts are reviewed briefly. Then, a simple, hypothetical robot model is considered to illustrate methods for error propagation and performance test data evaluation. The importance of including error correlations is emphasized

  8. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    International Nuclear Information System (INIS)

    Zangl, Hubert; Zine-Zine, Mariam; Hoermaier, Klaus

    2015-01-01

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may only be identified late in the design process and then lead to additional costs. Although numerous tools exist to support uncertainty calculation, reasons for their limited usage in early design phases may be low awareness of the existence of the tools and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education, we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students.

  9. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    2007-01-01

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems

  10. Wastewater treatment modelling: dealing with uncertainties

    DEFF Research Database (Denmark)

    Belia, E.; Amerlinck, Y.; Benedetti, L.

    2009-01-01

    This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation...

  11. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
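
    The Monte Carlo propagation the guide mentions can be sketched as follows: sample the uncertain inputs, run the model on each draw, and summarise the spread of the output. The model function and input distributions here are illustrative assumptions, not taken from the report:

```python
import random
import statistics

def model(k, t):
    """Stand-in for a code's calculated quantity; illustrative only."""
    return k * t ** 2

random.seed(42)
# Uncertain inputs as normal distributions: k ~ N(2.0, 0.1), t ~ N(5.0, 0.2)
outputs = [model(random.gauss(2.0, 0.1), random.gauss(5.0, 0.2))
           for _ in range(20_000)]

mean = statistics.fmean(outputs)
std = statistics.stdev(outputs)
print(f"output = {mean:.2f} +/- {std:.2f} (1 sigma)")
```

    The sample standard deviation here estimates the random component of the output uncertainty; systematic (model-form) uncertainty must, as the guide notes, be estimated separately and combined with it.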

  12. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  13. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields, as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point-collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.

  14. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal Mushtaq

    2011-10-01

    Full Text Available A growing body of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  15. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time

  16. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
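
    The root-sum-square combination of the purity, mass, and volume contributions described above can be sketched GUM-style for a standard prepared as c = m·P/V. All numeric values below are illustrative, not vendor data:

```python
import math

# Hedged sketch of a combined uncertainty budget for a solution standard
# with concentration c = m * P / V (mass, purity, volume); illustrative values.
m, u_m = 10.00e-3, 0.02e-3   # g, mass and balance/weighing uncertainty
P, u_P = 0.998, 0.002        # mass-fraction purity (residual water/solvent/inorganics)
V, u_V = 10.00e-3, 0.01e-3   # L, volume and flask/density uncertainty

c = m * P / V                 # certified concentration, g/L
# For a pure product/quotient model, relative standard uncertainties add in quadrature
u_rel = math.sqrt((u_m / m) ** 2 + (u_P / P) ** 2 + (u_V / V) ** 2)
U = 2 * u_rel * c             # expanded uncertainty at coverage factor k = 2
print(f"c = {c:.4f} g/L, U(k=2) = {U:.4f} g/L")
```

    Comparing which of these terms a vendor actually includes (purity only, or purity plus gravimetric and volumetric terms) is exactly what makes Certificates of Analysis hard to compare across suppliers.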

  17. Uncertainties in radioecological assessment models

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables

  18. Uncertainty modeling and decision support

    International Nuclear Information System (INIS)

    Yager, Ronald R.

    2004-01-01

    We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function.
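
    A Dempster-Shafer belief structure of the kind discussed can be sketched as a mass function over subsets of a frame of discernment, from which belief (total committed support) and plausibility (support not contradicting the event) follow. The frame and masses below are illustrative:

```python
# Hedged sketch of a Dempster-Shafer belief structure; the frame of
# discernment and the mass assignments are illustrative, not from the paper.
FRAME = frozenset({"low", "medium", "high"})
mass = {
    frozenset({"low"}): 0.3,
    frozenset({"low", "medium"}): 0.4,
    FRAME: 0.3,               # mass on the whole frame represents ignorance
}

def belief(event):
    """Sum of masses of focal elements entirely inside the event."""
    return sum(m for focal, m in mass.items() if focal <= event)

def plausibility(event):
    """Sum of masses of focal elements that intersect the event."""
    return sum(m for focal, m in mass.items() if focal & event)

evt = frozenset({"low", "medium"})
print(f"Bel = {belief(evt):.2f}, Pl = {plausibility(evt):.2f}")
```

    The interval [Bel, Pl] is where the decision maker's subjective attitude enters: a pessimistic analyst acts on Bel, an optimistic one on Pl.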

  19. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    A reusable launch vehicle (RLV) has the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and its flight environment is highly complicated and intensely changeable. Its model therefore carries large uncertainty, which makes the nominal system quite different from the real system, so studying the influence of these uncertainties on the stability of the control system is of great significance for controller design. In order to improve the performance of an RLV, this paper proposes an approach for analyzing the influence of model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The different factors that introduce uncertainties during model building are then analyzed and summarized, after which the model uncertainties are expressed according to the additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to show how much the uncertainty factors influence the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller of this kind of aircraft (like an RLV).
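
    The boundary model described above, i.e. taking the additive uncertainty matrix's maximum singular value as its worst-case gain, can be sketched with a pure-stdlib power iteration on ΔᵀΔ. The 2×2 perturbation matrix below is illustrative, not the paper's:

```python
import math
import random

def matvec(M, v):
    """Multiply matrix M (list of rows) by vector v."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

def transpose(M):
    return [list(col) for col in zip(*M)]

def max_singular_value(M, iters=200):
    """Estimate the largest singular value of M by power iteration on M^T M."""
    Mt = transpose(M)
    v = [random.random() + 0.1 for _ in M[0]]  # positive start vector
    for _ in range(iters):
        w = matvec(Mt, matvec(M, v))           # apply M^T M
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]              # renormalise each step
    w = matvec(M, v)
    return math.sqrt(sum(x * x for x in w))    # ||M v|| at the top direction

delta = [[0.05, 0.01],
         [0.00, 0.02]]                          # illustrative additive model error
random.seed(0)
print(f"max singular value ~= {max_singular_value(delta):.4f}")
```

    In a robustness analysis this scalar bounds how much the perturbed plant can deviate from the nominal one in the worst direction.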

  20. "Maybe the Algae Was from the Filter": Maybe and Similar Modifiers as Mediational Tools and Indicators of Uncertainty and Possibility in Children's Science Talk

    Science.gov (United States)

    Kirch, Susan A.; Siry, Christina A.

    2012-01-01

    Uncertainty is an essential component of scientific inquiry and it also permeates our daily lives. Understanding how to identify, evaluate, resolve and live in the presence of uncertainty is important for decision-making strategies and engaging in transformative actions. In contrast, confidence and certainty are prized in elementary school…

  1. Section summary: Uncertainty and design considerations

    Science.gov (United States)

    Stephen Hagen

    2013-01-01

    Well planned sampling designs and robust approaches to estimating uncertainty are critical components of forest monitoring. The importance of uncertainty estimation increases as deforestation and degradation issues become more closely tied to financing incentives for reducing greenhouse gas emissions in the forest sector. Investors like to know risk and risk is tightly...

  2. Gamma-Ray Telescope and Uncertainty Principle

    Science.gov (United States)

    Shivalingaswamy, T.; Kagali, B. A.

    2012-01-01

    Heisenberg's Uncertainty Principle is one of the important basic principles of quantum mechanics. In most of the books on quantum mechanics, this uncertainty principle is generally illustrated with the help of a gamma ray microscope, wherein neither the image formation criterion nor the lens properties are taken into account. Thus a better…

  3. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and the sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  4. Ruminations On NDA Measurement Uncertainty Compared TO DA Uncertainty

    International Nuclear Information System (INIS)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-01-01

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and the sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  5. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  6. Model-specification uncertainty in future forest pest outbreak.

    Science.gov (United States)

    Boulanger, Yan; Gray, David R; Cooke, Barry J; De Grandpré, Louis

    2016-04-01

    Climate change will modify forest pest outbreak characteristics, although there are disagreements regarding the specifics of these changes. A large part of this variability may be attributed to model specifications. As a case study, we developed a consensus model predicting spruce budworm (SBW, Choristoneura fumiferana [Clem.]) outbreak duration using two different predictor data sets and six different correlative methods. The model was used to project outbreak duration and the uncertainty associated with using different data sets and correlative methods (=model-specification uncertainty) for 2011-2040, 2041-2070 and 2071-2100, according to three forcing scenarios (RCP 2.6, RCP 4.5 and RCP 8.5). The consensus model showed very high explanatory power and low bias. The model projected a more pronounced northward shift and decrease in outbreak duration under the RCP 8.5 scenario. However, variation in single-model projections increases with time, making future projections highly uncertain. Notably, the magnitude of the shifts in northward expansion, overall outbreak duration and the patterns of outbreak duration at the southern edge were highly variable according to the predictor data set and correlative method used. We also demonstrated that variation in forcing scenarios contributed only slightly to the uncertainty of model projections compared with the two sources of model-specification uncertainty. Our approach helped to quantify model-specification uncertainty in future forest pest outbreak characteristics. It may contribute to sounder decision-making by acknowledging the limits of the projections and help to identify areas where model-specification uncertainty is high. As such, we further stress that this uncertainty should be strongly considered when making forest management plans, notably by adopting adaptive management strategies so as to reduce future risks. © 2015 Her Majesty the Queen in Right of Canada Global Change Biology © 2015 Published by John

  7. Methodology for qualitative uncertainty assessment of climate impact indicators

    Science.gov (United States)

    Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela

    2016-04-01

    The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of developing a methodology for assessing the uncertainties related to climate impact indicators. Existing climate data portals mainly treat the uncertainties in two ways: either they provide generic guidance, or they express the quantifiable fraction of the uncertainty with statistical measures. However, none of the climate data portals gives users qualitative guidance on how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. Therefore, we aim to provide an uncertainty assessment that gives users climate-impact-indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of different sources of uncertainty associated with a specific climate impact indicator and how these sources affect the overall 'degree of confidence' of this respective indicator. To meet users' requirements for the effective communication of uncertainties, their feedback was incorporated during the development of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As a visual analysis method, we apply the Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. and Jacob, D.: Robustness of Ensemble Climate Projections

  8. Adult head CT scans: the uncertainties of effective dose estimates

    International Nuclear Information System (INIS)

    Gregory, Kent J.; Bibbo, Giovanni; Pattison, John E.

    2008-01-01

    Full Text: CT scanning is a high dose imaging modality. Effective dose estimates from CT scans can provide important information to patients and medical professionals. For example, medical practitioners can use the dose to estimate the risk to the patient, and judge whether this risk is outweighed by the benefits of the CT examination, while radiographers can gauge the effect of different scanning protocols on the patient effective dose, and take this into consideration when establishing routine scan settings. Dose estimates also form an important part of epidemiological studies examining the health effects of medical radiation exposures on the wider population. Medical physicists have been devoting significant effort towards estimating patient radiation doses from diagnostic CT scans for some years. The question arises: How accurate are these effective dose estimates? The need for a greater understanding and improvement of the uncertainties in CT dose estimates is now gaining recognition as an important issue (BEIR VII 2006). This study is an attempt to analyse and quantify the uncertainty components relating to effective dose estimates from adult head CT examinations that are calculated with four commonly used methods. The dose estimation methods analysed are the Nagel method, the ImpaCT method, the Wellhoefer method and the Dose-Length Product (DLP) method. The analysis of the uncertainties was performed in accordance with the International Standards Organisation's Guide to the Expression of Uncertainty in Measurement as discussed in Gregory et al (Australas. Phys. Eng. Sci. Med., 28: 131-139, 2005). The uncertainty components vary, depending on the method used to derive the effective dose estimate. Uncertainty components in this study include the statistical and other errors from Monte Carlo simulations, uncertainties in the CT settings and positions of patients in the CT gantry, calibration errors from pencil ionization chambers, the variations in the organ

  9. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

  10. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
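    The Monte Carlo approach described above can be sketched in a few lines: perturb the observed record within an assumed data-uncertainty model, recompute the signature, and summarise the spread. This is a minimal illustration assuming NumPy; the synthetic flow record and the ±10% multiplicative rating-curve error are hypothetical assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily flow record (m^3/s); in practice this would be an
# observed series from a catchment such as the Mahurangi or the Brue.
flow = rng.gamma(shape=2.0, scale=5.0, size=365)

def q95_low_flow(q):
    """Signature: the flow exceeded 95% of the time (a low-flow index)."""
    return np.percentile(q, 5)

# Monte Carlo: perturb the record with an assumed +/-10% multiplicative
# rating-curve error and recompute the signature for each realisation.
n_mc = 2000
samples = np.array([
    q95_low_flow(flow * rng.uniform(0.9, 1.1, size=flow.size))
    for _ in range(n_mc)
])
lower, upper = np.percentile(samples, [2.5, 97.5])
```

    The interval (`lower`, `upper`) is then a data-uncertainty band on the signature, of the kind that can be compared across signatures and catchments.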

  11. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e., the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS), and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP 5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges.

  12. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    Science.gov (United States)

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were under the mid points. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  13. LOFT differential pressure uncertainty analysis

    International Nuclear Information System (INIS)

    Evans, R.P.; Biladeau, G.L.; Quinn, P.A.

    1977-03-01

    A performance analysis of the LOFT differential pressure (ΔP) measurement is presented. Along with completed descriptions of test programs and theoretical studies that have been conducted on the ΔP, specific sources of measurement uncertainty are identified, quantified, and combined to provide an assessment of the ability of this measurement to satisfy the SDD 1.4.1C (June 1975) requirement of measurement of differential pressure
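    The "identified, quantified, and combined" step above can be sketched as follows. This is a minimal illustration of the standard quadrature (root-sum-square) combination of independent uncertainty components; the component names and magnitudes are hypothetical, not values from the LOFT analysis.

```python
import math

# Hypothetical independent uncertainty components of a differential
# pressure measurement, each expressed as a standard uncertainty in kPa.
components = {
    "transducer_calibration": 0.30,
    "signal_conditioning": 0.15,
    "data_acquisition": 0.10,
}

# Independent sources combine in quadrature (root-sum-square).
combined = math.sqrt(sum(u ** 2 for u in components.values()))

# Expanded uncertainty at roughly 95% coverage (coverage factor k = 2).
expanded = 2.0 * combined
```

    Listing the components explicitly, as in the dictionary above, is what makes it possible to check the combined result against a requirement such as the SDD 1.4.1C specification mentioned in the abstract.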

  14. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    Science.gov (United States)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. Also, we seek to identify the data types that best help reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander, in Southwest Germany. The Steinlach River meander is an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as 'virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Then, we conduct Monte-Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that uncertainty in HETT is relatively small for early times and increases with transit time; that uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; that introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias; and that hydraulic head observations alone cannot constrain the uncertainty of HETT, whereas an estimate of hyporheic exchange flux proves more effective at reducing it. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model ('virtual reality') is then developed based on that conceptual model

  15. Uncertainty in social dilemmas

    NARCIS (Netherlands)

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size

  16. Uncertainty and Climate Change

    OpenAIRE

    Berliner, L. Mark

    2003-01-01

    Anthropogenic, or human-induced, climate change is a critical issue in science and in the affairs of humankind. Though the target of substantial research, the conclusions of climate change studies remain subject to numerous uncertainties. This article presents a very brief review of the basic arguments regarding anthropogenic climate change with particular emphasis on uncertainty.

  17. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  18. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and industry met to discuss the new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were: methods for studying the propagation of uncertainties, sensitivity analysis, nuclear data covariances, and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.

  19. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost

  20. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties with the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA is a relatively immature field. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas and the synthesis of them into a complete PRA are discussed
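    One of the sampling-based techniques in the family this review covers is rank-correlation screening: sample the inputs, run the model, and rank inputs by the rank correlation of each input with the output. The sketch below is a generic illustration of that idea, assuming NumPy; the three-input model is hypothetical and is not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model whose output imprecision is dominated by x1.
n = 5000
x1 = rng.uniform(0, 1, n)
x2 = rng.uniform(0, 1, n)
x3 = rng.uniform(0, 1, n)
y = 5.0 * x1 + 1.0 * x2 + 0.1 * x3 + rng.normal(0, 0.1, n)

def spearman(a, b):
    """Rank correlation: robust to the monotone nonlinearities
    common in risk models, which is why rank-based measures are
    often preferred in sampling-based sensitivity analysis."""
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

importance = {name: spearman(x, y)
              for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]}
ranking = sorted(importance, key=importance.get, reverse=True)
```

    The ranking identifies the major contributors to output imprecision, which is exactly the stated goal of sensitivity analysis in this context.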

  1. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.

  2. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  3. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
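    A classical '95 percent interval of measurement uncertainty' of the kind the book discusses can be computed from repeated readings as follows. This is a minimal sketch using only the Python standard library; the readings are hypothetical, and the t quantile is the tabulated value for 4 degrees of freedom.

```python
import math
import statistics

# Hypothetical repeated measurements of the same quantity (e.g. a
# length in mm); the unknown 'target value' is what the interval
# is intended to cover.
readings = [10.02, 9.98, 10.05, 10.01, 9.99]

n = len(readings)
mean = statistics.fmean(readings)
s = statistics.stdev(readings)      # sample standard deviation
u = s / math.sqrt(n)                # standard uncertainty of the mean
t_crit = 2.776                      # t quantile, 97.5%, n - 1 = 4 dof
half_width = t_crit * u
interval = (mean - half_width, mean + half_width)
```

    The use of the t distribution rather than a fixed factor of 2 reflects the extra uncertainty from estimating the standard deviation from only a few readings.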

  4. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.

  5. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.; Kniss, J.; Riesenfeld, R.; Johnson, C.R.

    2010-01-01

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.

  6. Uncertainty propagation in nuclear forensics

    International Nuclear Information System (INIS)

    Pommé, S.; Jerome, S.M.; Venchiarutti, C.

    2014-01-01

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent–daughter pairs and the need for more precise half-life data is examined. - Highlights: • Uncertainty propagation formulae for age dating with nuclear chronometers. • Applied to parent–daughter pairs used in nuclear forensics. • Investigated need for better half-life data
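    For the special case of a parent decaying to an effectively stable daughter, the atom ratio gives R = N_daughter/N_parent = exp(λt) - 1, so t = ln(1 + R)/λ, and first-order propagation of the uncertainties in R and in the half-life is straightforward. The sketch below illustrates that simplified case only; it is not the general parent-daughter treatment of the paper, and the numerical inputs are hypothetical.

```python
import math

def age_from_ratio(R, half_life, u_R, u_half_life):
    """Age since chemical separation, for a parent with an
    effectively stable daughter: R = exp(lam * t) - 1, hence
    t = ln(1 + R) / lam with lam = ln(2) / half_life.
    First-order propagation of u_R and u_half_life."""
    lam = math.log(2.0) / half_life
    t = math.log(1.0 + R) / lam
    dt_dR = 1.0 / (lam * (1.0 + R))   # sensitivity to the atom ratio
    dt_dT = t / half_life             # t scales linearly with half-life
    u_t = math.hypot(dt_dR * u_R, dt_dT * u_half_life)
    return t, u_t

# Hypothetical chronometer: 30-year half-life parent, measured atom
# ratio with 1% relative uncertainty, half-life known to 0.1%.
t, u_t = age_from_ratio(R=0.26, half_life=30.0, u_R=0.0026, u_half_life=0.03)
```

    The half-life term in `u_t` illustrates the abstract's closing point: when the measured ratio is precise, the half-life uncertainty can become a limiting contribution, hence the need for better half-life data.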

  7. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    International Nuclear Information System (INIS)

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study a process of model audit is addressed on a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision making purposes

  8. Background and Qualification of Uncertainty Methods

    International Nuclear Information System (INIS)

    D'Auria, F.; Petruzzi, A.

    2008-01-01

    The evaluation of uncertainty constitutes the necessary supplement of Best Estimate calculations performed to understand accident scenarios in water cooled nuclear reactors. The need comes from the imperfection of computational tools on the one side and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. The paper reviews the salient features of two independent approaches for estimating uncertainties associated with predictions of complex system codes: the propagation of code input error and the propagation of the calculation output error constitute the key words identifying the methods of current interest for industrial applications. With the developed methods, uncertainty bands (both upper and lower) can be derived for any desired quantity of the transient of interest. In the second case, the uncertainty method is coupled with the thermal-hydraulic code to obtain a Code with the capability of Internal Assessment of Uncertainty, whose features are discussed in more detail.

  9. Coping with uncertainty in environmental impact assessments: Open techniques

    NARCIS (Netherlands)

    Chivatá Cárdenas, Ibsen; Halman, Johannes I.M.

    2016-01-01

    Uncertainty is virtually unavoidable in environmental impact assessments (EIAs). From the literature related to treating and managing uncertainty, we have identified specific techniques for coping with uncertainty in EIAs. Here, we have focused on basic steps in the decision-making process that take

  10. Sketching Uncertainty into Simulations.

    Science.gov (United States)

    Ribicic, H; Waser, J; Gurbat, R; Sadransky, B; Groller, M E

    2012-12-01

    In a variety of application areas, the use of simulation steering in decision making is limited at best. Research focusing on this problem suggests that most user interfaces are too complex for the end user. Our goal is to let users create and investigate multiple, alternative scenarios without the need for special simulation expertise. To simplify the specification of parameters, we move from a traditional manipulation of numbers to a sketch-based input approach. Users steer both numeric parameters and parameters with a spatial correspondence by sketching a change onto the rendering. Special visualizations provide immediate visual feedback on how the sketches are transformed into boundary conditions of the simulation models. Since uncertainty with respect to many intertwined parameters plays an important role in planning, we also allow the user to intuitively set up complete value ranges, which are then automatically transformed into ensemble simulations. The interface and the underlying system were developed in collaboration with experts in the field of flood management. The real-world data they have provided has allowed us to construct scenarios used to evaluate the system. These were presented to a variety of flood response personnel, and their feedback is discussed in detail in the paper. The interface was found to be intuitive and relevant, although a certain amount of training might be necessary.

  11. Pandemic influenza: certain uncertainties

    Science.gov (United States)

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  12. Physician Rating Websites: What Aspects Are Important to Identify a Good Doctor, and Are Patients Capable of Assessing Them? A Mixed-Methods Approach Including Physicians' and Health Care Consumers' Perspectives.

    Science.gov (United States)

    Rothenfluh, Fabia; Schulz, Peter J

    2017-05-01

    Physician rating websites (PRWs) offer health care consumers the opportunity to evaluate their doctor anonymously. However, physicians' professional training and experience create a vast knowledge gap in medical matters between physicians and patients. This raises ethical concerns about the relevance and significance of health care consumers' evaluation of physicians' performance. To identify the aspects physician rating websites should offer for evaluation, this study investigated the aspects of physicians and their practice relevant for identifying a good doctor, and whether health care consumers are capable of evaluating these aspects. In a first step, a Delphi study with physicians from 4 specializations was conducted, testing various indicators to identify a good physician. These indicators were theoretically derived from Donabedian, who classifies quality in health care into pillars of structure, process, and outcome. In a second step, a cross-sectional survey with health care consumers in Switzerland (N=211) was launched based on the indicators developed in the Delphi study. Participants were asked to rate the importance of these indicators to identify a good physician and whether they would feel capable to evaluate those aspects after the first visit to a physician. All indicators were ordered into a 4×4 grid based on evaluation and importance, as judged by the physicians and health care consumers. Agreement between the physicians and health care consumers was calculated applying Holsti's method. In the majority of aspects, physicians and health care consumers agreed on what facets of care were important and not important to identify a good physician and whether patients were able to evaluate them, yielding a level of agreement of 74.3%. The two parties agreed that the infrastructure, staff, organization, and interpersonal skills are both important for a good physician and can be evaluated by health care consumers. 
Technical skills of a doctor and outcomes
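
    Holsti's method, used above to quantify agreement between physicians and health care consumers, reduces for two raters judging the same items to the coefficient 2M/(N1 + N2). A minimal sketch (function name and data are ours):

```python
def holsti_agreement(coder_a, coder_b):
    """Holsti's coefficient of reliability: 2M / (N1 + N2), where M is the
    number of coding decisions the two coders agree on and N1, N2 are the
    numbers of decisions each coder made. For paired judgments of the same
    items this reduces to simple percentage agreement."""
    if len(coder_a) != len(coder_b):
        raise ValueError("coders must judge the same items")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 2.0 * matches / (len(coder_a) + len(coder_b))
```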

  13. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
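
    The core idea, propagating a fit-parameter covariance matrix linearly through a derived quantity, can be sketched without the uncertainties package as var(f) = J C Jᵀ with a finite-difference Jacobian (a generic illustration, not OMFIT code; the profile form below is merely tanh-like):

```python
import math

def propagate_covariance(f, params, cov, eps=1e-6):
    """First-order (linear) propagation of a parameter covariance matrix
    through a scalar function f: var(f) = J C J^T, with the Jacobian J
    estimated by central finite differences."""
    n = len(params)
    jac = []
    for i in range(n):
        hi = list(params)
        lo = list(params)
        h = eps * max(1.0, abs(params[i]))
        hi[i] += h
        lo[i] -= h
        jac.append((f(hi) - f(lo)) / (2.0 * h))
    var = sum(jac[i] * cov[i][j] * jac[j] for i in range(n) for j in range(n))
    return f(params), math.sqrt(max(var, 0.0))

def profile_at(r):
    """A tanh-like pedestal profile p0 * 0.5 * (1 - tanh((r - p1)/p2)),
    evaluated at radius r; an invented stand-in for a fitted kinetic profile."""
    return lambda p: p[0] * 0.5 * (1.0 - math.tanh((r - p[1]) / p[2]))
```

    Off-diagonal covariance entries are exactly where this differs from naive quadrature of independent errors, which is the point the abstract makes about covariant uncertainties.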

  14. [Dealing with diagnostic uncertainty in general practice].

    Science.gov (United States)

    Wübken, Magdalena; Oswald, Jana; Schneider, Antonius

    2013-01-01

    In general, the prevalence of diseases is low in primary care. Therefore, the positive predictive value of diagnostic tests is lower than in hospitals, where patients are highly selected. In addition, the patients present with milder forms of disease, and many diseases might hide behind the initial symptom(s). These facts lead to diagnostic uncertainty which is somewhat inherent to general practice. This narrative review discusses different sources of and reasons for uncertainty and strategies to deal with it in the context of the current literature. Fear of uncertainty correlates with higher diagnostic activity. The attitude towards uncertainty correlates with the choice of medical speciality by vocational trainees or medical students. An intolerance of uncertainty, which increases even as medicine makes steady progress, might partly explain the growing shortage of general practitioners. The bio-psycho-social context appears to be important to diagnostic decision-making. The effects of intuition and heuristics are investigated by cognitive psychologists. It is still unclear whether these aspects are prone to bias or useful, which might depend on the context of medical decisions. Good communication is of great importance to share uncertainty with the patients in a transparent way and to facilitate shared decision-making. Dealing with uncertainty should be seen as an important core component of general practice and needs to be investigated in more detail to improve the respective medical decisions. Copyright © 2013. Published by Elsevier GmbH.

  15. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    , and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing...... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  16. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

    Treatment of uncertainty, or in other words, reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency-based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have looked at particularly ingenious ways of handling imperfect information in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. Those methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information required by quantitative methods. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management (that is, recognition and evaluation of uncertainties associated with PA, followed by planning and implementation of measures to reduce them) is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem

  17. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using....... The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column......, the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce most critical uncertainties....

  18. Decay heat uncertainty quantification of MYRRHA

    Directory of Open Access Journals (Sweden)

    Fiorito Luca

    2017-01-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay heat. Radioactive decay data, independent fission yield and cross section uncertainties/covariances were propagated using two nuclear data sampling codes, namely NUDUNA and SANDY. According to the results, 238U cross sections and fission yield data are the largest contributors to the MYRRHA decay heat uncertainty. The calculated uncertainty values are deemed acceptable from the safety point of view as they are well within the available regulatory limits.

  19. Some reflections on uncertainty analysis and management

    International Nuclear Information System (INIS)

    Aven, Terje

    2010-01-01

    A guide to quantitative uncertainty analysis and management in industry has recently been issued. The guide provides an overall framework for uncertainty modelling and characterisations, using probabilities but also other uncertainty representations (including the Dempster-Shafer theory). A number of practical applications showing how to use the framework are presented. The guide is considered as an important contribution to the field, but there is a potential for improvements. These relate mainly to the scientific basis and clarification of critical issues, for example, concerning the meaning of a probability and the concept of model uncertainty. A reformulation of the framework is suggested using probabilities as the only representation of uncertainty. Several simple examples are included to motivate and explain the basic ideas of the modified framework.

  20. Habitable zone dependence on stellar parameter uncertainties

    International Nuclear Information System (INIS)

    Kane, Stephen R.

    2014-01-01

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the uncertainties of HZ boundaries on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.

  1. Habitable zone dependence on stellar parameter uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kane, Stephen R., E-mail: skane@sfsu.edu [Department of Physics and Astronomy, San Francisco State University, 1600 Holloway Avenue, San Francisco, CA 94132 (United States)

    2014-02-20

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the uncertainties of HZ boundaries on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.
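
    The leading-order dependence of an HZ boundary on stellar luminosity can be sketched as d = sqrt(L/S_eff), which gives u_d/d = u_L/(2L). This is a simplified illustration: the effective-flux coefficient S_eff itself depends on the (uncertain) stellar effective temperature, which the sketch ignores.

```python
import math

def hz_distance(lum, u_lum, s_eff):
    """Habitable-zone boundary distance d = sqrt(L / S_eff) in AU, for
    luminosity L in solar units and effective stellar flux S_eff at the
    boundary (in units of the flux received by Earth). First-order
    propagation of the luminosity uncertainty: u_d / d = 0.5 * u_L / L."""
    d = math.sqrt(lum / s_eff)
    u_d = 0.5 * d * (u_lum / lum)
    return d, u_d
```

    For a solar twin with a 10% luminosity uncertainty, the boundary distance is uncertain by 5%, which can be enough to move a borderline planet in or out of the nominal HZ.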

  2. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
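
    The flavor of such interval statistics can be illustrated with the interval mean, whose tightest enclosure is simply the interval of the bound means (a minimal sketch; bounding, e.g., the variance from above over interval data is in general NP-hard, which is why the report discusses computability as a function of interval characteristics):

```python
def interval_mean(intervals):
    """Mean of a data set whose measurements are intervals (lo, hi):
    the tightest enclosure is [mean of lower bounds, mean of upper bounds]."""
    n = len(intervals)
    lo = sum(a for a, _ in intervals) / n
    hi = sum(b for _, b in intervals) / n
    return lo, hi
```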

  3. Statistical uncertainties and unrecognized relationships

    International Nuclear Information System (INIS)

    Rankin, J.P.

    1985-01-01

    Hidden relationships in specific designs directly contribute to inaccuracies in reliability assessments. Uncertainty factors at the system level may sometimes be applied in attempts to compensate for the impact of such unrecognized relationships. Often uncertainty bands are used to relegate unknowns to a miscellaneous category of low-probability occurrences. However, experience and modern analytical methods indicate that perhaps the dominant, most probable and significant events are sometimes overlooked in statistical reliability assurances. The author discusses the utility of two unique methods of identifying the otherwise often unforeseeable system interdependencies for statistical evaluations. These methods are sneak circuit analysis and a checklist form of common cause failure analysis. Unless these techniques (or a suitable equivalent) are also employed along with the more widely-known assurance tools, high reliability of complex systems may not be adequately assured. This concern is indicated by specific illustrations. 8 references, 5 figures

  4. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    International Nuclear Information System (INIS)

    Elert, M.

    1996-09-01

    zone concentration. Models considering faster net downward flow in the upper part of the root zone predict a more rapid decline in root zone concentration than models that assume a constant infiltration throughout the soil column. A sensitivity analysis performed on two of the models shows that the important parameters are the effective precipitation, the root water uptake and the soil K_d values. For the advection-dispersion model, the dispersion length is also important for the maximum flux to the groundwater. The amount of dispersion in radionuclide transport is of importance for the release to groundwater. For the box models, an inherent dispersion is obtained by the assumption of instantaneous mixing in the boxes. The degree of dispersion in the calculation will be a function of the size of the boxes. It is therefore important that the division of the soil column is made with care in order to obtain the intended values. For many models the uncertainty calculations give very skewed distributions for the flux to the groundwater. In some cases the mean of the stochastic calculation can be several orders of magnitude higher than the value from the deterministic calculations. In relation to the objectives set up for this study it can be concluded that: The analysis of the relationship between uncertainty and model complexity proved to be a difficult task. For the studied scenario, the uncertainty in the model predictions does not have a simple relationship with the complexity of the models used. However, a complete analysis could not be performed, since uncertainty results were not available for the full range of models, and furthermore the uncertainty analyses were not always carried out in a consistent way. The predicted uncertainty associated with the concentration in the root zone does not show very much variation between the modelling approaches. 
    For the predictions of the flux to groundwater, the simple models and the more complex gave very different results for the

  5. On uncertainty quantification in hydrogeology and hydrogeophysics

    Science.gov (United States)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.
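
    The multi-resolution (multi-level Monte Carlo) idea mentioned above can be caricatured with two levels: many cheap coarse-model samples plus a few expensive corrections on shared random inputs. This is a toy sketch with invented model functions, not a full MLMC implementation with level-dependent sample allocation:

```python
import random

def two_level_mc(f_coarse, f_fine, n_coarse, n_fine, rng):
    """Two-level Monte Carlo estimator of E[f_fine]:
    E[f_fine] = E[f_coarse] + E[f_fine - f_coarse].
    The cheap coarse model is sampled often; the expensive correction term,
    evaluated on the *same* random input for both models so its variance is
    small, is sampled rarely."""
    coarse = sum(f_coarse(rng.random()) for _ in range(n_coarse)) / n_coarse
    corr = 0.0
    for _ in range(n_fine):
        x = rng.random()          # same input drives both fidelity levels
        corr += f_fine(x) - f_coarse(x)
    return coarse + corr / n_fine
```

    The variance reduction comes entirely from the coupling: when fine and coarse models are strongly correlated, the correction term needs far fewer samples than a direct fine-model estimate.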

  6. Optimisation of decision making under uncertainty throughout field lifetime: A fractured reservoir example

    Science.gov (United States)

    Arnold, Dan; Demyanov, Vasily; Christie, Mike; Bakay, Alexander; Gopa, Konstantin

    2016-10-01

    Assessing the change in uncertainty in reservoir production forecasts over field lifetime is rarely undertaken because of the complexity of joining together the individual workflows. This becomes particularly important in complex fields such as naturally fractured reservoirs. The impact of this problem has been identified in previous studies, and many solutions have been proposed but never implemented on complex reservoir problems due to the computational cost of quantifying uncertainty and optimising the reservoir development, specifically knowing how many and what kind of simulations to run. This paper demonstrates a workflow that propagates uncertainty throughout field lifetime, and into the decision making process, by a combination of a metric-based approach, multi-objective optimisation and Bayesian estimation of uncertainty. The workflow propagates uncertainty estimates from appraisal into initial development optimisation, then updates uncertainty through history matching and finally propagates it into late-life optimisation. The combination of techniques applied, namely the metric approach and multi-objective optimisation, helps evaluate development options under uncertainty. This was achieved with a significantly reduced number of flow simulations, such that the combined workflow is computationally feasible to run for a real-field problem. This workflow is applied to two synthetic naturally fractured reservoir (NFR) case studies in appraisal, field development, history matching and mid-life EOR stages. The first is a simple sector model, while the second is a more complex full-field example based on a real-life analogue. This study infers geological uncertainty from an ensemble of models based on a Brazilian carbonate outcrop, which is propagated through the field lifetime, before and after the start of production, with the inclusion of production data significantly collapsing the spread of P10-P90 in reservoir forecasts. The workflow links uncertainty

  7. Using a Meniscus to Teach Uncertainty in Measurement

    Science.gov (United States)

    Backman, Philip

    2008-01-01

    I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know "something" about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is…

  8. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show that it is important to consider the complete lidar measurement process when defining the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.


  9. Quantification of uncertainties of modeling and simulation

    International Nuclear Information System (INIS)

    Ma Zhibo; Yin Jianwei

    2012-01-01

    The principles of Modeling and Simulation (M and S) are interpreted through a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts considered to vary with the conceptual models' parameters. Following the idea of verification and validation, the parameter space is partitioned into verified and applied domains; uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolation method. Examples are presented to demonstrate and qualify the ideas, the aim being to build a framework for quantifying the uncertainties of M and S. (authors)

  10. Validation of Fuel Performance Uncertainty for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Nam-Gyu; Yoo, Jong-Sung; Jung, Yil-Sup [KEPCO Nuclear Fuel Co., Daejeon (Korea, Republic of)

    2016-10-15

    To achieve this, the computer code performance has to be validated based on the experimental results. For the uncertainty quantification, important uncertainty parameters need to be selected, and the combined uncertainty has to be evaluated with an acceptable statistical treatment. Important uncertainty parameters for the rod performance, such as fuel enthalpy, fission gas release, cladding hoop strain, etc., were chosen through rigorous sensitivity studies, and their validity has been assessed by utilizing the experimental results from tests in CABRI and NSRR. Analysis results revealed that several tested rods were not bounded within the combined fuel performance uncertainty. An assessment of fuel performance with an extended fuel power uncertainty on rods tested in NSRR and CABRI has therefore been done. These results again showed that several tested rods were not bounded within the calculated fuel performance uncertainty. This implies that the currently considered uncertainty range of the parameters is not enough to cover the fuel performance sufficiently.

  11. Uncertainty and sensitivity analysis on probabilistic safety assessment of an experimental facility

    International Nuclear Information System (INIS)

    Burgazzi, L.

    2000-01-01

    The aim of this work is to perform an uncertainty and sensitivity analysis on the probabilistic safety assessment of the International Fusion Materials Irradiation Facility (IFMIF), in order to assess the effect on the final risk values of the uncertainties associated with the generic data used for the initiating events and component reliability, and to identify the key quantities contributing to this uncertainty. The analysis is conducted on the expected frequencies calculated for the accident sequences defined through event tree (ET) modeling. The purpose is to lend credibility to the ET model quantification, to calculate frequency distributions for the occurrence of events and, consequently, to assess whether the sequences have been correctly selected from a probability standpoint, and finally to verify that the safety conditions are fulfilled. The uncertainty and sensitivity analyses are performed using Monte Carlo sampling and an importance-parameter technique, respectively. (author)
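The Monte Carlo propagation of reliability-data uncertainty through an event-tree sequence, as described above, can be sketched as follows. This is a minimal, purely illustrative example and not the IFMIF model: the sequence structure, medians and error factors are invented, and lognormal distributions are merely a common choice for reliability data.

```python
import math
import random
import statistics

random.seed(42)

def sequence_frequency(ie_freq, p_fail_a, p_fail_b):
    # Accident-sequence frequency = initiating-event frequency
    # times the failure probabilities of the mitigating systems.
    return ie_freq * p_fail_a * p_fail_b

def lognormal(median, error_factor):
    # Error factor EF = 95th percentile / median, so sigma = ln(EF) / 1.645.
    sigma = math.log(error_factor) / 1.645
    return random.lognormvariate(math.log(median), sigma)

# Sample the generic data and propagate to the sequence frequency.
samples = sorted(
    sequence_frequency(lognormal(1e-2, 3), lognormal(1e-3, 10), lognormal(5e-2, 5))
    for _ in range(20_000)
)
mean = statistics.fmean(samples)
p95 = samples[int(0.95 * len(samples))]
print(f"mean frequency  : {mean:.2e} /yr")
print(f"95th percentile : {p95:.2e} /yr")
```

Because the sampled distributions are skewed, the mean sits well above the median, which is exactly the kind of information a point estimate of the sequence frequency hides.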

  12. Uncertainty analysis with a view towards applications in accident consequence assessments

    International Nuclear Information System (INIS)

    Fischer, F.; Erhardt, J.

    1985-09-01

    Since the publication of the US Reactor Safety Study WASH-1400 there has been increasing interest in developing and applying methods to quantify the uncertainty inherent in probabilistic risk assessments (PRAs) and accident consequence assessments (ACAs) for installations of the nuclear fuel cycle. Research and development in this area is driven by the fact that PRA and ACA are increasingly used for comparative, decision-making and fact-finding studies initiated by industry and regulatory commissions. This report summarizes and reviews some of the main methods and offers guidance on performing sensitivity and uncertainty analyses. Some first investigations aimed at applying the methods mentioned above to a submodel of the ACA code UFOMOD (KfK) are presented. Sensitivity analyses and some uncertainty studies of an important submodel of UFOMOD are carried out to identify the relevant parameters for subsequent uncertainty calculations. (orig./HP) [de]

  13. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) of the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). However, that study only examined a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  14. Parameter sensitivity and uncertainty of the forest carbon flux model FORUG : a Monte Carlo analysis

    Energy Technology Data Exchange (ETDEWEB)

    Verbeeck, H.; Samson, R.; Lemeur, R. [Ghent Univ., Ghent (Belgium). Laboratory of Plant Ecology; Verdonck, F. [Ghent Univ., Ghent (Belgium). Dept. of Applied Mathematics, Biometrics and Process Control

    2006-06-15

    The FORUG model is a multi-layer process-based model that simulates carbon dioxide (CO{sub 2}) and water exchange between forest stands and the atmosphere. The main model outputs are net ecosystem exchange (NEE), total ecosystem respiration (TER), gross primary production (GPP) and evapotranspiration. This study used a sensitivity analysis to identify the parameters contributing to NEE uncertainty in the FORUG model. The aim was to determine whether it is necessary to estimate the uncertainty of all parameters of a model in order to determine overall output uncertainty. The data used were meteorological and flux data for beech trees at Hesse. The Monte Carlo method, in combination with multiple linear regression, was used to rank parameters by sensitivity and uncertainty. Simulations were run in which parameters were assigned probability distributions, the effect of parameter variance on the output distribution was assessed, and the uncertainty of the NEE output was estimated. Based on an arbitrary uncertainty assigned to 10 key parameters, a standard deviation of 0.88 Mg C per year was found for NEE, equal to 24 per cent of its mean value. The sensitivity analysis showed that the overall output uncertainty of the FORUG model could be determined by accounting for only a few key parameters, which corresponded to critical parameters identified in the literature; the 10 most important parameters determined more than 90 per cent of the output uncertainty. High-ranking parameters were related to soil respiration, photosynthesis and crown architecture. It was concluded that the Monte Carlo technique is a useful tool for ranking the uncertainty of parameters of process-based forest flux models. 48 refs., 2 tabs., 2 figs.
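The ranking approach described above, Monte Carlo sampling combined with a regression on the sampled inputs, can be sketched with a toy model. The model, parameter names and distributions below are invented stand-ins, not the FORUG equations; the point is only the mechanics of the ranking.

```python
import random
from statistics import fmean, pstdev

random.seed(1)

# Toy stand-in for a process model: the output depends strongly on p1,
# weakly on p2, and not at all on p3 (names are illustrative only).
def model(p1, p2, p3):
    return 4.0 * p1 + 0.5 * p2 + 0.0 * p3 + random.gauss(0, 0.1)

n = 5_000
p1 = [random.gauss(1.0, 0.1) for _ in range(n)]
p2 = [random.gauss(2.0, 0.2) for _ in range(n)]
p3 = [random.gauss(0.5, 0.05) for _ in range(n)]
y = [model(a, b, c) for a, b, c in zip(p1, p2, p3)]

def pearson(xs, ys):
    mx, my = fmean(xs), fmean(ys)
    cov = fmean([(a - mx) * (b - my) for a, b in zip(xs, ys)])
    return cov / (pstdev(xs) * pstdev(ys))

# For independent inputs and a near-linear model, the standardised
# regression coefficient of each input equals its correlation with y,
# and its square approximates the share of output variance explained.
src = {name: pearson(xs, y) for name, xs in (("p1", p1), ("p2", p2), ("p3", p3))}
ranking = sorted(src, key=lambda k: abs(src[k]), reverse=True)
print("importance ranking:", ranking)
```

Squaring and summing the coefficients indicates how much of the output variance a few top-ranked parameters already capture, which is the study's criterion for restricting attention to key parameters.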

  15. Uncertainty identification for robust control using a nuclear power plant model

    International Nuclear Information System (INIS)

    Power, M.; Edwards, R.M.

    1995-01-01

    An on-line technique that identifies the uncertainty between a lower-order and a higher-order nuclear power plant model is presented. The uncertainty identifier produces a hard upper bound, in the H∞ sense, on the additive uncertainty. This additive uncertainty description can be used for the design of H∞ or μ-synthesis controllers.

  16. Uncertainties and climatic change

    International Nuclear Information System (INIS)

    De Gier, A.M.; Opschoor, J.B.; Van de Donk, W.B.H.J.; Hooimeijer, P.; Jepma, J.; Lelieveld, J.; Oerlemans, J.; Petersen, A.

    2008-01-01

    Which processes in the climate system are poorly understood? How are scientists dealing with uncertainty about climate change? What will be done with the conclusions of the recently published IPCC synthesis report? These and other questions were addressed during the meeting 'Uncertainties and climate change', held on Monday 26 November 2007 at the KNAW in Amsterdam. This report is a compilation of all the presentations and provides some conclusions drawn from the discussions during the meeting. [mk] [nl]

  17. Mechanics and uncertainty

    CERN Document Server

    Lemaire, Maurice

    2014-01-01

    Science is a quest for certainty, but lack of certainty is the driving force behind all of its endeavors. This book, specifically, examines the uncertainty of technological and industrial science. Uncertainty and Mechanics studies the concepts of mechanical design in an uncertain setting and explains engineering techniques for inventing cost-effective products. Though it references practical applications, this is a book about ideas and potential advances in mechanical science.

  18. Uncertainty: lotteries and risk

    OpenAIRE

    Ávalos, Eloy

    2011-01-01

    In this paper we develop the theory of choice under uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of a lottery to formulate the axioms of the individual's preferences, and their representation through a von Neumann-Morgenstern utility function. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and, finally, measures of risk aversion with monetary lotteries.

  19. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross-section sensitivity/uncertainty analysis performed on a complicated 2D model of the NET shielding blanket design inside the ITER torus, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. To check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat, at which position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared with an uncertainty of only 5% in the gap geometry. It is therefore essential to take the exact geometry into account in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  20. Radon measurements: the sources of uncertainties

    International Nuclear Information System (INIS)

    Zhukovsky, Michael; Onischenko, Alexandra; Bastrikov, Vladislav

    2008-01-01

    Full text: Radon measurement is quite a complicated process, and the correct estimation of uncertainties is very important. The sources of uncertainty for grab sampling, short-term measurements (charcoal canisters), long-term measurements (track detectors) and retrospective measurements (surface traps) are analyzed. The main sources of uncertainty in grab-sampling measurements are: systematic bias of the reference equipment; random Poisson and non-Poisson errors during calibration; and random Poisson and non-Poisson errors during measurement. These sources are also common to short-term measurements (charcoal canisters) and long-term measurements (track detectors). Calibrations are usually performed at high radon concentrations (1-5 kBq/m3), where the Poisson random error rarely exceeds a few per cent. Nevertheless, the dispersion of measured values, even during calibration, usually exceeds the Poisson dispersion expected on the basis of counting statistics. The origins of such non-Poisson random errors during calibration differ between kinds of instrumental measurement, and at present not all sources of non-Poisson random error have been reliably identified. The initial calibration uncertainty of working devices rarely exceeds 20%. Real radon concentrations are usually in the range from some tens to some hundreds of becquerels per cubic metre, and at low radon levels the Poisson random error can reach 20%. The random non-Poisson errors and residual systematic biases depend on the measurement technique and the environmental conditions during the measurements. For charcoal canisters there are additional sources of measurement error due to the influence of air humidity and variations of the radon concentration during the canister exposure. The accuracy of long-term measurements by track detectors will depend on the quality of the chemical etching after exposure and the influence of seasonal radon variations. The main sources of
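The Poisson counting error cited above follows directly from counting statistics: for N registered counts the standard deviation is sqrt(N), so the relative 1-sigma uncertainty is 1/sqrt(N). The count totals below are illustrative, not taken from the abstract.

```python
import math

def poisson_relative_uncertainty(counts):
    # For a Poisson counting process, sigma = sqrt(N),
    # so the relative (1-sigma) uncertainty is 1 / sqrt(N).
    return 1.0 / math.sqrt(counts)

# Illustrative numbers: a calibration at high radon concentration
# accumulates many counts; a low-level indoor survey accumulates few.
for n in (10_000, 100, 25):
    print(f"N = {n:6d} -> {100 * poisson_relative_uncertainty(n):.1f} %")
```

At 25 counts the relative Poisson error is already 20%, which matches the abstract's observation for low-level measurements; any non-Poisson dispersion comes on top of this floor.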

  1. Effect of Uncertainty Parameters in Blowdown and Reflood Models for OPR1000 LBLOCA Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Byung Gil; Jin, Chang Yong; Seul, Kwangwon; Hwang, Taesuk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-05-15

    KINS (Korea Institute of Nuclear Safety) has also performed the audit calculation with the KINS Realistic Evaluation Methodology (KINS-REM) to confirm the validity of the licensee's calculation. In the BEPU method it is very important to quantify the code and model uncertainty, as stated in the requirement for BE calculations in Regulatory Guide 1.157: 'the code and models used are acceptable and applicable to the specific facility over the intended operating range and must quantify the uncertainty in the specific application'. In general, the uncertainty of a model or code should be obtained through comparison with data from relevant integral- and separate-effect tests at different scales. However, this kind of uncertainty is not easy to determine, because it is difficult to evaluate the various experiments accurately. Expert judgment has therefore been used in many cases, despite the limitation that the resulting uncertainty ranges of important parameters can be wide and inaccurate. In the KINS-REM, six heat transfer parameters in the blowdown phase have been used to consider model uncertainty. Recently, the MARS-KS code was modified to consider the uncertainty of five heat transfer parameters in the reflood phase. Accordingly, the uncertainty ranges for the parameters of the reflood models must be determined and their effect evaluated. In this study, a large break LOCA (LBLOCA) analysis for OPR1000 was performed to identify the effect of the uncertainty parameters in the blowdown and reflood models.

  2. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Elert, M. [Kemakta Konsult AB, Stockholm (Sweden)] [ed.]

    1996-09-01

    zone concentration. Models considering faster net downward flow in the upper part of the root zone predict a more rapid decline in root zone concentration than models that assume a constant infiltration throughout the soil column. A sensitivity analysis performed on two of the models shows that the important parameters are the effective precipitation, the root water uptake and the soil K{sub d}-values. For the advection-dispersion model, the dispersion length is also important for the maximum flux to the groundwater. The amount of dispersion in radionuclide transport is of importance for the release to groundwater. For the box models, an inherent dispersion is obtained through the assumption of instantaneous mixing in the boxes; the degree of dispersion in the calculation will then be a function of the size of the boxes. It is therefore important that the division of the soil column be made with care, in order to obtain the intended values. For many models the uncertainty calculations give very skewed distributions for the flux to the groundwater; in some cases the mean of the stochastic calculation can be several orders of magnitude higher than the value from the deterministic calculation. In relation to the objectives set up for this study it can be concluded that the analysis of the relationship between uncertainty and model complexity proved to be a difficult task. For the studied scenario, the uncertainty in the model predictions does not have a simple relationship with the complexity of the models used. However, a complete analysis could not be performed, since uncertainty results were not available for the full range of models and, furthermore, the uncertainty analyses were not always carried out in a consistent way. The predicted uncertainty associated with the concentration in the root zone does not show very much variation between the modelling approaches. 
For the predictions of the flux to groundwater, the simple and the more complex models gave very different results for

  3. Uncertainties propagation and global sensitivity analysis of the frequency response function of piezoelectric energy harvesters

    Science.gov (United States)

    Ruiz, Rafael O.; Meruane, Viviana

    2017-06-01

    The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions: the prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, and its implementation is illustrated using different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties resulting from imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to compute the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the parameters most relevant to the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool for the robust design and prediction of PEH performance.
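A first-order Sobol index, as used in the abstract above, measures the share of output variance explained by one input alone: S_i = Var(E[Y|X_i]) / Var(Y). The sketch below estimates it by slicing each input into bins; the "amplitude" function and its parameters are invented stand-ins for an FRF evaluation, not the authors' PEH model.

```python
import random
import statistics

random.seed(7)

# Toy response amplitude: sensitive to an elastic modulus E and a layer
# thickness t, insensitive to a nuisance parameter c (illustrative only).
def amplitude(E, t, c):
    return E * t**2 + 0.01 * c

n = 20_000
X = [(random.uniform(0.9, 1.1), random.uniform(0.9, 1.1), random.uniform(0.0, 1.0))
     for _ in range(n)]
Y = [amplitude(*x) for x in X]
varY = statistics.pvariance(Y)

def first_order_index(i, bins=40):
    # S_i = Var(E[Y | X_i]) / Var(Y), estimated by binning X_i and
    # taking the variance of the per-bin means of Y.
    lo = min(x[i] for x in X)
    hi = max(x[i] for x in X)
    width = (hi - lo) / bins or 1.0
    sums = [[0.0, 0] for _ in range(bins)]
    for x, y in zip(X, Y):
        b = min(int((x[i] - lo) / width), bins - 1)
        sums[b][0] += y
        sums[b][1] += 1
    means = [s / c for s, c in sums if c]
    return statistics.pvariance(means) / varY

for name, i in (("E", 0), ("t", 1), ("c", 2)):
    print(f"S_{name} ~ {first_order_index(i):.2f}")
```

Because the toy output varies quadratically with t, the thickness index dominates, echoing the abstract's finding that geometric and material parameters of the piezoelectric layer drive the output variability.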

  4. Uncertainties in s-process nucleosynthesis in massive stars determined by Monte Carlo variations

    Science.gov (United States)

    Nishimura, N.; Hirschi, R.; Rauscher, T.; St. J. Murphy, A.; Cescutti, G.

    2017-08-01

    The s-process in massive stars produces the weak component of the s-process (nuclei up to A ˜ 90), in amounts that match solar abundances. For heavier isotopes, such as barium, production through neutron capture is significantly enhanced in very metal-poor stars with fast rotation. However, detailed theoretical predictions for the resulting final s-process abundances have important uncertainties, caused both by the underlying uncertainties in the nuclear physics (principally neutron-capture reaction and β-decay rates) and by the stellar evolution modelling. In this work, we investigated the impact of nuclear-physics uncertainties relevant to the s-process in massive stars. Using a Monte Carlo based approach, we performed extensive nuclear reaction network calculations that include newly evaluated upper and lower limits for the individual temperature-dependent reaction rates. We found that most of the uncertainty in the final abundances is caused by uncertainties in the neutron-capture rates, while β-decay rate uncertainties affect only a few nuclei near s-process branchings. The s-process in rotating metal-poor stars shows quantitatively different uncertainties and key reactions, although the qualitative characteristics are similar. We confirmed that our results do not significantly change at different metallicities for fast rotating massive stars in the very low metallicity regime. We highlight which of the identified key reactions are realistic candidates for improved measurement by future experiments.
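The competition at an s-process branching point mentioned above can be sketched with a Monte Carlo variation of two rates between assumed limits. The central rates and uncertainty factors below are invented for illustration; a real study varies every rate in a full reaction network, not a single analytic branching ratio.

```python
import math
import random

random.seed(3)

# At a branching point, the fraction of the flow that proceeds by neutron
# capture rather than beta decay is f = lambda_ng / (lambda_ng + lambda_beta).
LAM_NG, F_NG = 2.0e-9, 2.0      # capture rate, uncertain within a factor of 2
LAM_BETA, F_BETA = 5.0e-9, 1.3  # beta-decay rate, uncertain within 30 %

def sample_rate(central, factor):
    # Log-uniform draw between the lower and upper limits, mirroring a
    # Monte Carlo variation of an evaluated rate within its uncertainty band.
    return central * math.exp(random.uniform(-math.log(factor), math.log(factor)))

def capture_fraction():
    ng = sample_rate(LAM_NG, F_NG)
    beta = sample_rate(LAM_BETA, F_BETA)
    return ng / (ng + beta)

fracs = sorted(capture_fraction() for _ in range(10_000))
lo, med, hi = fracs[500], fracs[5_000], fracs[9_500]
print(f"capture branching: median {med:.2f}, 90% interval [{lo:.2f}, {hi:.2f}]")
```

Widening the capture-rate band broadens the interval far more than the beta-decay band does here, which illustrates why the neutron-capture rates dominate the final-abundance uncertainty in such analyses.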

  5. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    Science.gov (United States)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1 when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties, while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are the model outputs of interest. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standard Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters most influence the uncertainty in the EQI predictions. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty while increasing both their economic cost and its variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μ(A)) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. η(g) (anoxic growth rate correction factor) and η(h) (anoxic hydrolysis rate correction factor), becomes less important when a S(NO) controller manipulating an external carbon source addition is implemented.

  6. Differentiating intolerance of uncertainty from three related but distinct constructs.

    Science.gov (United States)

    Rosen, Natalie O; Ivanova, Elena; Knäuper, Bärbel

    2014-01-01

    Individual differences in uncertainty have been associated with heightened anxiety, stress and approach-oriented coping. Intolerance of uncertainty (IU) is a trait characteristic that arises from negative beliefs about uncertainty and its consequences. Researchers have established the central role of IU in the development of problematic worry and maladaptive coping, highlighting the importance of this construct to anxiety disorders. However, there is a need to improve our understanding of the phenomenology of IU. The goal of this paper was to present hypotheses regarding the similarities and differences between IU and three related constructs--intolerance of ambiguity, uncertainty orientation, and need for cognitive closure--and to call for future empirical studies to substantiate these hypotheses. To assist with achieving this goal, we conducted a systematic review of the literature, which also served to identify current gaps in knowledge. This paper differentiates these constructs by outlining each definition and general approaches to assessment, reviewing the existing empirical relations, and proposing theoretical similarities and distinctions. Findings may assist researchers in selecting the appropriate construct to address their research questions. Future research directions for the application of these constructs, particularly within the field of clinical and health psychology, are discussed.

  7. Justification for recommended uncertainties

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

    The uncertainties obtained in an earlier standards evaluation were considered unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: underestimation of the correlations existing between the results of different measurements; the presence of unrecognized systematic uncertainties in the experimental data, which can lead to biases in the evaluated data as well as to underestimation of the resulting uncertainties; and the fact that uncertainties for correlated data cannot be characterized by percentage uncertainties or variances alone. Covariances between the evaluated value at 0.2 MeV and other points, obtained in model (RAC R-matrix and PADE2 analytical expansion) and non-model (GMA) fits of the 6 Li(n,t) TEST1 data, are presented together with the correlation coefficients, and covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix), as obtained in EDA and RAC R-matrix fits of the data available for reactions that pass through the formation of the 7 Li system, are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: percentage uncertainties of the evaluated cross section for the 6 Li(n,t) reaction and for the 235 U(n,f) reaction; the estimation given by CSEWG experts; the GMA result with the full GMA database, including experimental data for the 6 Li(n,t), 6 Li(n,n) and 6 Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; and the EDA and RAC R-matrix results, respectively. Uncertainties of absolute and 252 Cf fission spectrum averaged cross section measurements, and deviations between measured and evaluated values for 235 U(n,f) cross-sections in the neutron energy range 1

  8. Practical Policy Applications of Uncertainty Analysis for National Greenhouse Gas Inventories

    Energy Technology Data Exchange (ETDEWEB)

    Gillenwater, M. [Environmental Resources Trust (United States)], E-mail: mgillenwater@ert.net; Sussman, F.; Cohen, J. [ICF International (United States)

    2007-09-15

    International policy makers and climate researchers use greenhouse gas emissions inventory estimates in a variety of ways. Because of the varied uses of the inventory data, as well as the high uncertainty surrounding some of the source category estimates, considerable effort has been devoted to understanding the causes and magnitude of uncertainty in national emissions inventories. In this paper, we focus on two aspects of the rationale for quantifying uncertainty: (1) the possible uses of the quantified uncertainty estimates for policy (e.g., as a means of adjusting inventories used to determine compliance with international commitments); and (2) the direct benefits of the process of investigating uncertainties in terms of improving inventory quality. We find that there are particular characteristics that an inventory uncertainty estimate should have if it is to be used for policy purposes: (1) it should be comparable across countries; (2) it should be relatively objective, or at least subject to review and verification; (3) it should not be subject to gaming by countries acting in their own self-interest; (4) it should be administratively feasible to estimate and use; (5) the quality of the uncertainty estimate should be high enough to warrant the additional compliance costs that its use in an adjustment factor may impose on countries; and (6) it should attempt to address all types of inventory uncertainty. Currently, inventory uncertainty estimates for national greenhouse gas inventories do not have these characteristics. For example, the information used to develop quantitative uncertainty estimates for national inventories is often based on expert judgments, which are, by definition, subjective rather than objective, and therefore difficult to review and compare. Further, the practical design of a potential factor to adjust inventory estimates using uncertainty estimates would require policy makers to (1) identify clear environmental goals; (2) define these

  9. Practical Policy Applications of Uncertainty Analysis for National Greenhouse Gas Inventories

    International Nuclear Information System (INIS)

    Gillenwater, M.; Sussman, F.; Cohen, J.

    2007-01-01

    International policy makers and climate researchers use greenhouse gas emissions inventory estimates in a variety of ways. Because of the varied uses of the inventory data, as well as the high uncertainty surrounding some of the source category estimates, considerable effort has been devoted to understanding the causes and magnitude of uncertainty in national emissions inventories. In this paper, we focus on two aspects of the rationale for quantifying uncertainty: (1) the possible uses of the quantified uncertainty estimates for policy (e.g., as a means of adjusting inventories used to determine compliance with international commitments); and (2) the direct benefits of the process of investigating uncertainties in terms of improving inventory quality. We find that there are particular characteristics that an inventory uncertainty estimate should have if it is to be used for policy purposes: (1) it should be comparable across countries; (2) it should be relatively objective, or at least subject to review and verification; (3) it should not be subject to gaming by countries acting in their own self-interest; (4) it should be administratively feasible to estimate and use; (5) the quality of the uncertainty estimate should be high enough to warrant the additional compliance costs that its use in an adjustment factor may impose on countries; and (6) it should attempt to address all types of inventory uncertainty. Currently, inventory uncertainty estimates for national greenhouse gas inventories do not have these characteristics. For example, the information used to develop quantitative uncertainty estimates for national inventories is often based on expert judgments, which are, by definition, subjective rather than objective, and therefore difficult to review and compare. Further, the practical design of a potential factor to adjust inventory estimates using uncertainty estimates would require policy makers to (1) identify clear environmental goals; (2) define these

  10. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method for the quantitative evaluation of modelling uncertainties. The method is applied to a small example case, and ideas for application areas for the proposed method are discussed.

  11. Rapid research and implementation priority setting for wound care uncertainties.

    Directory of Open Access Journals (Sweden)

    Trish A Gray

    nurses, seven podiatrists and six managers. Participants had been qualified for a mean of 20.7 years with a mean of 16.8 years of wound care experience. One hundred and thirty-nine uncertainties were submitted electronically and a further 20 were identified on the day of the workshop following lively, interactive group discussions. Twenty-five uncertainties from the total of 159 generated made it to the final prioritised list. These included six of the 20 new uncertainties. The uncertainties varied in focus, but could be broadly categorised into three themes: service delivery and organisation, patient centred care and treatment options. Specialist nurses were more likely to vote for service delivery and organisation topics, podiatrists for patient centred topics, district nurses for treatment options and operational leads for a broad range.This collaborative priority setting project is the first to engage front-line clinicians in prioritising research and implementation topics in wound care. We have shown that it is feasible to conduct topic prioritisation in a short time frame. This project has demonstrated that with careful planning and rigor, important questions that are raised in the course of clinicians' daily decision making can be translated into meaningful research and implementation initiatives that could make a difference to service delivery and patient care.

  12. Rapid research and implementation priority setting for wound care uncertainties

    Science.gov (United States)

    Dumville, Jo C.; Christie, Janice; Cullum, Nicky A.

    2017-01-01

    , 10 district nurses, seven podiatrists and six managers. Participants had been qualified for a mean of 20.7 years, with a mean of 16.8 years of wound care experience. One hundred and thirty-nine uncertainties were submitted electronically and a further 20 were identified on the day of the workshop following lively, interactive group discussions. Twenty-five uncertainties from the total of 159 generated made it to the final prioritised list. These included six of the 20 new uncertainties. The uncertainties varied in focus, but could be broadly categorised into three themes: service delivery and organisation, patient-centred care and treatment options. Specialist nurses were more likely to vote for service delivery and organisation topics, podiatrists for patient-centred topics, district nurses for treatment options and operational leads for a broad range. Conclusions: This collaborative priority setting project is the first to engage front-line clinicians in prioritising research and implementation topics in wound care. We have shown that it is feasible to conduct topic prioritisation in a short time frame. This project has demonstrated that, with careful planning and rigour, important questions that are raised in the course of clinicians’ daily decision making can be translated into meaningful research and implementation initiatives that could make a difference to service delivery and patient care. PMID:29206884

  13. Managing project risks and uncertainties

    Directory of Open Access Journals (Sweden)

    Mike Mentis

    2015-01-01

    Full Text Available This article considers threats to a project slipping on budget, schedule and fitness for purpose. Threat is used here as the collective term for risks (quantifiable bad things that can happen) and uncertainties (poorly or not quantifiable bad possible events). Based on experience with projects in developing countries, this review considers that (a) project slippage is due to uncertainties rather than risks, (b) while the eventuation of some bad things is beyond control, managed execution and oversight are still the primary means of keeping within budget, on time and fit-for-purpose, (c) improving project delivery is less about bigger and more complex processes and more about coordinated focus, effectiveness and developing thought-out heuristics, and (d) projects take longer and cost more partly because threat identification is inaccurate, the scope of identified threats is too narrow, and the threat assessment product is not integrated into overall project decision-making and execution. Almost by definition, what is poorly known is likely to cause problems. Yet it is not just the unquantifiability and intangibility of uncertainties that cause project slippage; it is that they are insufficiently taken into account in project planning and execution that causes budget and time overruns. Improving project performance requires purpose-driven and managed deployment of scarce seasoned professionals. This can be aided by independent oversight from deeply experienced panelists who contribute technical insights and can help show that diligence is seen to be done.

  14. Coupled code analysis of uncertainty and sensitivity of Kalinin-3 benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Pasichnyk, Ihor; Zwermann, Winfried; Velkov, Kiril [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany); Nikonov, Sergey [VNIIAES, Moscow (Russian Federation)

    2016-09-15

    An uncertainty and sensitivity analysis is performed for the OECD/NEA coolant transient benchmark (K-3) against measured data from the Kalinin-3 Nuclear Power Plant (NPP). The switch-off of one main coolant pump (MCP) at nominal reactor power is calculated using the coupled thermal-hydraulic and neutron-kinetics code ATHLET-PARCS. The objectives are to study the uncertainty of the total reactor power and to identify its main sources. The GRS uncertainty and sensitivity software package XSUSA is applied to propagate uncertainties in the nuclear data libraries to the full-core coupled transient calculations. A set of the most important thermal-hydraulic parameters of the primary circuit is identified, and a total of 23 thermal-hydraulic parameters are statistically varied using the GRS code SUSA. The ATHLET model also contains a balance-of-plant (BOP) model, which is simulated using the ATHLET GCSM module; in particular, the operation of the main steam generator regulators is modelled in detail. A set of 200 varied coupled ATHLET-PARCS calculations is analyzed. The results show a clustering effect in the behavior of global reactor parameters. It is found that the GCSM system, together with the varied input parameters, strongly influences the overall nuclear power plant behavior and can even lead to a new scenario. Possible reasons for the clustering effect are discussed in the paper. This work is a step forward in establishing a "best-estimate calculations in combination with uncertainty analysis" methodology for coupled full-core calculations.
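
A minimal sketch of the statistical-variation idea behind such analyses: vary input parameters across repeated runs and look for clustering in a global output. The toy "model", its regime-switch threshold (standing in for a control-system action like the GCSM regulators) and all numbers below are invented; the real analysis uses ATHLET-PARCS with XSUSA/SUSA and is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for one coupled run: final relative power depends on the
# summed parameter perturbation; a regulator switches the plant into a
# lower-power regime once a threshold is crossed, producing two clusters.
def final_power(params):
    load = params.sum()
    return 0.75 + 0.01 * load if load < 0.0 else 0.55 + 0.01 * load

n_runs, n_params = 200, 23          # mirrors 200 runs, 23 varied parameters
samples = rng.uniform(-0.1, 0.1, size=(n_runs, n_params))
powers = np.array([final_power(p) for p in samples])

# Minimal 1-D two-means split to detect the two clusters of outcomes.
c = np.array([powers.min(), powers.max()])
for _ in range(50):
    labels = np.abs(powers[:, None] - c[None, :]).argmin(axis=1)
    c = np.array([powers[labels == k].mean() for k in (0, 1)])

gap = abs(c[1] - c[0])              # separation between the two regimes
```

A nonzero gap between cluster centres is the signature of a regime switch rather than a smooth parameter response.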

  15. A review of uncertainty research in impact assessment

    International Nuclear Information System (INIS)

    Leung, Wanda; Noble, Bram; Gunn, Jill; Jaeger, Jochen A.G.

    2015-01-01

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions; the majority of these focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and how to use that information effectively to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  16. A review of uncertainty research in impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Leung, Wanda, E-mail: wanda.leung@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Noble, Bram, E-mail: b.noble@usask.ca [Department of Geography and Planning, School of Environment and Sustainability, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Gunn, Jill, E-mail: jill.gunn@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Jaeger, Jochen A.G., E-mail: jochen.jaeger@concordia.ca [Department of Geography, Planning and Environment, Concordia University, 1455 de Maisonneuve W., Suite 1255, Montreal, Quebec H3G 1M8 (Canada); Loyola Sustainability Research Centre, Concordia University, 7141 Sherbrooke W., AD-502, Montreal, Quebec H4B 1R6 (Canada)

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions; the majority of these focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and how to use that information effectively to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  17. Advancing Uncertainty: Untangling and Discerning Related Concepts

    Directory of Open Access Journals (Sweden)

    Janice Penrod

    2002-12-01

    Full Text Available Methods of advancing concepts within the qualitative paradigm have been developed and articulated. In this section, I describe methodological perspectives of a project designed to advance the concept of uncertainty using multiple qualitative methods. Through a series of earlier studies, the concept of uncertainty arose repeatedly in varied contexts, working its way into prominence, and warranting further investigation. Processes of advanced concept analysis were used to initiate the formal investigation into the meaning of the concept. Through concept analysis, the concept was deconstructed to identify conceptual components and gaps in understanding. Using this skeletal framework of the concept identified through concept analysis, subsequent studies were carried out to add ‘flesh’ to the concept. First, a concept refinement using the literature as data was completed. Findings revealed that the current state of the concept of uncertainty failed to incorporate what was known of the lived experience. Therefore, using interview techniques as the primary data source, a phenomenological study of uncertainty among caregivers was conducted. Incorporating the findings of the phenomenology, the skeletal framework of the concept was further fleshed out using techniques of concept correction to produce a more mature conceptualization of uncertainty. In this section, I describe the flow of this qualitative project investigating the concept of uncertainty, with special emphasis on a particular threat to validity (called conceptual tunnel vision that was identified and addressed during the phases of concept correction. Though in this article I employ a study of uncertainty for illustration, limited substantive findings regarding uncertainty are presented to retain a clear focus on the methodological issues.

  18. Uncertainties in Forecasting Streamflow using Entropy Theory

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, forecasts are always accompanied by uncertainties, which may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory to streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties contributed by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated during these processes.
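
The spectral-plus-entropy combination can be illustrated on a synthetic series: a periodogram identifies the dominant period, and the Shannon entropy of the forecast residuals serves as an uncertainty measure. The series, the assumed-known seasonal mean and amplitude, and the histogram binning below are invented for illustration and are not the paper's method (which extends the autocorrelation function).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly streamflow: annual cycle plus noise (illustration only).
n = 240
t = np.arange(n)
flow = 100 + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, n)

# Spectral analysis: the periodogram peak identifies the dominant period.
spec = np.abs(np.fft.rfft(flow - flow.mean())) ** 2
freqs = np.fft.rfftfreq(n, d=1.0)
dominant_period = 1.0 / freqs[spec.argmax()]

# Entropy as an uncertainty measure: fit the periodic component (mean and
# amplitude assumed known here), histogram the residuals, compute -sum p ln p.
fit = 100 + 30 * np.sin(2 * np.pi * t / dominant_period)
resid = flow - fit
counts, _ = np.histogram(resid, bins=20)
p = counts[counts > 0] / counts.sum()
entropy_nats = -(p * np.log(p)).sum()
```

A narrower residual distribution gives lower entropy, i.e. less forecast uncertainty attributable to the unexplained component.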

  19. Phenomenological uncertainty analysis of early containment failure at severe accident of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Su Won

    2011-02-15

    Severe accidents carry inherently large uncertainty because of the wide range of conditions involved, and experiments, validation and practical application are extremely difficult because of the high temperatures and pressures. Although domestic and international research has been carried out, the references used in Korean nuclear plants were foreign data from the 1980s, and safety analysis such as probabilistic safety assessment has not applied the newest methodology. In addition, the containment pressure used to identify the probability of containment failure in Level 2 PSA is a point value taken from thermal-hydraulic analysis. In this paper, uncertainty analysis methods for the severe-accident phenomena influencing early containment failure were developed, an uncertainty analysis for Korean nuclear plants was performed using the MELCOR code, and the distribution of containment pressure is presented as a result of the uncertainty analysis. Early containment failure was selected among the various containment failure modes because it is an important contributor to the Large Early Release Frequency (LERF), which is used as a representative criterion for decision-making at nuclear power plants. Important phenomena of early containment failure in severe accidents were identified based on previous research, and a seven-step methodology for evaluating uncertainty was developed. A MELCOR input deck for severe-accident analysis reflecting natural circulation flow was developed, and a station blackout, the representative initiating event for early containment failure, was chosen as the accident scenario. By reviewing the internal MELCOR models and correlations relevant to the important phenomena of early containment failure, the factors that could affect the uncertainty were found, and the major factors were identified through sensitivity analysis. In order to determine the total number of MELCOR calculations which can

  20. Chemical model reduction under uncertainty

    KAUST Repository

    Malpica Galassi, Riccardo

    2017-03-06

    A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reactions detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.
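
A hedged sketch of the inclusion-probability construction described above: sample each Arrhenius pre-exponential factor log-uniformly within its uncertainty factor, then estimate the probability that each reaction's importance exceeds a threshold. The reaction set, uncertainty factors and the toy importance measure (rate relative to the fastest reaction) are all invented; the paper's actual importance measure comes from computational singular perturbation analysis, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented nominal pre-exponential factors and uncertainty factors (UF)
# for four toy reactions.
A_nom = np.array([1e13, 3e11, 5e12, 8e10])
UF = np.array([2.0, 5.0, 3.0, 10.0])

n_samples = 5000
u = rng.uniform(-1.0, 1.0, size=(n_samples, A_nom.size))
A = A_nom * UF ** u   # log-uniform draw inside [A_nom/UF, A_nom*UF]

# Toy importance: sampled rate factor relative to the fastest reaction in
# that sample; a reaction is "included" when importance exceeds a threshold.
importance = A / A.max(axis=1, keepdims=True)
threshold = 0.01
inclusion_prob = (importance > threshold).mean(axis=0)
```

Reactions with inclusion probability near 1 stay in every simplified mechanism; intermediate probabilities flag the reactions whose retention depends on where the parameters fall within their uncertainty bands.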

  1. Dealing with exploration uncertainties

    International Nuclear Information System (INIS)

    Capen, E.

    1992-01-01

    Exploration for oil and gas should fulfill the most adventurous in their quest for excitement and surprise. This paper tries to cover that tall order. The authors will touch on the magnitude of the uncertainty (which is far greater than in most other businesses), the effects of not knowing target sizes very well, how to build uncertainty into analyses naturally, how to tie reserves and chance estimates to economics, and how to look at the portfolio effect of an exploration program. With no apologies, the authors will be using a different language for some readers - the language of uncertainty, which means probability and statistics. These tools allow one to combine largely subjective exploration information with the more analytical data from the engineering and economic side
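
The portfolio effect mentioned above can be illustrated with a toy Monte Carlo: each prospect has a chance of success and a lognormal reserve size, and the relative spread (coefficient of variation) of total discovered volume shrinks as independent prospects are added. All parameters below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def portfolio_outcomes(n_prospects, n_trials=20000, p_success=0.3,
                       log_mean=2.0, log_sd=1.0):
    """Total discovered volume over n_prospects independent prospects."""
    hits = rng.random((n_trials, n_prospects)) < p_success   # dry hole or not
    sizes = rng.lognormal(log_mean, log_sd, (n_trials, n_prospects))
    return (hits * sizes).sum(axis=1)

def cv(x):
    """Coefficient of variation: relative spread of outcomes."""
    return x.std() / x.mean()

cv_1 = cv(portfolio_outcomes(1))    # a single prospect: highly uncertain
cv_20 = cv(portfolio_outcomes(20))  # a 20-prospect portfolio: far less so
```

For independent prospects the coefficient of variation falls roughly as 1/√n, which is why an exploration program is judged on the portfolio, not on any single well.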

  2. Evacuation decision-making: process and uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Mileti, D.; Sorensen, J.; Bogard, W.

    1985-09-01

    The purpose was to describe the processes of evacuation decision-making, identify and document uncertainties in that process and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are that all levels of government, including federal agencies, experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided the grounds for liability, although few legal actions have ensued. Finally, it is concluded that if liability for evacuations is assumed by the federal government, the concept of a "precautionary" evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs.

  3. Incorporating uncertainty in predictive species distribution modelling.

    Science.gov (United States)

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates.

  4. Approach to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985, EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.

  5. Approach to uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985, EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.

  6. Uncertainty in artificial intelligence

    CERN Document Server

    Levitt, TS; Lemmer, JF; Shachter, RD

    1990-01-01

    Clearly illustrated in this volume is the current relationship between Uncertainty and AI.It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i

  7. Assessing climate change and socio-economic uncertainties in long term management of water resources

    Science.gov (United States)

    Jahanshahi, Golnaz; Dawson, Richard; Walsh, Claire; Birkinshaw, Stephen; Glenis, Vassilis

    2015-04-01

    Long term management of water resources is challenging for decision makers, given the range of uncertainties that exist. Such uncertainties are a function of long term drivers of change such as climate, environmental loadings, demography, land use and other socio-economic drivers. Impacts of climate change on the frequency of extreme events such as drought make it a serious threat to water resources and water security. The release of probabilistic climate information, such as the UKCP09 scenarios, provides improved understanding of some uncertainties in climate models. This has motivated a more rigorous approach to dealing with other uncertainties, in order to understand the sensitivity of investment decisions to future uncertainty and to identify adaptation options that are as robust as possible. We have developed and coupled a system of models that includes a weather generator and simulations of catchment hydrology, demand for water and the water resource system. This integrated model has been applied to the Thames catchment, which supplies the city of London, UK. This region is one of the driest in the UK and hence sensitive to water availability. In addition, it is one of the fastest growing parts of the UK and plays an important economic role. Key uncertainties in long term water resources in the Thames catchment, many of which result from earth system processes, are identified and quantified. The implications of these uncertainties are explored using a combination of uncertainty analysis and sensitivity testing. The analysis shows considerable uncertainty in future rainfall, river flow and consequently water resources. For example, results indicate that by the 2050s, low flow (Q95) in the Thames catchment will range from -44% to +9% compared with the control scenario (1970s). Consequently, by the 2050s the average number of drought days is expected to increase 4-6 times relative to the 1970s. Uncertainties associated with urban growth increase these risks further.
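
A small sketch of the Q95 and drought-day metrics used above: Q95 is the flow exceeded 95% of the time, i.e. the 5th percentile of daily flows, and drought days here are days below the control-period Q95 threshold. The synthetic flow distributions below are invented and are not the Thames results.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic daily flows (m^3/s), 30 years each, for a control period and a
# perturbed future scenario (invented distributions, illustration only).
control = rng.lognormal(mean=3.0, sigma=0.6, size=365 * 30)
future = rng.lognormal(mean=2.8, sigma=0.7, size=365 * 30)

def q95(flows):
    """Q95: the flow exceeded 95% of the time (5th percentile of flows)."""
    return np.percentile(flows, 5)

change_pct = 100.0 * (q95(future) - q95(control)) / q95(control)

# Drought days: days with flow below the control-period Q95 threshold.
threshold = q95(control)
drought_days_per_year = (future < threshold).sum() / 30
```

By construction the control period has about 18 such days per year (5% of 365); a drier, more variable scenario multiplies that count, which is the effect the abstract reports.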

  8. Identifying sources of uncertainty to generate supply chain redesign strategies

    NARCIS (Netherlands)

    Vorst, van der J.G.A.J.; Beulens, A.J.M.

    2002-01-01

    Dynamic demands and constraints imposed by a rapidly changing business environment make it increasingly necessary for companies in the food supply chain to cooperate with each other. The main questions individual (food) companies face are whether, why, how and with whom they should start supply

  9. Multimodel Uncertainty Changes in Simulated River Flows Induced by Human Impact Parameterizations

    Science.gov (United States)

    Liu, Xingcai; Tang, Qiuhong; Cui, Huijuan; Mu, Mengfei; Gerten, Dieter; Gosling, Simon; Masaki, Yoshimitsu; Satoh, Yusuke; Wada, Yoshihide

    2017-01-01

    Human impacts increasingly affect the global hydrological cycle and indeed dominate hydrological changes in some regions. Hydrologists have sought to identify the human-impact-induced hydrological variations by parameterizing anthropogenic water uses in global hydrological models (GHMs). The consequently increased model complexity is likely to introduce additional uncertainty among GHMs. Here, using four GHMs, between-model uncertainties are quantified in terms of the signal-to-noise ratio (SNR) for average river flow during 1971-2000, simulated in two experiments: with representation of human impacts (VARSOC) and without (NOSOC). It is the first quantitative investigation of between-model uncertainty resulting from the inclusion of human impact parameterizations. Results show that the between-model uncertainties in terms of SNRs in the VARSOC annual flow are larger (about 2 for the globe, with varied magnitude for different basins) than those in the NOSOC, particularly in most areas of Asia and in areas north of the Mediterranean Sea. The SNR differences are mostly negative (-20 to 5, indicating higher uncertainty) for basin-averaged annual flow. The VARSOC high flow shows slightly lower uncertainties than the NOSOC simulations, with SNR differences mostly ranging from -20 to 20. The uncertainty differences between the two experiments are significantly related to the fraction of irrigated area in each basin. The large additional uncertainties in VARSOC simulations, introduced by the inclusion of parameterizations of human impacts, raise an urgent need for GHM development based on a better understanding of human impacts. Differences in the parameterizations of irrigation, reservoir regulation and water withdrawals are discussed as potential directions of improvement for future GHM development. We also discuss the advantages of statistical approaches to reduce the between-model uncertainties, and the importance of calibration of GHMs for not only
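
The SNR diagnostic described above can be sketched as follows: for each basin, divide the across-model ensemble mean by the across-model spread, and compare the two experiments. The ensemble below is synthetic (invented sizes and noise levels), not actual GHM output; the extra spread in the VARSOC runs stands in for the uncertainty added by human-impact parameterizations.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy ensemble: annual mean flow for 10 basins simulated by 4 GHMs under two
# experiments. The VARSOC runs carry larger model-specific spread.
n_models, n_basins = 4, 10
signal = rng.uniform(50, 500, n_basins)                      # "true" flows
nosoc = signal + rng.normal(0, 20, (n_models, n_basins))
varsoc = signal + rng.normal(0, 60, (n_models, n_basins))

def snr(runs):
    """Signal-to-noise ratio across models: ensemble mean / ensemble spread."""
    return runs.mean(axis=0) / runs.std(axis=0)

# Negative differences indicate higher between-model uncertainty in VARSOC.
snr_diff = snr(varsoc) - snr(nosoc)
```

With only four models the per-basin spread estimate is noisy, so robust summaries (e.g. the median SNR over basins) are preferable to single-basin comparisons.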

  10. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    Science.gov (United States)

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  11. Medical Need, Equality, and Uncertainty.

    Science.gov (United States)

    Horne, L Chad

    2016-10-01

    Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality. © 2016 John Wiley & Sons Ltd.

  12. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    Science.gov (United States)

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  13. Principal component analysis of normalized full spectrum mass spectrometry data in multiMS-toolbox: An effective tool to identify important factors for classification of different metabolic patterns and bacterial strains.

    Science.gov (United States)

    Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora

    2018-06-15

    Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for application of principal component analysis (PCA) in mass spectrometry and focused on two whole spectrum based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full spectrum data analysis. We used this technique to identify different metabolic patterns in the bacterial culture of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities, ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R software tool for advanced peak registration and detailed explorative statistical analysis, were implemented. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex speed MALDI-TOF mass spectrometer. For three tested cultivation media only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied on data normalized by two different normalization techniques. Results from matched peak data and subsequent detailed full spectrum analysis identified only two different metabolic patterns - a cultivation on Enterobacter sakazakii Isolation Agar showed significant differences to the cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved the dependence on cultivation time. Both whole spectrum based normalization techniques together with the full spectrum PCA allow identification of important discriminative factors in experiments with several variable condition factors, avoiding any problems with improper identification of peaks or emphasis on below-threshold peak data. The amount of processed data remains manageable.
Both implemented software utilities are available
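
The whole-spectrum normalization and PCA pipeline the abstract describes can be sketched as follows. This is a minimal illustration with synthetic spectra and total-ion-current (TIC) normalization as an assumed whole-spectrum technique; it is not the ms-alone/multiMS-toolbox implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": 12 samples x 500 m/z bins, two metabolic patterns
# (illustrative data, not from the study).
base = rng.random(500)
group = np.repeat([0, 1], 6)
spectra = base + 0.05 * rng.random((12, 500))
spectra[group == 1, 100:110] += 0.8        # pattern difference in a peak region
spectra *= rng.uniform(0.5, 2.0, (12, 1))  # varying total ion current per sample

# Whole-spectrum normalization: divide each spectrum by its total intensity (TIC)
tic = spectra.sum(axis=1, keepdims=True)
norm = spectra / tic

# PCA via SVD on the mean-centered data matrix
centered = norm - norm.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * S  # sample coordinates on the principal components

# The two metabolic patterns separate along PC1
print(scores[:, 0].round(3))
```

Because the TIC normalization cancels the per-sample intensity scaling exactly, the leading component is free to capture the metabolic difference rather than instrument throughput.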

  14. Formal consensus to identify clinically important changes in management resulting from the use of cardiovascular magnetic resonance (CMR) in patients who activate the primary percutaneous coronary intervention (PPCI) pathway.

    Science.gov (United States)

    Pufulete, Maria; Brierley, Rachel C; Bucciarelli-Ducci, Chiara; Greenwood, John P; Dorman, Stephen; Anderson, Richard A; Harris, Jessica; McAlindon, Elisa; Rogers, Chris A; Reeves, Barnaby C

    2017-06-22

    To define important changes in management arising from the use of cardiovascular magnetic resonance (CMR) in patients who activate the primary percutaneous coronary intervention (PPCI) pathway. Formal consensus study using literature review and cardiologist expert opinion to formulate consensus statements and setting up a consensus panel to review the statements (by completing a web-based survey, attending a face-to-face meeting to discuss survey results and modify the survey to reflect group discussion and completing the modified survey to determine which statements were in consensus). Formulation of consensus statements: four cardiologists (two CMR and two interventional) and six non-clinical researchers. Formal consensus: seven cardiologists (two CMR and three interventional, one echocardiography and one heart failure). Forty-nine additional cardiologists completed the modified survey. Thirty-seven draft statements describing changes in management following CMR were generated; these were condensed into 12 statements and reviewed through the formal consensus process. Three of 12 statements were classified in consensus in the first survey; these related to the role of CMR in identifying the cause of out-of-hospital cardiac arrest, providing a definitive diagnosis in patients found to have unobstructed arteries on angiography and identifying patients with left ventricular thrombus. Two additional statements were in consensus in the modified survey, relating to the ability of CMR to identify patients who have a poor prognosis after PPCI and assess ischaemia and viability in patients with multivessel disease. There was consensus that CMR leads to clinically important changes in management in five subgroups of patients who activate the PPCI pathway. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. MicroRNAs regulate T-cell production of interleukin-9 and identify hypoxia-inducible factor-2α as an important regulator of T helper 9 and regulatory T-cell differentiation.

    Science.gov (United States)

    Singh, Yogesh; Garden, Oliver A; Lang, Florian; Cobb, Bradley S

    2016-09-01

    MicroRNAs (miRNAs) regulate many aspects of helper T cell (Th) development and function. Here we found that they are required for the suppression of interleukin-9 (IL-9) expression in Th9 cells and other Th subsets. Two highly related miRNAs (miR-15b and miR-16) that we previously found to play an important role in regulatory T (Treg) cell differentiation were capable of suppressing IL-9 expression when they were over-expressed in Th9 cells. We used these miRNAs as tools to identify novel regulators of IL-9 expression and found that they could regulate the expression of Epas1, which encodes hypoxia-inducible factor (HIF)-2α. HIF proteins regulate metabolic pathway usage that is important in determining appropriate Th differentiation. The related protein, HIF-1α enhances Th17 differentiation and inhibits Treg cell differentiation. Here we found that HIF-2α was required for IL-9 expression in Th9 cells, but its expression was not sufficient in other Th subsets. Furthermore, HIF-2α suppressed Treg cell differentiation like HIF-1α, demonstrating both similar and distinct roles of the HIF proteins in Th differentiation and adding a further dimension to their function. Ironically, even though miR-15b and miR-16 suppressed HIF-2α expression in Treg cells, inhibiting their function in Treg cells did not lead to an increase in IL-9 expression. Therefore, the physiologically relevant miRNAs that regulate IL-9 expression in Treg cells and other subsets remain unknown. Nevertheless, the analysis of miR-15b and miR-16 function led to the discovery of the importance of HIF-2α so this work demonstrated the utility of studying miRNA function to identify novel regulatory pathways in helper T-cell development. © 2016 John Wiley & Sons Ltd.

  16. Assessment of uncertainties associated with characterization of geological environment in the Tono area. Japanese fiscal year, 2006 (Contract research)

    International Nuclear Information System (INIS)

    Toida, Masaru; Suyama, Yasuhiro; Seno, Shoji; Atsumi, Hiroyuki; Ogata, Nobuhisa

    2008-03-01

    'Geoscientific research' performed at the Tono Geoscience Center is developing site investigation, characterization and assessment techniques for understanding the geological environment. Important themes are to establish a methodology for analyzing uncertainties in heterogeneous geological environments and to develop investigation techniques for reducing these uncertainties efficiently. This study proposes a new approach in which all the possible options in the models and data sets that cannot be excluded in light of the available evidence are identified. This approach enables uncertainties associated with the understanding at a given stage of site characterization to be made explicit using an uncertainty analysis technique based on fuzzy geostatistics. This, in turn, supports the design of the following investigation stage to reduce the uncertainties efficiently. In the study, current knowledge was compiled and the technique advanced through geological modeling and groundwater analyses in the Tono area. This report systematizes the uncertainty analysis methodology associated with characterization of the geological environment and organizes the procedure of the methodology with application examples from the study. The report also deals with investigation techniques for reducing the uncertainties efficiently, and with underground facility design options for handling geological uncertainties based on the characterization of the geological environment. (author)

  17. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  18. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  19. Risks, uncertainty, vagueness

    International Nuclear Information System (INIS)

    Haefele, W.; Renn, O.; Erdmann, G.

    1990-01-01

    The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability especially in the context of political decision finding. (DG) [de

  20. Measurement Errors and Uncertainties Theory and Practice

    CERN Document Server

    Rabinovich, Semyon G

    2006-01-01

    Measurement Errors and Uncertainties addresses the most important problems that physicists and engineers encounter when estimating errors and uncertainty. Building from the fundamentals of measurement theory, the author develops the theory of accuracy of measurements and offers a wealth of practical recommendations and examples of applications. This new edition covers a wide range of subjects, including: - Basic concepts of metrology - Measuring instruments characterization, standardization and calibration - Estimation of errors and uncertainty of single and multiple measurements - Modern probability-based methods of estimating measurement uncertainty With this new edition, the author completes the development of the new theory of indirect measurements. This theory provides more accurate and efficient methods for processing indirect measurement data. It eliminates the need to calculate the correlation coefficient - a stumbling block in measurement data processing - and offers for the first time a way to obtain...

  1. Uncertainty in prediction and in inference

    International Nuclear Information System (INIS)

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which essentially is the absolute value of the matrix element between the states. The importance of this result to the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support
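
A statistical distance between probability distributions of the kind the abstract invokes has a common closed form: the angle whose cosine is the Bhattacharyya coefficient (Wootters' distance). The sketch below assumes that form, which may differ in detail from the measure derived in the paper.

```python
import numpy as np

def statistical_distance(p, q):
    """Wootters-style statistical distance between two discrete probability
    distributions: arccos of the Bhattacharyya coefficient sum(sqrt(p*q))."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    bc = np.sum(np.sqrt(p * q))  # Bhattacharyya coefficient, in [0, 1]
    return np.arccos(np.clip(bc, 0.0, 1.0))

# Identical distributions are at distance 0; disjoint ones at pi/2
print(statistical_distance([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(statistical_distance([1, 0], [0, 1]))          # ~1.5708 (pi/2)
```

For quantum states, replacing probabilities by squared amplitudes makes the cosine of this angle the absolute value of the overlap between the states, matching the distinguishability measure described above.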

  2. Piezoelectric energy harvesting with parametric uncertainty

    International Nuclear Information System (INIS)

    Ali, S F; Friswell, M I; Adhikari, S

    2010-01-01

    The design and analysis of energy harvesting devices has become increasingly important in recent years. Most of the literature has focused on the deterministic analysis of these systems, and the problem of uncertain parameters has received less attention. Energy harvesting devices exhibit parametric uncertainty due to errors in measurement, errors in modelling and variability in the parameters during manufacture. This paper investigates the effect of parametric uncertainty in the mechanical system on the harvested power, and derives approximate explicit formulae for the optimal electrical parameters that maximize the mean harvested power. The maximum of the mean harvested power decreases with increasing uncertainty, and the optimal frequency at which the maximum mean power occurs shifts. The effects of the parameter variance on the optimal electrical time constant and optimal coupling coefficient are reported. Monte Carlo based simulation results are used to further analyse the system under parametric uncertainty.
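
The qualitative finding above — mean harvested power falling as parametric uncertainty grows — can be reproduced with a Monte Carlo sketch over a generic single-degree-of-freedom resonance curve. The power model and damping ratio here are assumptions for illustration, not the paper's coupled electromechanical equations.

```python
import numpy as np

rng = np.random.default_rng(1)

def harvested_power(omega, omega_n, zeta=0.05):
    """Illustrative normalized power of a resonant harvester: a generic
    SDOF frequency-response magnitude, not the paper's harvester model."""
    r = omega / omega_n
    return r**2 / ((1 - r**2) ** 2 + (2 * zeta * r) ** 2)

omega = 1.0                    # fixed excitation frequency, at nominal resonance
n = 20000
mean_power = {}
for cov in (0.0, 0.05, 0.10):  # coefficient of variation of the natural frequency
    omega_n = 1.0 + cov * rng.standard_normal(n)  # uncertain natural frequency
    mean_power[cov] = harvested_power(omega, omega_n).mean()

# Mean power at resonance drops as parametric uncertainty grows
print(mean_power)
```

Because the resonance peak is sharp, even modest scatter in the natural frequency pushes many realizations off resonance, which is why the mean power falls monotonically with the coefficient of variation.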

  3. General Practitioners' Experiences of, and Responses to, Uncertainty in Prostate Cancer Screening: Insights from a Qualitative Study.

    Directory of Open Access Journals (Sweden)

    Kristen Pickles

    Prostate-specific antigen (PSA) testing for prostate cancer is controversial. There are unresolved tensions and disagreements amongst experts, and clinical guidelines conflict. This both reflects and generates significant uncertainty about the appropriateness of screening. Little is known about general practitioners' (GPs') perspectives and experiences in relation to PSA testing of asymptomatic men. In this paper we asked the following questions: (1) What are the primary sources of uncertainty as described by GPs in the context of PSA testing? (2) How do GPs experience and respond to different sources of uncertainty? This was a qualitative study that explored general practitioners' current approaches to, and reasoning about, PSA testing of asymptomatic men. We draw on accounts generated from interviews with 69 general practitioners located in Australia (n = 40) and the United Kingdom (n = 29). The interviews were conducted in 2013-2014. Data were analysed using grounded theory methods. Uncertainty in PSA testing was identified as a core issue. Australian GPs reported experiencing substantially more uncertainty than UK GPs. This seemed partly explainable by notable differences in conditions of practice between the two countries. Using Han et al.'s taxonomy of uncertainty as an initial framework, we first outline the different sources of uncertainty GPs (mostly Australian) described encountering in relation to prostate cancer screening and what the uncertainty was about. We then suggest an extension to Han et al.'s taxonomy based on our analysis of data relating to the varied ways that GPs manage uncertainties in the context of PSA testing. We outline three broad strategies: (1) taking charge of uncertainty; (2) engaging others in managing uncertainty; and (3) transferring the responsibility for reducing or managing some uncertainties to other parties. Our analysis suggests some GPs experienced uncertainties associated with ambiguous guidance and the

  4. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    ]. There are also bootstrapping and cross-validation approaches. Sometimes analyses are conducted using surrogate models [12]. The availability of so many options can be confusing. Categorizing methods based on fundamental questions assists in communicating the essential results of uncertainty analyses to stakeholders. Such questions can focus on model adequacy (e.g., How well does the model reproduce observed system characteristics and dynamics?) and sensitivity analysis (e.g., What parameters can be estimated with available data? What observations are important to parameters and predictions? What parameters are important to predictions?), as well as on the uncertainty quantification (e.g., How accurate and precise are the predictions?). The methods can also be classified by the number of model runs required: few (10s to 1000s) or many (10,000s to 1,000,000s). Of the methods listed above, the most computationally frugal are generally those based on local derivatives; MCMC methods tend to be among the most computationally demanding. Surrogate models (emulators) do not necessarily produce computational frugality because many runs of the full model are generally needed to create a meaningful surrogate model. With this categorization, we can, in general, address all the fundamental questions mentioned above using either computationally frugal or demanding methods. Model development and analysis can thus be conducted consistently using either computationally frugal or demanding methods; alternatively, different fundamental questions can be addressed using methods that require different levels of effort. Based on this perspective, we pose the question: Can computationally frugal methods be useful companions to computationally demanding methods? 
    The reliability of computationally frugal methods generally depends on the model being reasonably linear, which usually means smooth nonlinearities and the assumption of Gaussian errors; both tend to be more valid with more linear
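
The contrast drawn above between computationally frugal local-derivative methods and demanding sampling methods can be illustrated on a toy model. The two-parameter function and the perturbation size are hypothetical; for a near-linear model the two approaches approximately agree.

```python
import numpy as np

def model(p):
    """Toy stand-in for an environmental model: a smooth nonlinear
    function of two parameters (hypothetical, for illustration only)."""
    k, n = p
    return k * np.exp(-n) + n**2

p0 = np.array([2.0, 0.5])

# Computationally frugal: local derivatives via finite differences
# (one extra model run per parameter).
h = 1e-6
grad = np.array([
    (model(p0 + h * np.eye(2)[i]) - model(p0)) / h for i in range(2)
])
print("local sensitivities:", grad.round(4))

# Computationally demanding: Monte Carlo propagation (thousands of runs).
rng = np.random.default_rng(0)
samples = p0 + 0.1 * rng.standard_normal((10000, 2))
outputs = np.array([model(p) for p in samples])
print("MC output std:", outputs.std().round(4))

# Linearized error propagation predicts std = sqrt(sum((grad_i * sigma_i)^2)),
# which is close to the Monte Carlo result when the model is near-linear.
print("linearized std:", np.sqrt(((grad * 0.1) ** 2).sum()).round(4))
```

The frugal estimate here costs three model runs versus ten thousand, which is exactly the trade-off the abstract poses.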

  5. Modeling Uncertainty in Climate Change: A Multi-Model Comparison

    Energy Technology Data Exchange (ETDEWEB)

    Gillingham, Kenneth; Nordhaus, William; Anthoff, David; Blanford, Geoffrey J.; Bosetti, Valentina; Christensen, Peter; McJeon, Haewon C.; Reilly, J. M.; Sztorc, Paul

    2015-10-01

    The economics of climate change involves a vast array of uncertainties, complicating both the analysis and development of climate policy. This study presents the results of the first comprehensive study of uncertainty in climate change using multiple integrated assessment models. The study looks at model and parametric uncertainties for population, total factor productivity, and climate sensitivity and estimates the pdfs of key output variables, including CO2 concentrations, temperature, damages, and the social cost of carbon (SCC). One key finding is that parametric uncertainty is more important than uncertainty in model structure. Our resulting pdfs also provide insight on tail events.

  6. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  7. Development of a Dynamic Lidar Uncertainty Framework

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, Andrew [WindForS; Bonin, Timothy [CIRES/NOAA ESRL; Choukulkar, Aditya [CIRES/NOAA ESRL; Brewer, W. Alan [NOAA ESRL; Delgado, Ruben [University of Maryland Baltimore County

    2017-08-07

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. 
The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict

  8. The Importance of Uncertainty and Sensitivity Analysis in Process-based Models of Carbon and Nitrogen Cycling in Terrestrial Ecosystems with Particular Emphasis on Forest Ecosystems — Selected Papers from a Workshop Organized by the International Society for Ecological Modelling (ISEM) at the Third Biennal Meeting of the International Environmental Modelling and Software Society (IEMSS) in Burlington, Vermont, USA, August 9-13, 2006

    Science.gov (United States)

    Larocque, Guy R.; Bhatti, Jagtar S.; Liu, Jinxun; Ascough, James C.; Gordon, Andrew M.

    2008-01-01

    Many process-based models of carbon (C) and nitrogen (N) cycles have been developed for terrestrial ecosystems, including forest ecosystems. They address many basic issues of ecosystem structure and functioning, such as the role of internal feedback in ecosystem dynamics. The critical factor in these phenomena is scale, as these processes operate at scales from the minute (e.g. particulate pollution impacts on trees and other organisms) to the global (e.g. climate change). Research efforts remain important to improve the capability of such models to better represent the dynamics of terrestrial ecosystems, including the C, nutrient (e.g. N) and water cycles. Existing models are sufficiently well advanced to help decision makers develop sustainable management policies and planning of terrestrial ecosystems, as they make realistic predictions when used appropriately. However, decision makers must be aware of their limitations by having the opportunity to evaluate the uncertainty associated with process-based models (Smith and Heath, 2001; Allen et al., 2004). The variation in scale of issues currently being addressed by modelling efforts makes the evaluation of uncertainty a daunting task.

  9. Do systematic reviews address community healthcare professionals' wound care uncertainties? Results from evidence mapping in wound care.

    Science.gov (United States)

    Christie, Janice; Gray, Trish A; Dumville, Jo C; Cullum, Nicky A

    2018-01-01

    Complex wounds such as leg and foot ulcers are common, resource intensive and have negative impacts on patients' wellbeing. Evidence-based decision-making, substantiated by high quality evidence such as from systematic reviews, is widely advocated for improving patient care and healthcare efficiency. Consequently, we set out to classify and map the extent to which up-to-date systematic reviews containing robust evidence exist for wound care uncertainties prioritised by community-based healthcare professionals. We asked healthcare professionals to prioritise uncertainties based on complex wound care decisions, and then classified 28 uncertainties according to the type and level of decision. For each uncertainty, we searched for relevant systematic reviews. Two independent reviewers screened abstracts and full texts of reviews against the following criteria: meeting an a priori definition of a systematic review, sufficiently addressing the uncertainty, published during or after 2012, and identifying high quality research evidence. The most common uncertainty type was 'interventions' (24/28, 85%); the majority concerned wound-level decisions (15/28, 53%); however, service delivery-level decisions (10/28) were given the highest priority. Overall, we found 162 potentially relevant reviews, of which 57 (35%) were not systematic reviews. Of 106 systematic reviews, only 28 were relevant to an uncertainty and 18 of these were published within the preceding five years; none identified high quality research evidence. Despite the growing volume of published primary research, healthcare professionals delivering wound care have important clinical uncertainties which are not addressed by up-to-date systematic reviews containing high certainty evidence. These are high priority topics requiring new research and systematic reviews which are regularly updated. 
To reduce clinical and research waste, we recommend systematic reviewers and researchers make greater efforts to ensure that research

  10. An appraisal of uncertainties in the Western Australian wine industry supply chain

    OpenAIRE

    Islam, Nazrul; Quaddus, Mohammed

    2005-01-01

    Wine is one of the significant export items of Western Australia. In 2001/2002, the State’s wine exports amounted to about A$42 million. Despite its economic importance, research on the supply chain aspects of the WA wine industry is rather limited. This paper presents the sources of uncertainties in the WA wine supply chain based on the results of an electronic focus group study with WA wine industry stakeholders. The group identified 74 items of uncertainties, which were then grouped into 26 unique ...

  11. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    Science.gov (United States)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of methods, and development of visualization tools, but also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  12. Climate change decision-making: Model & parameter uncertainties explored

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, H.; Kandlikar, M.; Linville, C.

    1995-12-31

    A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analysis of these uncertainties serves to inform decision makers about the likely outcome of policy initiatives and helps set priorities for research so that the outcome ambiguities faced by decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.1. This model includes representations of the processes of demographics, economic activity, emissions, atmospheric chemistry, climate and sea level change, impacts from these changes, and policies for emissions mitigation and adaptation to change. The model has over 800 objects, of which about one half are used to represent uncertainty. In this paper we show that, when considering parameter uncertainties, the relative contribution of climatic uncertainties is largest, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. When considering model structure uncertainties, we find that the choice of policy is often dominated by the model structure choice rather than by parameter uncertainties.
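
The idea of ranking parameter groups by their contribution to output variance can be sketched with a toy additive model. The group names and coefficients are hypothetical, chosen so that the "climate" group dominates in the spirit of the paper's parametric ranking; this is not ICAM.

```python
import numpy as np

rng = np.random.default_rng(2)

def outcome(climate, damage, economy):
    """Toy stand-in for an integrated assessment output (illustrative only)."""
    return 1.5 * climate + 0.8 * damage + 0.4 * economy

n = 100_000
# Independent parameter groups with unit variance
base = {name: rng.standard_normal(n) for name in ("climate", "damage", "economy")}
total_var = outcome(**base).var()

# First-order contribution of each group: output variance when only
# that group varies and the others are frozen at their nominal value.
contrib = {}
for name in base:
    frozen = {k: (v if k == name else np.zeros(n)) for k, v in base.items()}
    contrib[name] = outcome(**frozen).var() / total_var

# Climate dominates (~0.74 of the variance with these assumed coefficients)
print({k: round(v, 3) for k, v in contrib.items()})
```

For an additive model like this one the group contributions sum to one; interactions between groups would show up as a residual share.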

  13. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are all based on either a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality with either Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.

  14. Compliance uncertainty of diameter characteristic in the next-generation geometrical product specifications and verification

    International Nuclear Information System (INIS)

    Lu, W L; Jiang, X; Liu, X J; Xu, Z G

    2008-01-01

    Compliance uncertainty is one of the most important elements in the next-generation geometrical product specifications and verification (GPS). It consists of specification uncertainty, method uncertainty and implementation uncertainty, which are three of the four fundamental uncertainties in the next-generation GPS. This paper analyzes the key factors that influence compliance uncertainty and then proposes a procedure to manage it. A general model for the evaluation of compliance uncertainty has been devised, and a specific formula for the diameter characteristic has been derived from this general model. A case study was conducted, and it revealed that the completeness of the currently dominant diameter characteristic specification needs to be improved

  15. Which uncertainty is important in multistage stochastic programmes?

    DEFF Research Database (Denmark)

    Pantuso, Giovanni; Fagerholt, Kjetil; Wallace, Stein W.

    2017-01-01

    , performed before data collection, can indicate which information should be primarily sought, and which is not critical for the final decision. We apply the analysis to a real-life instance of the maritime fleet renewal. Results show that some properties of the stochastic phenomena, such as the correlation...

  16. Uncertainty importance measure for models with correlated normal variables

    International Nuclear Information System (INIS)

    Hao, Wenrui; Lu, Zhenzhou; Wei, Pengfei

    2013-01-01

    In order to explore the contributions of correlated input variables to the variance of the model output, the contribution decomposition of the correlated input variables based on Mara's definition is investigated in detail. Taking a quadratic polynomial output without cross terms as an illustration, the solution of the contribution decomposition is derived analytically using statistical inference theory. After the correctness of the analytical solutions is validated by numerical examples, they are applied to two engineering examples to show their wide applicability. The derived analytical solutions can be used directly to recognize the contributions of the correlated input variables in the case of a quadratic or linear polynomial output without cross terms, and the analytical inference method can be extended to the case of higher order polynomial outputs. Additionally, the origins of the interaction contribution of the correlated inputs are analyzed, and comparisons of the existing contribution indices are completed, from which the engineer can select suitable indices to obtain the necessary information. Finally, the degeneration of correlated inputs to uncorrelated ones and some computational issues are discussed conceptually
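
    As a minimal numerical sketch of the covariance-based contribution idea (not Mara's full decomposition), the following estimates each correlated input's share of the output variance for a toy linear model and checks it against the analytic values; the model, coefficients and correlation used here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative linear model Y = a1*X1 + a2*X2 with correlated standard
# normal inputs (coefficients and correlation are made up).
a1, a2, rho = 2.0, 1.0, 0.5
cov = np.array([[1.0, rho], [rho, 1.0]])
x = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
y = a1 * x[:, 0] + a2 * x[:, 1]

# Covariance-based contribution of each input to Var(Y):
# C_i = Cov(a_i X_i, Y) / Var(Y); for a linear model these sum to 1.
var_y = y.var(ddof=1)
c1 = np.cov(a1 * x[:, 0], y)[0, 1] / var_y
c2 = np.cov(a2 * x[:, 1], y)[0, 1] / var_y

# Analytic values: Var(Y) = a1^2 + a2^2 + 2*a1*a2*rho and
# Cov(a1 X1, Y) = a1^2 + a1*a2*rho, etc.
var_exact = a1**2 + a2**2 + 2 * a1 * a2 * rho
c1_exact = (a1**2 + a1 * a2 * rho) / var_exact
c2_exact = (a2**2 + a1 * a2 * rho) / var_exact
print(c1, c1_exact)   # Monte Carlo and analytic contributions agree
```

Note how the correlation term appears inside each input's contribution; this is exactly why the contributions of correlated inputs cannot be read off from the coefficients alone.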

  17. Impact of Pitot tube calibration on the uncertainty of water flow rate measurement

    Science.gov (United States)

    de Oliveira Buscarini, Icaro; Costa Barsaglini, Andre; Saiz Jabardo, Paulo Jose; Massami Taira, Nilson; Nader, Gilder

    2015-10-01

    Water utility companies often use Cole-type Pitot tubes to map velocity profiles and thus measure flow rate. Frequent monitoring and measurement of flow rate is an important step in identifying leaks and other types of losses. In Brazil losses as high as 42% are common, and in some places even higher values are found. When using Cole-type Pitot tubes to measure the flow rate, the uncertainty of the calibration coefficient (Cd) is a major component of the overall flow rate measurement uncertainty. A common practice is to employ the usual value Cd = 0.869, in use since Cole proposed his Pitot tube in 1896. Analysis of 414 calibrations of Cole-type Pitot tubes shows that Cd varies considerably and that values as high as 0.020 for the expanded uncertainty are common. Combined with other uncertainty sources, the overall velocity measurement uncertainty is 0.02, increasing flow rate measurement uncertainty by 1.5%, which, for the Sao Paulo metropolitan area (Brazil), corresponds to 3.5 × 10⁷ m³/year.
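
    The dominance of the Cd calibration term can be illustrated with a GUM-style root-sum-square combination of relative uncertainties for the usual Pitot relation v = Cd·sqrt(2Δp/ρ). Only Cd = 0.869 and its expanded uncertainty of 0.020 come from the abstract; the pressure and density uncertainties below are assumed purely for illustration.

```python
import math

# GUM-style combination of relative standard uncertainties for a
# Cole-type Pitot velocity, v = Cd * sqrt(2*dp/rho).
Cd, U_Cd, k = 0.869, 0.020, 2.0
u_Cd_rel = (U_Cd / k) / Cd      # standard relative uncertainty of Cd

u_dp_rel = 0.005                # assumed pressure-reading uncertainty
u_rho_rel = 0.001               # assumed water-density uncertainty

# The square-root law gives sensitivity 1/2 for dp and rho.
u_v_rel = math.sqrt(u_Cd_rel**2 + (0.5 * u_dp_rel)**2
                    + (0.5 * u_rho_rel)**2)
print(f"relative velocity uncertainty: {u_v_rel:.4f}")
# The Cd term dominates: dropping the other two changes u_v_rel by <3%.
```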

  18. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    Science.gov (United States)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless we still find high concentrations in measurements, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify the uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.

  19. Predicting ecological responses in a changing ocean: the effects of future climate uncertainty.

    Science.gov (United States)

    Freer, Jennifer J; Partridge, Julian C; Tarling, Geraint A; Collins, Martin A; Genner, Martin J

    2018-01-01

    Predicting how species will respond to climate change is a growing field in marine ecology, yet knowledge of how to incorporate the uncertainty from future climate data into these predictions remains a significant challenge. To help overcome it, this review separates climate uncertainty into its three components (scenario uncertainty, model uncertainty, and internal model variability) and identifies four criteria that constitute a thorough interpretation of an ecological response to climate change in relation to these parts (awareness, access, incorporation, communication). Through a literature review, the extent to which the marine ecology community has addressed these criteria in their predictions was assessed. Despite a high awareness of climate uncertainty, articles favoured the most severe emission scenario, and only a subset of climate models were used as input into ecological analyses. In the case of sea surface temperature, these models can have projections that are unrepresentative of a larger ensemble mean. Moreover, 91% of studies failed to incorporate the internal variability of a climate model into their results. We explored the influence that the choice of emission scenario, climate model, and model realisation can have when predicting the future distribution of the pelagic fish Electrona antarctica. Future distributions were highly influenced by the choice of climate model, and in some cases internal variability was important in determining the direction and severity of the distribution change. Increased clarity and availability of processed climate data would facilitate more comprehensive explorations of climate uncertainty, and increase the quality and standard of marine prediction studies.

  20. Robust nonlinear control of nuclear reactors under model uncertainty

    International Nuclear Information System (INIS)

    Park, Moon Ghu

    1993-02-01

    A nonlinear model-based control method is developed for the robust control of a nuclear reactor. The nonlinear plant model is used to design a unique control law which covers a wide operating range. Robustness is a crucial factor for the fully automatic control of reactor power due to time-varying, uncertain parameters, state estimation error, and unmodeled dynamics. A variable structure control (VSC) method is introduced which consists of an adaptive performance specification (fine control) after the tracking error reaches the narrow boundary layer by a time-optimal control (coarse control). Variable structure control is a powerful method for nonlinear system controller design which has inherent robustness to parameter variations or external disturbances using the known uncertainty bounds, and it requires very low computational effort. In spite of its desirable properties, conventional VSC presents several important drawbacks that limit its practical applicability. One of the most undesirable phenomena is chattering, which implies extremely high control activity and may excite high-frequency unmodeled dynamics. This problem is due to the neglected actuator time-delay or sampling effects. The problem was partially remedied by replacing chattering control with a smooth control interpolation in a boundary layer neighboring a time-varying sliding surface. But for nuclear reactor systems, which have very fast dynamic responses, the sampling effect may destroy the narrow boundary layer when a large uncertainty bound is used. Due to the very short neutron lifetime, a large uncertainty bound leads to high gain in feedback control. To resolve this problem, a derivative feedback is introduced that gives excellent performance by reducing the uncertainty bound. The stability of the tracking error dynamics is guaranteed by the second method of Lyapunov using the two-level uncertainty bounds that are obtained from the knowledge of the uncertainty bound and the estimated
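
    A toy simulation can illustrate the boundary-layer smoothing described above: replacing the discontinuous sign term with a saturation function confines the tracking error to a thin layer instead of chattering across the switching surface. The scalar plant, gains and disturbance below are illustrative assumptions, not the reactor model of the thesis.

```python
import math

# Boundary-layer smoothed variable structure control for a scalar
# uncertain plant x' = a*x + d(t) + u, with unknown a (|a| <= a_bound)
# and a bounded disturbance |d| <= D. All numbers are illustrative.
def sat(s):                      # continuous approximation of sign(s)
    return max(-1.0, min(1.0, s))

a_true, a_bound = 1.3, 2.0       # true parameter vs known worst-case bound
D, phi = 0.5, 0.05               # disturbance bound, boundary-layer width
K_lin, K_sw = a_bound, D + 0.3   # linear gain covers a; switching gain covers d

dt, x, t = 1e-3, 1.0, 0.0
for _ in range(10_000):
    d = D * math.sin(5.0 * t)                 # unmodeled disturbance
    u = -K_lin * x - K_sw * sat(x / phi)      # smoothed VSC law
    x += dt * (a_true * x + d + u)            # Euler integration
    t += dt

print(abs(x))   # error has settled inside (roughly) the boundary layer
```

With `sign(x)` instead of `sat(x/phi)` the same loop would chatter at the integration-step frequency, which is precisely the behavior the boundary layer is meant to suppress.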

  1. Large break LOCA uncertainty evaluation and comparison with conservative calculation

    International Nuclear Information System (INIS)

    Glaeser, H.G.

    2004-01-01

    The first formulation of the USA Code of Federal Regulations (CFR) 10CFR50 with applicable sections specific to NPP licensing requirements was released in 1976. Over a decade later, 10CFR 50.46 allowed the use of best-estimate (BE) codes instead of conservative code models, but uncertainties have to be identified and quantified. Guidelines were released that described applicable interpretations developed over the intervening years. Other countries established similar conservative procedures and acceptance criteria. Because conservative methods were used to calculate the peak values of key parameters, such as peak clad temperature (PCT), it was always acknowledged that a large margin existed between the 'conservative' calculated value and the 'true' value. Besides the USA, regulation in other countries, such as Germany, allowed the state of science and technology to be applied in licensing; i.e., growing experimental evidence and progress in code development over time could be used. There was no requirement to apply a pure evaluation methodology with licensed assumptions and frozen codes. The thermal-hydraulic system codes became more and more best-estimate codes based on comprehensive validation. This development was and is possible because the rules and guidelines provide the necessary latitude to consider further development of safety technology. Best-estimate codes are allowed to be used in licensing in combination with conservative initial and boundary conditions. However, uncertainty quantification is not required. Since some of the initial and boundary conditions are more conservative compared with those internationally used (e.g. 106% reactor power instead of 102%, a single failure plus a non-availability due to preventive maintenance is assumed, etc.), it is claimed that the uncertainties of code models are covered. Since many utilities apply for power increases, calculation results come closer to some licensing criteria. 
The situation in German licensing

  2. Quantifying uncertainty in NDSHA estimates due to earthquake catalogue

    Science.gov (United States)

    Magrin, Andrea; Peresan, Antonella; Vaccari, Franco; Panza, Giuliano

    2014-05-01

    The procedure for neo-deterministic seismic zoning, NDSHA, is based on the calculation of synthetic seismograms by the modal summation technique. This approach makes use of information about the space distribution of large magnitude earthquakes, which can be defined based on seismic history and seismotectonics, as well as incorporating information from a wide set of geological and geophysical data (e.g., morphostructural features and ongoing deformation processes identified by earth observations). Hence the method does not make use of attenuation models (GMPE), which may be unable to account for the complexity of the product between the seismic source tensor and the medium Green function and are often poorly constrained by the available observations. NDSHA defines the hazard from the envelope of the values of ground motion parameters determined considering a wide set of scenario earthquakes; accordingly, the simplest outcome of this method is a map where the maximum of a given seismic parameter is associated to each site. In NDSHA, uncertainties are not statistically treated as in PSHA, where aleatory uncertainty is traditionally handled with probability density functions (e.g., for magnitude and distance random variables) and epistemic uncertainty is considered by applying logic trees that allow the use of alternative models and alternative parameter values of each model; instead, the treatment of uncertainties is performed by sensitivity analyses for key modelling parameters. Fixing the uncertainty related to a particular input parameter is an important component of the procedure. The input parameters must account for the uncertainty in the prediction of fault radiation and in the use of Green functions for a given medium. A key parameter is the magnitude of the sources used in the simulation, which is based on catalogue information, seismogenic zones and seismogenic nodes. 
Because the largest part of the existing catalogues is based on macroseismic intensity, a rough estimate

  3. Uncertainty Quantification Bayesian Framework for Porous Media Flows

    Science.gov (United States)

    Demyanov, V.; Christie, M.; Erbas, D.

    2005-12-01

    Uncertainty quantification is an increasingly important aspect of many areas of applied science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of fluids through subsurface reservoirs is an example of a complex system where accuracy in prediction is needed (e.g. in the oil industry it is essential for financial reasons). Simulation of fluid flow in oil reservoirs is usually carried out using large, commercially written finite difference simulators solving conservation equations describing the multi-phase flow through the porous reservoir rocks, which is a highly computationally expensive task. This work examines a Bayesian framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed time series data. The framework is flexible for a wide range of general physical/statistical parametric models, which are used to describe the underlying hydro-geological process in its temporal dynamics. The approach is based on exploration of the parameter space and update of the prior beliefs about what the most likely model definitions are. Optimization problems for highly parametric physical models usually have multiple solutions, which impact the uncertainty of the predictions made. Stochastic search algorithms (e.g. genetic algorithms) allow multiple "good enough" models to be identified in the parameter space. Furthermore, inference over the generated model ensemble via an MCMC-based algorithm evaluates the posterior probability of the generated models and quantifies the uncertainty of the predictions. A machine learning algorithm - artificial neural networks (ANN) - is used to speed up the identification of regions in parameter space where good matches to observed data can be found. The adaptive nature of ANNs allows different ways of integrating them into the Bayesian framework to be developed: as direct time
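
    The stochastic sampling step can be sketched with a minimal Metropolis-Hastings chain. The one-dimensional Gaussian "misfit" below stands in for an expensive reservoir-simulator mismatch, and all numbers (true parameter, noise level, proposal width) are illustrative assumptions.

```python
import math, random

random.seed(1)

def log_posterior(theta):
    # Gaussian likelihood around a "true" parameter 2.0 with sigma=0.5,
    # flat prior on [-10, 10] (all values illustrative).
    if not -10.0 <= theta <= 10.0:
        return -math.inf
    return -0.5 * ((theta - 2.0) / 0.5) ** 2

theta, lp = 0.0, log_posterior(0.0)
samples = []
for i in range(20_000):
    prop = theta + random.gauss(0.0, 0.4)         # random-walk proposal
    lp_prop = log_posterior(prop)
    if math.log(random.random()) < lp_prop - lp:  # accept/reject step
        theta, lp = prop, lp_prop
    if i >= 5_000:                                # discard burn-in
        samples.append(theta)

mean = sum(samples) / len(samples)
print(f"posterior mean ~ {mean:.2f}")             # close to the true 2.0
```

In the real framework the cheap `log_posterior` would be replaced by the simulator misfit (or an ANN emulator of it), which is where the computational savings come from.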

  4. Methods and computer codes for probabilistic sensitivity and uncertainty analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1985-01-01

    This paper describes the methods and applications experience with two computer codes that are now available from the National Energy Software Center at Argonne National Laboratory. The purpose of the SCREEN code is to identify a group of the most important input variables of a code that has many (tens, hundreds) of input variables with uncertainties, and to do this without relying on judgment or exhaustive sensitivity studies. The purpose of the PROSA-2 code is to propagate uncertainties and calculate the distributions of interesting output variable(s) of a safety analysis code using response surface techniques, based on the same runs used for screening. Several applications are discussed, but the codes are generic, not tailored to any specific safety application code. They are compatible in terms of input/output requirements but also independent of each other; e.g., PROSA-2 can be used without first using SCREEN if a set of important input variables has first been selected by other methods. Also, although SCREEN can select cases to be run (by random sampling), a user can select cases by other methods if he so prefers, and still use the rest of SCREEN for identifying important input variables.
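
    The response-surface idea behind PROSA-2 can be sketched as follows: fit a cheap surface to a handful of code runs, then propagate the input distributions through the surface by Monte Carlo instead of re-running the code. The "expensive code" and all numbers below are illustrative assumptions, not PROSA-2's actual models.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for an expensive safety-analysis code (illustrative only).
def expensive_code(x1, x2):
    return 3.0 + 2.0 * x1 - 1.5 * x2 + 0.8 * x1 * x2

# A few screening runs at sampled input points.
X = rng.uniform(-1.0, 1.0, size=(20, 2))
y = expensive_code(X[:, 0], X[:, 1])

# Least-squares fit of a linear-plus-interaction response surface.
design = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                          X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

# Propagate uncertain inputs through the fitted surface by Monte Carlo.
u = rng.normal(0.0, 0.3, size=(100_000, 2))
surface = (coef[0] + coef[1] * u[:, 0] + coef[2] * u[:, 1]
           + coef[3] * u[:, 0] * u[:, 1])
print(surface.mean(), surface.std())   # output distribution, 20 code runs
```

Here 20 code runs support a 100,000-sample propagation; the real trade-off, as in PROSA-2, is whether the chosen surface form captures the code's behavior over the sampled range.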

  5. Uncertainty Quantification and Regional Sensitivity Analysis of Snow-related Parameters in the Canadian LAnd Surface Scheme (CLASS)

    Science.gov (United States)

    Badawy, B.; Fletcher, C. G.

    2017-12-01

    The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values, and hence of their uncertainty ranges, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.
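
    The permutation-importance idea used for the parameter ranking can be sketched with a hand-rolled emulator on synthetic data: shuffle one input column at a time and measure how much the emulator's error grows. A plain linear fit stands in for the SVR emulator and random forest of the study; the data and coefficients are entirely synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic "emulator training" data: input 0 matters a lot, input 1 a
# little, and input 2 is a pure nuisance with no effect on y.
X = rng.normal(size=(2_000, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0.0, 0.1, size=2_000)

# A plain linear fit stands in for the SVR emulator.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(M):
    return M @ coef[:3] + coef[3]

base_mse = np.mean((predict(X) - y) ** 2)

# Permutation importance: break one input's link to y and record the
# resulting increase in mean squared error.
importance = []
for j in range(3):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance.append(np.mean((predict(Xp) - y) ** 2) - base_mse)

print(importance)   # input 0 dominates; the nuisance input scores ~0
```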

  6. The uncertainty principle

    International Nuclear Information System (INIS)

    Martens, Hans.

    1991-01-01

    The subject of this thesis is the uncertainty principle (UP). The UP is one of the most characteristic points of difference between quantum and classical mechanics. The starting point of this thesis is the work of Niels Bohr; besides being discussed, this work is also analyzed. For the discussion of the different aspects of the UP, the formalism of Davies and Ludwig is used instead of the more commonly used formalism of von Neumann and Dirac. (author). 214 refs.; 23 figs

  7. Economic uncertainty principle?

    OpenAIRE

    Alexander Harin

    2006-01-01

    The economic principle of (hidden) uncertainty is presented. New probability formulas are offered. Examples of solutions of three types of fundamental problems are reviewed.

  8. Citizen Candidates Under Uncertainty

    OpenAIRE

    Eguia, Jon X.

    2005-01-01

    In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we te...

  9. Calibration Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
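
    A minimal sketch of the CUU idea, under the simplifying assumption that model error can be folded into the likelihood as an extra Gaussian variance term alongside the data error (one of several approaches discussed in the literature, not the report's definitive method). All data and sigmas below are made up for illustration.

```python
import math

# Calibrate theta in the toy model m(x) = theta * x against noisy data,
# with both a data-error and an assumed model-error term.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]             # synthetic observations of ~2*x

sigma_data, sigma_model = 0.2, 0.3    # assumed error standard deviations
var = sigma_data**2 + sigma_model**2  # combined variance per data point

# Plain least squares ignores the error model entirely.
theta_ls = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Grid posterior under the combined Gaussian error model (flat prior).
grid = [1.5 + 0.001 * i for i in range(1001)]
logp = [-0.5 * sum((y - t * x) ** 2 for x, y in zip(xs, ys)) / var
        for t in grid]
m = max(logp)
w = [math.exp(l - m) for l in logp]
theta_mean = sum(t * wi for t, wi in zip(grid, w)) / sum(w)
post_sd = math.sqrt(sum((t - theta_mean) ** 2 * wi
                        for t, wi in zip(grid, w)) / sum(w))
print(theta_ls, theta_mean, post_sd)
```

The point estimate barely moves, but the posterior width now reflects both error sources; setting `sigma_model = 0` recovers the narrower, overconfident interval of the "model is truth" formulation.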

  10. Participation under Uncertainty

    International Nuclear Information System (INIS)

    Boudourides, Moses A.

    2003-01-01

    This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate for policy studies are various theories of social complexity. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an increasing uncertainty. But there are different categories and models of uncertainties implying a variety of configurations of policy processes. A particular role in all of them is played by expertise whose democratization is an often-claimed imperative nowadays. Moreover, we discuss how different participatory arrangements are shaped into instruments of policy-making and framing regulatory processes. As participation necessitates and triggers deliberation, we proceed to examine the role and the barriers of deliberativeness. Finally, we conclude by referring to some critical views about the ultimate assumptions of recent European policy frameworks and the conceptions of civic participation and politicization that they invoke

  11. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, due to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site
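
    The robustness argument for the 90th percentile can be illustrated with a toy heavy-tailed output distribution; the lognormal parameters below are illustrative only, not taken from the repository case.

```python
import math, random

random.seed(3)

# Toy "annual dose" samples from a heavy-tailed lognormal distribution.
doses = sorted(math.exp(random.gauss(0.0, 1.5)) for _ in range(100_000))

n = len(doses)
mean = sum(doses) / n        # dragged upward by rare very large values
median = doses[n // 2]       # ~1 for this distribution
p90 = doses[int(0.9 * n)]    # robust upper summary of the distribution
print(f"median={median:.2f}  mean={mean:.2f}  p90={p90:.2f}")
```

For this distribution the mean sits far above the median because of the tail, while the 90th percentile is a stable order statistic; that stability is the "better robustness" the report invokes.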

  12. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions, compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
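
    The core DUA idea, propagating input uncertainty through derivatives at a reference point instead of through many statistical runs, can be sketched on a stand-in function (not the report's borehole model). One reference evaluation plus two analytic derivatives reproduces the output spread that a large Monte Carlo run delivers.

```python
import math, random

# Stand-in model for the expensive code (illustrative, not the borehole).
def f(a, b):
    return a ** 2 / b

a0, b0 = 3.0, 2.0                    # reference (nominal) input point
sa, sb = 0.1, 0.05                   # input standard deviations

# First-order sensitivities df/da and df/db at the reference point,
# of the kind a direct or adjoint sensitivity calculation would supply.
dfda = 2 * a0 / b0
dfdb = -a0 ** 2 / b0 ** 2
sigma_lin = math.sqrt((dfda * sa) ** 2 + (dfdb * sb) ** 2)

# Brute-force Monte Carlo for comparison (the many-run statistical route).
random.seed(4)
ys = [f(random.gauss(a0, sa), random.gauss(b0, sb)) for _ in range(200_000)]
mu = sum(ys) / len(ys)
sigma_mc = math.sqrt(sum((y - mu) ** 2 for y in ys) / len(ys))
print(sigma_lin, sigma_mc)           # the two estimates agree closely
```

The agreement holds because the input uncertainties are small relative to the model's curvature; for strongly nonlinear responses the derivative-based estimate degrades, which is why the report benchmarks DUA against full statistical sampling.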

  13. Uncertainty in biodiversity science, policy and management: a conceptual overview

    Directory of Open Access Journals (Sweden)

    Yrjö Haila

    2014-10-01

    The protection of biodiversity is a complex societal, political and ultimately practical imperative of current global society. The imperative builds upon scientific knowledge of human dependence on the life-support systems of the Earth. This paper aims at introducing the main types of uncertainty inherent in biodiversity science, policy and management, as an introduction to a companion paper summarizing practical experiences of scientists and scholars (Haila et al. 2014). Uncertainty is a cluster concept: the actual nature of uncertainty is inherently context-bound. We use semantic space as a conceptual device to identify key dimensions of uncertainty in the context of biodiversity protection; these relate to (i) data; (ii) proxies; (iii) concepts; (iv) policy and management; and (v) normative goals. Semantic space offers an analytic perspective for drawing critical distinctions between types of uncertainty, identifying fruitful resonances that help to cope with the uncertainties, and building up collaboration between different specialists to support mutual social learning.

  14. Parameter Uncertainty on AGCM-simulated Tropical Cyclones

    Science.gov (United States)

    He, F.

    2015-12-01

    This work studies parameter uncertainty in tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case, illustrated here with the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent the convection, turbulence, precipitation and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance for TC simulations and identifies the key parameters for six different TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP) and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin hypercube sampling (LHS) method. The comparison between the OAT ensemble run and the LHS ensemble run shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated tropical cyclone characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effect. Last, we find that the intensity uncertainty caused by physical parameters is comparable in degree to the uncertainty caused by model structure (e.g. grid) and initial conditions (e.g. sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.
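
    A minimal Latin hypercube sampler of the kind used for the 8-parameter perturbation can be sketched as follows: one stratified draw per equal-probability bin in every dimension, with the bin orders randomly paired across dimensions. The dimensions stand in for the 8 perturbed physics parameters, and the unit ranges are illustrative (real parameters would be rescaled to their physical bounds).

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n_samples, n_dims, rng):
    # One jittered sample per equal-probability bin in each dimension,
    # with bins shuffled independently across dimensions.
    u = rng.uniform(size=(n_samples, n_dims))       # jitter inside bins
    samples = np.empty((n_samples, n_dims))
    for d in range(n_dims):
        perm = rng.permutation(n_samples)           # shuffle bin order
        samples[:, d] = (perm + u[:, d]) / n_samples
    return samples

lhs = latin_hypercube(16, 8, rng)

# Every dimension gets exactly one sample per bin [k/16, (k+1)/16).
bins = np.floor(lhs * 16).astype(int)
print(all(sorted(bins[:, d].tolist()) == list(range(16)) for d in range(8)))
```

This stratification is what lets a small LHS ensemble cover an 8-dimensional parameter space far more evenly than the same number of plain random draws.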

  15. Application of intelligence based uncertainty analysis for HLW disposal

    International Nuclear Information System (INIS)

    Kato, Kazuyuki

    2003-01-01

    Safety assessment for geological disposal of high-level radioactive waste inevitably involves factors that cannot be specified in a deterministic manner. These are namely: (1) 'variability', which arises from the stochastic nature of the processes and features considered, e.g., the distribution of canister corrosion times and the spatial heterogeneity of a host geological formation; (2) 'ignorance', due to incomplete or imprecise knowledge of the processes and conditions expected in the future, e.g., uncertainty in the estimation of solubilities and sorption coefficients for important nuclides. In many cases, a decision in assessment, e.g., selection among model options or determination of a parameter value, is subject to both variability and ignorance in a combined form. It is clearly important to evaluate the influences of both variability and ignorance on the result of a safety assessment in a consistent manner. We developed a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to safety assessment of geological disposal of high-level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with experts, while variability was formulated by means of probability density functions (pdfs) based on available data sets. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on expert opinion and in providing information on the dependence of the assessment result on the level of conservatism. In addition, it was also shown that sensitivity analysis could identify key parameters in reducing uncertainties associated with the overall assessment. The above information can be used to support the judgment process and guide the process of disposal system development in optimization of protection against

  16. Uncertainty in project phases: A framework for organisational change management

    DEFF Research Database (Denmark)

    Kreye, Melanie; Balangalibun, Sarah

    2015-01-01

    Uncertainty is an integral challenge when managing organisational change projects (OCPs). Current literature highlights the importance of uncertainty; however, it falls short of giving insights into the nature of uncertainty and suggestions for managing it. Specifically, no insights exist on how uncertainty develops over the different phases of OCPs. This paper presents case-based evidence on different sources of uncertainty in OCPs and how these develop over the different project phases. The results showed some surprising findings, as the majority of the uncertainty did not manifest itself in the early stage of the change project but was delayed until later phases. Furthermore, the sources of uncertainty were found to be predominantly within the organisation that initiated the change project and connected to the project scope. Based on these findings, propositions for future research are defined.

  17. Importance measures

    International Nuclear Information System (INIS)

    Gomez Cobo, A.

    1997-01-01

    The presentation discusses the following: general concepts of importance measures; an example fault tree used to illustrate the importance measures; Birnbaum's structural importance; criticality importance; Fussell-Vesely importance; the upgrading function; risk achievement worth; and risk reduction worth
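
    For a toy fault tree (not the example from the presentation), all of the listed measures can be computed directly from the top-event probability as a function of the basic-event probabilities; the event probabilities below are invented for illustration.

```python
def p_top(pA, pB, pC):
    """Top event = A OR (B AND C), with independent basic events."""
    return 1 - (1 - pA) * (1 - pB * pC)

pA, pB, pC = 0.01, 0.05, 0.10
base = p_top(pA, pB, pC)

# Birnbaum's importance of A: sensitivity dP(top)/dpA
birnbaum_A = p_top(1, pB, pC) - p_top(0, pB, pC)
# Criticality importance: Birnbaum rescaled by pA / P(top)
criticality_A = birnbaum_A * pA / base
# Fussell-Vesely: fraction of risk from cut sets containing A (only {A} here)
fv_A = pA / base
# Risk achievement worth: risk ratio with A assumed failed
raw_A = p_top(1, pB, pC) / base
# Risk reduction worth: risk ratio with A assumed perfectly reliable
rrw_A = base / p_top(0, pB, pC)
```

    Because A appears as a single-event cut set, its RAW is large (failing A almost guarantees the top event), while its RRW shows how much risk a perfect A would remove.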

  18. The characterisation and evaluation of uncertainty in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Parry, G.W.; Winter, P.W.

    1980-10-01

    The sources of uncertainty in probabilistic risk analysis are discussed using the event/fault tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable, using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events and a short review is given with some discussion on the representation of ignorance. (author)

  19. Quantification of uncertainties in source term estimates for a BWR with Mark I containment

    International Nuclear Information System (INIS)

    Khatib-Rahbar, M.; Cazzoli, E.; Davis, R.; Ishigami, T.; Lee, M.; Nourbakhsh, H.; Schmidt, E.; Unwin, S.

    1988-01-01

    A methodology for quantification and uncertainty analysis of source terms for severe accidents in light water reactors (QUASAR) has been developed. The objectives of the QUASAR program are (1) to develop a framework for performing an uncertainty evaluation of the input parameters of the phenomenological models used in the Source Term Code Package (STCP), and (2) to quantify the uncertainties in certain phenomenological aspects of source terms (those not modeled by STCP) using state-of-the-art methods. The QUASAR methodology consists of (1) screening sensitivity analysis, where the most sensitive input variables are selected for detailed uncertainty analysis, (2) uncertainty analysis, where probability density functions (PDFs) are established for the parameters identified by the screening stage and propagated through the codes to obtain PDFs for the outputs (i.e., release fractions to the environment), and (3) distribution sensitivity analysis, which is performed to determine the sensitivity of the output PDFs to the input PDFs. In this paper attention is limited to a single accident progression sequence, namely a station blackout accident in a BWR with a Mark I containment building. Identified as an important accident in draft NUREG-1150, a station blackout involves loss of both off-site power and DC power, resulting in failure of the diesels to start and in the unavailability of the high-pressure injection and core isolation cooling systems
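
    The second QUASAR stage — assign PDFs to screened-in inputs, propagate them, and summarise the output PDF — can be sketched with Monte Carlo sampling. The lognormal parameters and the stand-in release-fraction relation below are invented for illustration and are not taken from STCP.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# PDFs for two hypothetical screened-in inputs (illustrative lognormals)
aerosol_decontamination = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)
in_vessel_release = rng.lognormal(mean=np.log(0.3), sigma=0.4, size=n)

# Toy release-fraction relation standing in for the real code
release_fraction = np.clip(in_vessel_release / aerosol_decontamination, 0.0, 1.0)

# Output PDF summarised by percentiles of the propagated sample
p5, p50, p95 = np.percentile(release_fraction, [5, 50, 95])
```

    Repeating the propagation with perturbed input PDFs gives the third stage, distribution sensitivity analysis: the shift in (p5, p50, p95) measures how strongly the output PDF depends on each input PDF.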

  20. A review on the CIRCE methodology to quantify the uncertainty of the physical models of a code

    International Nuclear Information System (INIS)

    Jeon, Seong Su; Hong, Soon Joon; Bang, Young Seok

    2012-01-01

    In the field of nuclear engineering, recent regulatory audit calculations of the large break loss of coolant accident (LBLOCA) have been performed with best estimate codes such as MARS, RELAP5 and CATHARE. Since a credible regulatory audit calculation is very important in the evaluation of the safety of a nuclear power plant (NPP), there has been much research to develop rules and methodologies for the use of best estimate codes. One of the major points is to develop the best estimate plus uncertainty (BEPU) method for uncertainty analysis. As a representative BEPU method, the NRC proposed the CSAU (Code Scaling, Applicability and Uncertainty) methodology, which clearly identifies the different steps necessary for an uncertainty analysis. The general idea is 1) to determine all the sources of uncertainty in the code, also called basic uncertainties, 2) to quantify them, and 3) to combine them in order to obtain the final uncertainty for the studied application. Using an uncertainty analysis such as the CSAU methodology, an uncertainty band for the code response (calculation result) that is important from the safety point of view is calculated, and the safety margin of the NPP is quantified. An example of such a response is the peak cladding temperature (PCT) for a LBLOCA. However, there is a problem in uncertainty analysis with best estimate codes. Generally, it is very difficult to determine the uncertainties due to the empiricism of closure laws (also called correlations or constitutive relationships). So far the only proposed approach has been based on expert judgment. In that case, the uncertainty ranges of important parameters can be wide and inaccurate, so that the confidence level of the BEPU calculation results is reduced. In order to solve this problem, CEA (France) recently proposed a statistical method of data analysis, called CIRCE. The CIRCE method is intended to quantify the uncertainties of the correlations of a code. It may replace the expert judgment
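
    A standard ingredient of BEPU analyses in this tradition (implicit in CSAU-style methodologies, though not spelled out in this abstract) is Wilks' order-statistics formula for choosing the number of code runs needed so that a sample extreme bounds, say, the 95th percentile of the PCT with 95% confidence. A sketch:

```python
from math import comb

def wilks_n(coverage=0.95, confidence=0.95, order=1):
    """Smallest N such that the `order`-th largest of N code runs bounds
    the `coverage` quantile of the output with probability >= `confidence`."""
    n = order
    while True:
        # P(at least `order` of n runs exceed the coverage quantile)
        conf = 1 - sum(comb(n, k) * (1 - coverage)**k * coverage**(n - k)
                       for k in range(order))
        if conf >= confidence:
            return n
        n += 1

n_first = wilks_n(order=1)   # classic first-order 95/95 result: 59 runs
n_second = wilks_n(order=2)  # second-order 95/95 result: 93 runs
```

    The attraction for code uncertainty analysis is that the required run count is independent of how many uncertain parameters (or CIRCE-quantified correlations) are sampled.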

  1. Noodles: a tool for visualization of numerical weather model ensemble uncertainty.

    Science.gov (United States)

    Sanyal, Jibonananda; Zhang, Song; Dyer, Jamie; Mercer, Andrew; Amburn, Philip; Moorhead, Robert J

    2010-01-01

    Numerical weather prediction ensembles are routinely used for operational weather forecasting. The members of these ensembles are individual simulations with either slightly perturbed initial conditions or different model parameterizations, or occasionally both. Multi-member ensemble output is usually large, multivariate, and challenging to interpret interactively. Forecast meteorologists are interested in understanding the uncertainties associated with numerical weather prediction, specifically the variability between ensemble members. Currently, visualization of ensemble members is mostly accomplished through spaghetti plots of a single mid-troposphere pressure surface height contour. In order to explore new uncertainty visualization methods, the Weather Research and Forecasting (WRF) model was used to create a 48-hour, 18-member parameterization ensemble of the 13 March 1993 "Superstorm". A tool was designed to interactively explore the ensemble uncertainty of three important weather variables: water-vapor mixing ratio, perturbation potential temperature, and perturbation pressure. Uncertainty was quantified using individual ensemble member standard deviation, inter-quartile range, and the width of the 95% confidence interval. Bootstrapping was employed to overcome the dependence on normality in the uncertainty metrics. A coordinated view of ribbon- and glyph-based uncertainty visualization, spaghetti plots, iso-pressure colormaps, and data transect plots was provided to two meteorologists for expert evaluation. They found it useful in assessing uncertainty in the data, especially in finding outliers in the ensemble run and therefore avoiding the WRF parameterizations that lead to these outliers. Additionally, the meteorologists could identify spatial regions where the uncertainty was significantly high, allowing for identification of poorly simulated storm environments and physical interpretation of these model issues.
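
    The three uncertainty metrics and the bootstrap step can be sketched for a single grid point. The synthetic 18-member sample below merely mimics the ensemble size reported above; the values are not WRF output.

```python
import numpy as np

rng = np.random.default_rng(7)
members = rng.normal(285.0, 1.2, size=18)  # stand-in: one variable, one grid cell

# The three plain metrics used to quantify ensemble spread
std_dev = members.std(ddof=1)
q1, q3 = np.percentile(members, [25, 75])
iqr = q3 - q1

def bootstrap_ci_width(x, n_boot=5000, alpha=0.05, seed=0):
    """Width of the (1 - alpha) bootstrap CI of the mean; resampling avoids
    the normality assumption the parametric interval would require."""
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, x.size, size=(n_boot, x.size))
    means = x[idx].mean(axis=1)
    lo, hi = np.quantile(means, [alpha / 2, 1 - alpha / 2])
    return hi - lo

ci_width = bootstrap_ci_width(members)
```

    Applying this per grid cell yields the scalar uncertainty fields that ribbon widths and glyph sizes can then encode.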

  2. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  3. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  4. Uncertainties in risk assessment and decision making

    International Nuclear Information System (INIS)

    Starzec, Peter; Purucker, Tom; Stewart, Robert

    2008-02-01

    The general concept for risk assessment in accordance with the Swedish model for contaminated soil implies that the toxicological reference value for a given receptor is first back-calculated to a corresponding concentration of a compound in soil and (if applicable) then modified with respect to e.g. background levels, acute toxicity, and a factor of safety. This results in a guideline value that is subsequently compared to the observed concentration levels. Many sources of uncertainty exist when assessing whether the risk for a receptor is significant or not. In this study, the uncertainty aspects have been addressed from three standpoints: 1. Uncertainty in the comparison between the level of contamination (source) and a given risk criterion (e.g. a guideline value), and possible implications for subsequent decisions. This type of uncertainty is considered to be most important in situations where a contaminant is expected to be spatially heterogeneous without any tendency to form isolated clusters (hotspots) that can be easily delineated, i.e. where mean values are appropriate to compare to the risk criterion. 2. Uncertainty in the spatial distribution of a contaminant. Spatial uncertainty should be accounted for when hotspots are to be delineated and the volume of soil contaminated with levels above a stated decision criterion has to be quantified. 3. Uncertainty in an ecological exposure model with regard to the moving pattern of a receptor in relation to the spatial distribution of the contaminant in question. The study points out that the choice of methodology to characterize the relation between contaminant concentration and a pre-defined risk criterion is governed by a conceptual perception of the contaminant's spatial distribution and also depends on the structure of the collected data (observations). How uncertainty in the transition from contaminant concentration into risk criterion can be quantified was demonstrated by applying hypothesis tests and the concept of
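
    Standpoint 1 — comparing a site mean to a guideline value under sampling uncertainty — can be sketched with a one-sided confidence bound on the mean. The concentrations and the guideline below are invented; the constant 1.833 is the t critical value for 9 degrees of freedom at the one-sided 95% level.

```python
import math

conc = [62.0, 71.0, 55.0, 90.0, 68.0, 74.0, 83.0, 60.0, 77.0, 66.0]  # mg/kg
guideline = 80.0  # hypothetical guideline value, mg/kg

n = len(conc)
mean = sum(conc) / n
s = math.sqrt(sum((x - mean) ** 2 for x in conc) / (n - 1))  # sample std dev

# One-sided 95% upper confidence limit (UCL) of the mean
t_crit = 1.833  # t quantile, df = 9, one-sided 95%
ucl95 = mean + t_crit * s / math.sqrt(n)

# Decision rule: declare the risk insignificant only if even the UCL of the
# mean stays below the guideline value
exceeds = ucl95 >= guideline
```

    Comparing the UCL rather than the raw mean to the guideline makes the decision robust to sampling variability, which is exactly the comparison uncertainty the first standpoint addresses.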

  5. Understanding the origin of Paris Agreement emission uncertainties.

    Science.gov (United States)

    Rogelj, Joeri; Fricko, Oliver; Meinshausen, Malte; Krey, Volker; Zilliacus, Johanna J J; Riahi, Keywan

    2017-06-06

    The UN Paris Agreement puts in place a legally binding mechanism to increase mitigation action over time. Countries put forward pledges called nationally determined contributions (NDC) whose impact is assessed in global stocktaking exercises. Subsequently, actions can then be strengthened in light of the Paris climate objective: limiting global mean temperature increase to well below 2 °C and pursuing efforts to limit it further to 1.5 °C. However, pledged actions are currently described ambiguously and this complicates the global stocktaking exercise. Here, we systematically explore possible interpretations of NDC assumptions, and show that this results in estimated emissions for 2030 ranging from 47 to 63 GtCO2e yr-1. We show that this uncertainty has critical implications for the feasibility and cost to limit warming well below 2 °C and further to 1.5 °C. Countries are currently working towards clarifying the modalities of future NDCs. We identify salient avenues to reduce the overall uncertainty by about 10 percentage points through simple, technical clarifications regarding energy accounting rules. Remaining uncertainties depend to a large extent on politically valid choices about how NDCs are expressed, and therefore raise the importance of a thorough and robust process that keeps track of where emissions are heading over time.

  6. Integrating uncertainty into public energy research and development decisions

    Science.gov (United States)

    Anadón, Laura Díaz; Baker, Erin; Bosetti, Valentina

    2017-05-01

    Public energy research and development (R&D) is recognized as a key policy tool for transforming the world's energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain.

  7. Understanding the origin of Paris Agreement emission uncertainties

    Science.gov (United States)

    Rogelj, Joeri; Fricko, Oliver; Meinshausen, Malte; Krey, Volker; Zilliacus, Johanna J. J.; Riahi, Keywan

    2017-06-01

    The UN Paris Agreement puts in place a legally binding mechanism to increase mitigation action over time. Countries put forward pledges called nationally determined contributions (NDC) whose impact is assessed in global stocktaking exercises. Subsequently, actions can then be strengthened in light of the Paris climate objective: limiting global mean temperature increase to well below 2 °C and pursuing efforts to limit it further to 1.5 °C. However, pledged actions are currently described ambiguously and this complicates the global stocktaking exercise. Here, we systematically explore possible interpretations of NDC assumptions, and show that this results in estimated emissions for 2030 ranging from 47 to 63 GtCO2e yr-1. We show that this uncertainty has critical implications for the feasibility and cost to limit warming well below 2 °C and further to 1.5 °C. Countries are currently working towards clarifying the modalities of future NDCs. We identify salient avenues to reduce the overall uncertainty by about 10 percentage points through simple, technical clarifications regarding energy accounting rules. Remaining uncertainties depend to a large extent on politically valid choices about how NDCs are expressed, and therefore raise the importance of a thorough and robust process that keeps track of where emissions are heading over time.

  8. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    NARCIS (Netherlands)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if

  9. Calorimetric and reactor coolant system flow uncertainty

    International Nuclear Information System (INIS)

    Bates, L.; McLean, T.

    1991-01-01

    This paper describes a methodology for the quantification of errors associated with the determination of feedwater flow, secondary power, and Reactor Coolant System (RCS) flow used at the Trojan Nuclear Plant to ensure compliance with regulatory requirements. The sources of error in plant indications and process measurement are identified and tracked, using examples, through the mathematical processes necessary to calculate the uncertainty in the RCS flow measurement. An error of approximately 1.4 percent is calculated for secondary power. This error, combined with other error contributions, results in an uncertainty of approximately 3 percent in the RCS flow determination
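
    The way independent instrument errors combine into the quoted figures can be sketched with root-sum-square (quadrature) propagation. The individual uncertainties below are illustrative stand-ins chosen to land near the quoted 1.4 and 3 percent values; they are not Trojan's actual error budget.

```python
import math

# Hypothetical fractional (1-sigma) instrument uncertainties feeding the
# secondary calorimetric power calculation (illustrative values only)
u_feedwater_flow = 0.012  # venturi-based feedwater flow
u_feedwater_temp = 0.004  # feedwater temperature -> enthalpy
u_steam_pressure = 0.005  # steam pressure -> enthalpy

# Independent error sources combine in quadrature (root-sum-square)
u_power = math.sqrt(u_feedwater_flow**2 + u_feedwater_temp**2
                    + u_steam_pressure**2)

# RCS flow inferred from the power balance picks up further terms,
# e.g. the hot-leg/cold-leg temperature difference
u_delta_T = 0.025
u_rcs_flow = math.sqrt(u_power**2 + u_delta_T**2)
```

    Quadrature combination is appropriate only when the error sources are independent; correlated contributions must instead be summed before squaring.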

  10. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    International Nuclear Information System (INIS)

    Kirchner, G.; Peterson, R.

    1996-11-01

    variation between the best estimate predictions of the group. The assumptions of the users result in more uncertainty in the predictions (taking into account the 95% confidence intervals) than is shown by the confidence interval on the predictions of one user. Mistakes, being examples of incorrect user assumptions, cannot be ignored and must be accepted as contributing to the variability seen in the spread of predictions. The user's confidence in his/her understanding of a scenario description and/or confidence in working with a code does not necessarily mean that the predictions will be more accurate. Choice of parameter values contributed most to user-induced uncertainty followed by scenario interpretation. The contribution due to code implementation was low, but may have been limited due to the decision of the majority of the group not to submit predictions using the most complex of the three codes. Most modelers had difficulty adapting the models for certain expected output. Parameter values for wet and dry deposition, transfer from forage to milk and concentration ratios were mostly taken from the extensive database of Chernobyl fallout radionuclides, no matter what the scenario. Examples provided in the code manuals may influence code users considerably when preparing their own input files. A major problem concerns pasture concentrations given in fresh or dry weight: parameter values in codes have to be based on one or the other and the request for predictions in the scenario description may or may not be the same unit. This is a surprisingly common source of error. Most of the predictions showed order of magnitude discrepancies when best estimates are compared with the observations, although the participants had a highly professional background in radioecology and a good understanding of the importance of the processes modelled. When uncertainties are considered, however, mostly there was overlap between predictions and observations. 
A failure to reproduce the time

  11. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Kirchner, G. [Univ. of Bremen (Germany)]; Peterson, R. [AECL, Chalk River, ON (Canada)] [and others]

    1996-11-01

    variation between the best estimate predictions of the group. The assumptions of the users result in more uncertainty in the predictions (taking into account the 95% confidence intervals) than is shown by the confidence interval on the predictions of one user. Mistakes, being examples of incorrect user assumptions, cannot be ignored and must be accepted as contributing to the variability seen in the spread of predictions. The user's confidence in his/her understanding of a scenario description and/or confidence in working with a code does not necessarily mean that the predictions will be more accurate. Choice of parameter values contributed most to user-induced uncertainty followed by scenario interpretation. The contribution due to code implementation was low, but may have been limited due to the decision of the majority of the group not to submit predictions using the most complex of the three codes. Most modelers had difficulty adapting the models for certain expected output. Parameter values for wet and dry deposition, transfer from forage to milk and concentration ratios were mostly taken from the extensive database of Chernobyl fallout radionuclides, no matter what the scenario. Examples provided in the code manuals may influence code users considerably when preparing their own input files. A major problem concerns pasture concentrations given in fresh or dry weight: parameter values in codes have to be based on one or the other and the request for predictions in the scenario description may or may not be the same unit. This is a surprisingly common source of error. Most of the predictions showed order of magnitude discrepancies when best estimates are compared with the observations, although the participants had a highly professional background in radioecology and a good understanding of the importance of the processes modelled. When uncertainties are considered, however, mostly there was overlap between predictions and observations. 
A failure to reproduce the

  12. A putative biomarker signature for clinically effective AKT inhibition: correlation of in vitro, in vivo and clinical data identifies the importance of modulation of the mTORC1 pathway.

    Science.gov (United States)

    Cheraghchi-Bashi, Azadeh; Parker, Christine A; Curry, Ed; Salazar, Jean-Frederic; Gungor, Hatice; Saleem, Azeem; Cunnea, Paula; Rama, Nona; Salinas, Cristian; Mills, Gordon B; Morris, Shannon R; Kumar, Rakesh; Gabra, Hani; Stronach, Euan A

    2015-12-08

    Our identification of dysregulation of the AKT pathway in ovarian cancer as a platinum resistance specific event led to a comprehensive analysis of in vitro, in vivo and clinical behaviour of the AKT inhibitor GSK2141795. Proteomic biomarker signatures correlating with effects of GSK2141795 were developed using in vitro and in vivo models, well characterised for related molecular, phenotypic and imaging endpoints. Signatures were validated in temporally paired biopsies from patients treated with GSK2141795 in a clinical study. GSK2141795 caused growth-arrest as single agent in vitro, enhanced cisplatin-induced apoptosis in vitro and reduced tumour volume in combination with platinum in vivo. GSK2141795 treatment in vitro and in vivo resulted in ~50-90% decrease in phospho-PRAS40 and 20-80% decrease in fluoro-deoxyglucose (FDG) uptake. Proteomic analysis of GSK2141795 in vitro and in vivo identified a signature of pathway inhibition including changes in AKT and p38 phosphorylation and total Bim, IGF1R, AR and YB1 levels. In patient biopsies, prior to treatment with GSK2141795 in a phase 1 clinical trial, this signature was predictive of post-treatment changes in the response marker CA125. Development of this signature represents an opportunity to demonstrate the clinical importance of AKT inhibition for re-sensitisation of platinum resistant ovarian cancer to platinum.

  13. Bipolar disorder: The importance of clinical assessment in identifying prognostic factors - An Audit. Part 3: A comparison between Italian and English mental health services and a survey of bipolar disorder.

    Science.gov (United States)

    Verdolini, Norma; Dean, Jonathon; Massucci, Giampaolo; Elisei, Sandro; Quartesan, Roberto; Zaman, Rashid; Agius, Mark

    2014-11-01

    Most of the prognostic factors of bipolar disorder, which determine disease course and outcome, could be detected from simple but often-unrecorded questions asked during the psychiatric clinic assessments. In previous parts of this research, we analysed various prognostic factors and focused on mixed states and rapid cycling subsets. We now compare our sample in England with a small sample from Italy to demonstrate the utility of focused prognostic questioning and of international comparison. We collected data from the clinical notes of 70 English bipolar and 8 Italian bipolar outpatients seen at the initial psychiatric assessment clinic about socio-demographic and clinical factors to determine whether various factors had relevance to prevalence, prognosis, or outcome. The sample comprised 16 bipolar I (22.9%) and 54 bipolar II (77.1%) English outpatients and 7 bipolar I (87.5%) and 1 bipolar II (12.5%) Italian outpatients. Differences between the groups are seen mainly in terms of age of onset, duration of both depressive and hypomanic episodes, presence of psychiatric family history, incidence of mixed state features and rapid cycling, presence of elated mood in response to past antidepressant treatment, and misuse of illicit drugs and alcohol. In order to promote improved mental health primary care, mental health systems in all countries should develop standardized epidemiological tools that are shared between countries. We recommend the use of a questionnaire that reminds clinicians of potentially prognostic information and suggest that this might identify important components of a potential standardized diagnostic and prognostic tool.

  14. Investment and uncertainty

    DEFF Research Database (Denmark)

    Greasley, David; Madsen, Jakob B.

    2006-01-01

    A severe collapse of fixed capital formation distinguished the onset of the Great Depression from other investment downturns between the world wars. Using a model estimated for the years 1890-2000, we show that the expected profitability of capital measured by Tobin's q, and the uncertainty surrounding expected profits indicated by share price volatility, were the chief influences on investment levels, and that heightened share price volatility played the dominant role in the crucial investment collapse in 1930. Investment did not simply follow the downward course of income at the onset...

  15. Optimization under Uncertainty

    KAUST Repository

    Lopez, Rafael H.

    2016-01-06

    The goal of this poster is to present the main approaches to the optimization of engineering systems in the presence of uncertainties. We begin by giving an insight into robust optimization. Next, we detail how to deal with probabilistic constraints in optimization, the so-called reliability-based design. Subsequently, we present the risk optimization approach, which includes the expected costs of failure in the objective function. After the basic description of each approach is given, the projects developed by CORE are presented. Finally, the main current research topic of CORE is described.
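
    Two of the approaches named above — reliability-based design and risk optimization — can be contrasted on a one-variable toy problem. The load and capacity distributions, the cost model, and the 10^-2 failure-probability target are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
load = rng.normal(10.0, 2.0, 100_000)  # uncertain load (made-up numbers)

def pf(d):
    """Monte Carlo failure probability: capacity 5*d with 10% scatter."""
    capacity = rng.normal(5.0 * d, 0.5 * d, load.size)
    return float(np.mean(load > capacity))

def total_cost(d, pf_d, failure_cost=1000.0):
    """Risk-optimization objective: design cost plus expected failure cost."""
    return d + failure_cost * pf_d

candidates = np.linspace(2.0, 4.0, 21)
pfs = np.array([pf(d) for d in candidates])

# Reliability-based design: cheapest design meeting the target probability
rbdo = float(candidates[pfs <= 1e-2][0])
# Risk optimization: minimise expected total cost instead of fixing a target
risk_opt = float(candidates[np.argmin([total_cost(d, p)
                                       for d, p in zip(candidates, pfs)])])
```

    With a large failure cost the risk-optimal design is more conservative than the reliability-based one; the two approaches coincide only when the implied shadow price of failure matches the chosen target.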

  16. Optimizing production under uncertainty

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses potential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept of state-contingent production functions and a definition of inputs including both sort of input, activity and allocation technology. It also analyses production decisions where production is combined with trading in state-contingent claims such as insurance contracts. The final part discusses...

  17. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus

  18. Mathematical Analysis of Uncertainty

    Directory of Open Access Journals (Sweden)

    Angel GARRIDO

    2016-01-01

    Classical Logic showed its insufficiencies for solving AI problems early on; the introduction of Fuzzy Logic aims at this problem. There has been research in the conventional Rough direction alone and in the Fuzzy direction alone, and, more recently, attempts to combine both into Fuzzy Rough Sets or Rough Fuzzy Sets. We analyse some new and powerful tools in the study of uncertainty, such as Probabilistic Graphical Models, Chain Graphs, Bayesian Networks, and Markov Networks, integrating our knowledge of graphs and probability.
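
    One of the tools named above, Bayesian networks, can be illustrated with the textbook rain/sprinkler/wet-grass model; the conditional probability tables below are our own toy numbers, and the query is answered by plain enumeration.

```python
# Minimal Bayesian network: Rain -> Sprinkler, and (Sprinkler, Rain) -> WetGrass
p_rain = 0.2
p_sprinkler = {True: 0.01, False: 0.4}             # P(S=T | R)
p_wet = {(True, True): 0.99, (True, False): 0.90,  # P(W=T | S, R)
         (False, True): 0.80, (False, False): 0.0}

def joint(r, s, w):
    """Joint probability P(R=r, S=s, W=w) from the factored network."""
    pr = p_rain if r else 1 - p_rain
    ps = p_sprinkler[r] if s else 1 - p_sprinkler[r]
    pw = p_wet[(s, r)] if w else 1 - p_wet[(s, r)]
    return pr * ps * pw

# P(Rain | WetGrass) by enumerating over the hidden Sprinkler variable
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
p_rain_given_wet = num / den
```

    Enumeration is exponential in the number of hidden variables, which is why larger networks use variable elimination or sampling instead; the factorisation over the graph is what makes those algorithms possible.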

  19. A model of designing as the intersection between uncertainty perception, information processing, and coevolution

    DEFF Research Database (Denmark)

    Lasso, Sarah Venturim; Cash, Philip; Daalhuizen, Jaap

    2016-01-01

    ... takes the first steps towards linking these disparate perspectives in a model of designing that synthesises coevolution and information processing. How designers act has been shown to play an important role in the process of New Product Development (NPD) (see e.g. Badke-Schaub and Frankenberger, 2012) ... the designer's perceived uncertainty is the motivation to start a process of collecting, exchanging, and integrating knowledge. This has been formalised in Information-Processing Theory and more generally described by authors such as Aurisicchio et al. (2013), who describe design as an information transformation process. Here the aim of the activity is to reduce the perceived uncertainty through identifying and integrating external information and knowledge within the design team. For example, when perceiving uncertainty the designer might seek new information online, process this information, and share...

  20. Assessing flood forecast uncertainty with fuzzy arithmetic

    Directory of Open Access Journals (Sweden)

    de Bruyn Bertrand

    2016-01-01

Full Text Available Forecasts of flow rates and water levels during floods have to be accompanied by uncertainty estimates, and such forecasts have several sources of uncertainty. For hydrological (rainfall-runoff) forecasts performed using a deterministic hydrological model with basic physics, two main sources can be identified. The first, obvious source is the forcing data: rainfall forecasts are supplied in real time by meteorological forecasting services to the Flood Forecasting Service as a range between a lowest and a highest predicted value. These two values define an uncertainty interval for the rainfall over a given watershed. The second source of uncertainty is related to the complexity of the modelled system (the catchment impacted by the hydro-meteorological phenomenon), the number of variables that may describe the problem, and their spatial and temporal variability. The model simplifies the system by reducing the number of variables to a few parameters, and thus contains an intrinsic uncertainty. This model uncertainty is assessed by comparing simulated and observed flow rates for a large number of hydro-meteorological events. We propose a method based on fuzzy arithmetic to estimate the possible range of flow rates (and water levels) when making a forecast from the possible rainfalls provided by the forcing and from the model uncertainty, here expressed as a range of possible values. Both rainfall and model uncertainties are combined with fuzzy arithmetic, which allows the prediction uncertainty range to be evaluated. The Flood Forecasting Service of the Oise and Aisne rivers, in particular, monitors the upstream watershed of the Oise at Hirson. This watershed's area is 310 km2 and its response time is about 10 hours. Several hydrological models are calibrated for flood forecasting in this watershed and use the rainfall forecast. This method has the advantage of being easy to implement.
Moreover, it can be carried out
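The fuzzy-arithmetic combination of rainfall and model uncertainty described in this abstract can be sketched with alpha-cuts and interval arithmetic. All numbers below (rainfall bounds, model multiplier, runoff coefficient) are illustrative assumptions, not values from the Oise study.

```python
# Sketch: combining a fuzzy rainfall forecast and a fuzzy model-uncertainty
# factor via alpha-cuts and interval multiplication.

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def interval_mul(a, b):
    """Product of two intervals."""
    products = [x * y for x in a for y in b]
    return (min(products), max(products))

# Rainfall forecast as a triangular fuzzy number (mm): lowest/most likely/highest
rain = (10.0, 25.0, 40.0)
# Model uncertainty as a fuzzy multiplier on simulated discharge
model_factor = (0.8, 1.0, 1.3)
# Toy rainfall-runoff relation: discharge proportional to rainfall
runoff_coeff = 2.0

for alpha in (0.0, 0.5, 1.0):
    r = alpha_cut(rain, alpha)
    m = alpha_cut(model_factor, alpha)
    q = interval_mul((runoff_coeff * r[0], runoff_coeff * r[1]), m)
    print(f"alpha={alpha:.1f}: discharge in [{q[0]:.1f}, {q[1]:.1f}] m3/s")
```

At alpha = 1 the interval collapses to the most likely value; at alpha = 0 it spans the full combined uncertainty range.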

  1. Uncertainty in Reference and Information Service

    Science.gov (United States)

    VanScoy, Amy

    2015-01-01

    Introduction: Uncertainty is understood as an important component of the information seeking process, but it has not been explored as a component of reference and information service. Method: Interpretative phenomenological analysis was used to examine the practitioner perspective of reference and information service for eight academic research…

  2. Uncertainty estimates for theoretical atomic and molecular data

    International Nuclear Information System (INIS)

    Chung, H-K; Braams, B J; Bartschat, K; Császár, A G; Drake, G W F; Kirchner, T; Kokoouline, V; Tennyson, J

    2016-01-01

    Sources of uncertainty are reviewed for calculated atomic and molecular data that are important for plasma modeling: atomic and molecular structures and cross sections for electron-atom, electron-molecule, and heavy particle collisions. We concentrate on model uncertainties due to approximations to the fundamental many-body quantum mechanical equations and we aim to provide guidelines to estimate uncertainties as a routine part of computations of data for structure and scattering. (topical review)

  3. A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer, F.; Clifton, Andrew; Bonin, Timothy A.; Churchfield, Matthew J.

    2017-03-24

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards discuss uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device. However, real-world experience has shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we propose the development of a new lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from an operational wind farm to assess the ability of the framework to predict errors in lidar-measured wind speed.
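The line-of-sight estimation and wind-field reconstruction step mentioned above can be illustrated with a minimal least-squares sketch. The scan geometry, noise level, and wind vector below are assumptions for illustration, not the framework proposed in the presentation.

```python
# Sketch: reconstructing a 3D wind vector (u, v, w) from noisy line-of-sight
# (LOS) speeds measured along several lidar beams, showing how LOS noise
# propagates into the reconstructed wind.
import math
import numpy as np

rng = np.random.default_rng(1)
wind_true = np.array([8.0, 3.0, 0.2])        # u, v, w in m/s (assumed)
el = math.radians(62.0)                      # beam elevation angle (assumed)
az = np.radians([0.0, 72.0, 144.0, 216.0, 288.0])

# Each row maps (u, v, w) to the LOS speed along one beam
A = np.column_stack([np.sin(az) * math.cos(el),
                     np.cos(az) * math.cos(el),
                     np.full(az.shape, math.sin(el))])

# Simulated LOS speeds with 0.1 m/s measurement noise
v_los = A @ wind_true + rng.normal(0.0, 0.1, az.size)

# Least-squares reconstruction of the wind vector from the LOS speeds
wind_est, *_ = np.linalg.lstsq(A, v_los, rcond=None)
print(np.round(wind_est, 2))
```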

  4. The explicit treatment of model uncertainties in the presence of aleatory and epistemic parameter uncertainties in risk and reliability analysis

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

In the risk and reliability analysis of complex technological systems, the primary concern of formal uncertainty analysis is to understand why uncertainties arise, and to evaluate how they impact the results of the analysis. In recent times, many of the uncertainty analyses have focused on parameters of the risk and reliability analysis models, whose values are uncertain in an aleatory or an epistemic way. As the field of parametric uncertainty analysis matures, however, more attention is being paid to the explicit treatment of uncertainties that are addressed in the predictive model itself as well as the accuracy of the predictive model. The essential steps for evaluating impacts of these model uncertainties in the presence of parameter uncertainties are to determine rigorously the various sources of uncertainty to be addressed in the underlying model itself and in turn in the model parameters, based on our state of knowledge and relevant evidence. Answering clearly the question of how to characterize and treat explicitly the foregoing different sources of uncertainty is particularly important for practical aspects such as risk and reliability optimization of systems as well as more transparent risk information and decision-making under various uncertainties. The main purpose of this paper is to provide practical guidance for quantitatively treating various model uncertainties that would often be encountered in the risk and reliability modeling process of complex technological systems.

  5. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    Science.gov (United States)

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
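The Monte Carlo propagation and SRC-based sensitivity analysis described above can be sketched as follows. The toy growth-rate model G = kg * S**g and the parameter ranges are illustrative assumptions, not those of the KDP study.

```python
# Sketch: Monte Carlo uncertainty propagation plus standardized regression
# coefficients (SRC) for a toy crystal-growth model.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Uncertain inputs: growth-rate constant kg and growth order g (assumed ranges)
kg = rng.uniform(1.0, 3.0, n)
g = rng.uniform(1.0, 2.0, n)
S = 1.5  # fixed supersaturation ratio

# Monte Carlo propagation of input uncertainty through the model
G = kg * S ** g

# SRC: regress the standardized output on the standardized inputs
X = np.column_stack([(kg - kg.mean()) / kg.std(), (g - g.mean()) / g.std()])
y = (G - G.mean()) / G.std()
src, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["kg", "g"], np.round(src, 2))))
```

Because the inputs are independent, the squared SRCs approximate each parameter's share of the output variance; here the rate constant dominates.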

  6. Exploring Uncertainty Perception as a Driver of Design Activity

    DEFF Research Database (Denmark)

    Cash, Philip; Kreye, Melanie

    2018-01-01

    , and representation action. We bring together prior works on uncertainty perception in the design and management literatures to derive three contributions. First, we describe how uncertainty perception is associated with activity progression, linking all three core actions. Second, we identify characteristic patterns...

  7. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and its code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  8. Investment, regulation, and uncertainty

    Science.gov (United States)

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have been stacked as well. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist so that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence which ones might be expected to see an investment decline. PMID:24499745

  9. Probabilistic Mass Growth Uncertainties

    Science.gov (United States)

    Plumer, Eric; Elliott, Darren

    2013-01-01

Mass has been widely used as an input parameter for Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of the masses of space instruments and spacecraft, for both Earth-orbiting and deep space missions at various stages of a project's lifecycle. It also discusses the long-term strategy of NASA Headquarters of publishing similar results, using a variety of cost-driving metrics, on an annual basis. The paper provides quantitative results showing that mass growth uncertainties decrease as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  10. Proposed standardized definitions for vertical resolution and uncertainty in the NDACC lidar ozone and temperature algorithms - Part 3: Temperature uncertainty budget

    Science.gov (United States)

    Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Haefele, Alexander; Payen, Guillaume; Liberti, Gianluigi

    2016-08-01

    A standardized approach for the definition, propagation, and reporting of uncertainty in the temperature lidar data products contributing to the Network for the Detection for Atmospheric Composition Change (NDACC) database is proposed. One important aspect of the proposed approach is the ability to propagate all independent uncertainty components in parallel through the data processing chain. The individual uncertainty components are then combined together at the very last stage of processing to form the temperature combined standard uncertainty. The identified uncertainty sources comprise major components such as signal detection, saturation correction, background noise extraction, temperature tie-on at the top of the profile, and absorption by ozone if working in the visible spectrum, as well as other components such as molecular extinction, the acceleration of gravity, and the molecular mass of air, whose magnitudes depend on the instrument, data processing algorithm, and altitude range of interest. The expression of the individual uncertainty components and their step-by-step propagation through the temperature data processing chain are thoroughly estimated, taking into account the effect of vertical filtering and the merging of multiple channels. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which means that covariance terms must be taken into account when vertical filtering is applied and when temperature is integrated from the top of the profile. Quantitatively, the uncertainty budget is presented in a generic form (i.e., as a function of instrument performance and wavelength), so that any NDACC temperature lidar investigator can easily estimate the expected impact of individual uncertainty components in the case of their own instrument. 
Using this standardized approach, an example of uncertainty budget is provided for the Jet Propulsion Laboratory (JPL) lidar at Mauna Loa Observatory, Hawai'i, which is

  11. Uncertainty Regarding Waste Handling in Everyday Life

    Directory of Open Access Journals (Sweden)

    Susanne Ewert

    2010-09-01

Full Text Available According to our study, based on interviews with households in a residential area in Sweden, uncertainty is a cultural barrier to improved recycling. Four causes of uncertainty are identified. Firstly, professional categories not matching cultural categories—people easily discriminate between certain categories (e.g., materials such as plastic and paper) but not between others (e.g., packaging and “non-packaging”). Thus a frequent cause of uncertainty is that the basic categories of the waste recycling system do not coincide with the basic categories used in everyday life. Secondly, challenged habits—source separation in everyday life is habitual, but when a habit is challenged by a particular element or feature of the waste system, uncertainty can arise. Thirdly, lacking fractions—some kinds of items cannot be left for recycling, which makes waste collection incomplete from the user’s point of view and in turn lowers the credibility of the system. Fourthly, missing or contradictory rules of thumb—the above causes seem to be particularly relevant if no motivating principle or rule of thumb (within the context of use) is successfully conveyed to the user. This paper discusses how reducing uncertainty can improve recycling.

  12. Characterizing spatial uncertainty when integrating social data in conservation planning.

    Science.gov (United States)

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches. © 2014 Society for Conservation Biology.

  13. Decay heat uncertainty quantification of MYRRHA

    OpenAIRE

    Fiorito Luca; Buss Oliver; Hoefer Axel; Stankovskiy Alexey; Eynde Gert Van den

    2017-01-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay hea...

  14. Optical Model and Cross Section Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Herman,M.W.; Pigni, M.T.; Dietrich, F.S.; Oblozinsky, P.

    2009-10-05

Distinct minima and maxima in the neutron total cross section uncertainties were observed in model calculations using a spherical optical potential. We found this oscillating structure to be a general feature of quantum mechanical wave scattering. Specifically, we analyzed neutron interaction with 56Fe from 1 keV up to 65 MeV and investigated the physical origin of the minima. We discuss their potential importance for practical applications as well as the implications for the uncertainties in total and absorption cross sections.

  15. Capacity and Entry Deterrence under Demand Uncertainty

    DEFF Research Database (Denmark)

    Poddar, Sougata

I consider a two period model with an incumbent firm and a potential entrant, each of whom produces a homogeneous good. There is demand uncertainty: demand can be high or low, and it is realized in the second period. The question I ask is: how, by choosing capacity at an earlier period of actual production...... of output and, more importantly, not knowing which state of demand is going to realize, and knowing that there is a potential entrant, the incumbent firm can influence the outcome of the game by changing its initial condition. To that end, I study how the impact of the distribution of uncertainty deeply......

  16. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  17. Embracing uncertainty in applied ecology.

    Science.gov (United States)

    Milner-Gulland, E J; Shea, K

    2017-12-01

Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications. Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.
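The "using decision theory" recommendation in this abstract can be shown in miniature: pick the management action with the best expected outcome over uncertain states rather than ignoring the uncertainty. The states, probabilities, and payoffs below are illustrative, not from the paper.

```python
# Sketch: expected-value decision making under uncertainty about the
# state of a managed population (all numbers illustrative).
p_state = {"population_declining": 0.6, "population_stable": 0.4}

# Management payoff (e.g. expected population change) per action and state
payoff = {
    "intervene":  {"population_declining": 5.0, "population_stable": -1.0},
    "do_nothing": {"population_declining": -8.0, "population_stable": 2.0},
}

def expected_payoff(action):
    """Average the action's payoff over the uncertain states."""
    return sum(p * payoff[action][s] for s, p in p_state.items())

best = max(payoff, key=expected_payoff)
print(best, round(expected_payoff(best), 2))
```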

  18. Oil price uncertainty in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States); Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  19. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty.

    Science.gov (United States)

    Kobayashi, Kenji; Hsu, Ming

    2017-07-19

Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasi-optimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. Copyright © 2017 the authors.
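The paper's core distinction can be sketched with a minimal Bayesian model: a surprising signal should move beliefs only if it actually carries information (reducible uncertainty); under irreducible uncertainty, even a rare signal leaves beliefs unchanged. The two-state setup and probabilities below are illustrative, not the authors' task.

```python
# Sketch: Bayes' rule updates beliefs only when the signal is diagnostic.
from fractions import Fraction as F

def posterior(prior, likelihoods):
    """Bayes' rule for a discrete prior given per-state signal likelihoods."""
    joint = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(joint)
    return [j / z for j in joint]

prior = [F(1, 2), F(1, 2)]  # two hypotheses about the reward state

# Reducible case: the signal is diagnostic (80% reliable), so belief
# moves from 1/2 to 4/5 for the favored state
print(posterior(prior, [F(8, 10), F(2, 10)]))

# Irreducible case: the signal is pure noise (same likelihood under both
# states), so even a rare, surprising signal leaves the prior unchanged
print(posterior(prior, [F(1, 10), F(1, 10)]))
```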

  20. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
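Of the alternatives the abstract lists, interval analysis is the simplest to sketch: epistemic uncertainty is carried as bounds rather than a probability distribution, and a QMU-style margin is guaranteed only if its lower bound is positive. The capacity and load bounds below are illustrative, not from the presentation's notional examples.

```python
# Sketch: interval analysis for a QMU-style margin = capacity - load.

def interval_sub(a, b):
    """[a] - [b] with interval arithmetic (widest possible difference)."""
    return (a[0] - b[1], a[1] - b[0])

capacity = (95.0, 110.0)   # epistemic bounds on system capacity (assumed)
load = (60.0, 80.0)        # epistemic bounds on imposed load (assumed)

margin = interval_sub(capacity, load)
# The margin is guaranteed positive only if its lower bound is positive
print(f"margin in [{margin[0]:.0f}, {margin[1]:.0f}]; guaranteed: {margin[0] > 0}")
```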

  1. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the
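The review's point about error correlation across scales can be made concrete with the standard formula for the uncertainty of a mean of N measurements with pairwise error correlation rho: u_mean = sigma * sqrt((1 + (N - 1) * rho) / N), which tends to sigma * sqrt(rho) rather than zero as N grows. The numbers below are illustrative.

```python
# Sketch: correlated error components do not average down when many
# satellite measurements are aggregated over large space/time scales.
import math

def u_mean(sigma, n, rho):
    """Standard uncertainty of the mean of n equally correlated measurements."""
    return sigma * math.sqrt((1 + (n - 1) * rho) / n)

sigma = 0.5  # illustrative per-datum uncertainty
for n in (1, 100, 10000):
    print(n, round(u_mean(sigma, n, 0.0), 4), round(u_mean(sigma, n, 0.1), 4))
```

With rho = 0 the uncertainty shrinks as 1/sqrt(n); with rho = 0.1 it plateaus near sigma * sqrt(0.1), which is why such effects can dominate a CDR even when negligible per datum.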

  2. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    Science.gov (United States)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a further boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, shaped mainly by the growing public demand for predictions of how water resources management or flood protection should change in the near future. The ``standard'' workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and to GCM model uncertainty, which becomes obvious at resolutions finer than continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling adds uncertainty through the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally, the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. An emerging consensus among many studies concerns the relative importance of the different uncertainty sources: the prevailing perception is that GCM uncertainty dominates hydrological impact studies. Only a few studies have found that the predictive uncertainty of hydrological models can be in the same range as, or even larger than, the climatic uncertainty. We carried out a

  3. Uncertainties in projecting climate-change impacts in marine ecosystems

    DEFF Research Database (Denmark)

    Payne, Mark; Barange, Manuel; Cheung, William W. L.

    2016-01-01

    Projections of the impacts of climate change on marine ecosystems are a key prerequisite for the planning of adaptation strategies, yet they are inevitably associated with uncertainty. Identifying, quantifying, and communicating this uncertainty is key to both evaluating the risk associated with a projection and building confidence in its robustness. We review how uncertainties in such projections are handled in marine science. We employ an approach developed in climate modelling by breaking uncertainty down into (i) structural (model) uncertainty, (ii) initialization and internal variability... and highlight the opportunities and challenges associated with doing a better job. We find that even within a relatively small field such as marine science, there are substantial differences between subdisciplines in the degree of attention given to each type of uncertainty. We find that initialization...

  4. GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES

    Directory of Open Access Journals (Sweden)

    Sérgio D. Sousa

    2015-03-01

    Full Text Available In this work, the performance measurement process is studied in order to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified by analysing the activities undertaken in the three stages of the performance measurement process: design and implementation; data collection and recording; and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty of the performance measurement process is used to calculate an uncertainty index that evaluates the level of uncertainty of a given PM or key performance indicator (KPI). An application example is presented. The quantification of PM uncertainty can contribute to better representing the risk associated with a given decision, and also to improving the PM to increase its precision and reliability.
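
    The abstract does not reproduce the index formula. One common graph-theoretic recipe for such indices (the variable-permanent-function approach used in similar work) computes the permanent of a matrix whose diagonal holds the per-stage uncertainty contributions and whose off-diagonal entries hold the interdependencies between stages. A minimal sketch with made-up numbers follows; both the matrix values and the use of the permanent are assumptions, not taken from the paper:

```python
from itertools import permutations

import numpy as np

def permanent(m):
    """Permanent of a small square matrix (a determinant with all signs +)."""
    n = len(m)
    return sum(np.prod([m[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

# Diagonal: uncertainty contribution of each stage of the performance
# measurement process (design/implementation, data collection/recording,
# determination/analysis); off-diagonal: interdependence between stages.
# All values are illustrative, on a 0-1 scale.
M = np.array([[0.3, 0.2, 0.1],
              [0.1, 0.5, 0.3],
              [0.0, 0.2, 0.4]])

index = permanent(M)
print(f"uncertainty index: {index:.3f}")  # -> 0.088
```

    A higher index signals a PM whose process stages contribute more uncertainty, individually or through their interactions; the brute-force permanent is only practical for a handful of stages.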

  5. Analysis and evaluation of regulatory uncertainties in 10 CFR 60 subparts B and E

    International Nuclear Information System (INIS)

    Weiner, R.F.; Patrick, W.C.

    1990-01-01

    This paper presents an attribute analysis scheme for prioritizing the resolution of regulatory uncertainties. Attributes are presented which assist in identifying the need for timeliness and durability of the resolution of an uncertainty

  6. Neural mechanisms mediating degrees of strategic uncertainty.

    Science.gov (United States)

    Nagel, Rosemarie; Brovelli, Andrea; Heinemann, Frank; Coricelli, Giorgio

    2018-01-01

    In social interactions, strategic uncertainty arises when the outcome of one's choice depends on the choices of others. An important question is whether strategic uncertainty can be resolved by assigning subjective probabilities to the counterparts' behavior, as if playing against nature, thus transforming the strategic interaction into a risky (individual) situation. By means of functional magnetic resonance imaging with human participants we tested the hypothesis that choices under strategic uncertainty are supported by the neural circuits mediating choices under individual risk and deliberation in social settings (i.e. strategic thinking). Participants were confronted with risky lotteries and two types of coordination games requiring different degrees of strategic thinking of the kind 'I think that you think that I think etc.' We found that the brain network mediating risk during lotteries (anterior insula, dorsomedial prefrontal cortex and parietal cortex) is also engaged in the processing of strategic uncertainty in games. In social settings, activity in this network is modulated by the level of strategic thinking that is reflected in the activity of the dorsomedial and dorsolateral prefrontal cortex. These results suggest that strategic uncertainty is resolved by the interplay between the neural circuits mediating risk and higher-order beliefs (i.e. beliefs about others' beliefs). © The Author(s) (2017). Published by Oxford University Press.

  7. Uncertainty and Complementarity in Axiomatic Quantum Mechanics

    Science.gov (United States)

    Lahti, Pekka J.

    1980-11-01

    In this work an investigation of the uncertainty principle and the complementarity principle is carried through. A study of the physical content of these principles and their representation in the conventional Hilbert space formulation of quantum mechanics forms a natural starting point for this analysis. Thereafter, a more general axiomatic framework for quantum mechanics is presented, namely a probability function formulation of the theory. In this general framework two extra axioms are stated, reflecting the ideas of the uncertainty principle and the complementarity principle, respectively. The quantal features of these axioms are explicated. The sufficiency of the state system guarantees that the observables satisfying the uncertainty principle are unbounded and noncompatible. The complementarity principle implies a non-Boolean proposition structure for the theory. Moreover, nonconstant complementary observables are always noncompatible. The uncertainty principle and the complementarity principle, as formulated in this work, are mutually independent. Some order is thus brought into the confused discussion about the interrelations of these two important principles. A comparison of the present formulations of the uncertainty principle and the complementarity principle with the Jauch formulation of the superposition principle is also given. The mutual independence of the three fundamental principles of the quantum theory is hereby revealed.

  8. Hotspots of uncertainty in land-use and land-cover change projections: a global-scale model comparison.

    Science.gov (United States)

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D A; Arneth, Almut; Calvin, Katherine; Doelman, Jonathan; Eitelberg, David A; Engström, Kerstin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Humpenöder, Florian; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Meiyappan, Prasanth; Popp, Alexander; Sands, Ronald D; Schaldach, Rüdiger; Schüngel, Jan; Stehfest, Elke; Tabeau, Andrzej; Van Meijl, Hans; Van Vliet, Jasper; Verburg, Peter H

    2016-12-01

    Model-based global projections of future land-use and land-cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socioeconomic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios, we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g., boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process and improving the allocation mechanisms of LULC change models remain important challenges. Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC
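
    The attribution of uncertainty components via analysis of variance described above can be illustrated with a classical two-way ANOVA decomposition. The sketch below assumes a balanced design with one simulation per model-scenario pair; the factor counts and effect sizes are invented for illustration and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic LULC projections: 4 models x 3 scenario storylines, one run each
model_eff = rng.normal(0.0, 2.0, size=4)      # structural (model) differences
scen_eff = rng.normal(0.0, 1.0, size=3)       # storyline differences
y = (model_eff[:, None] + scen_eff[None, :]
     + rng.normal(0.0, 0.5, size=(4, 3)))     # residual term

grand = y.mean()
# Main-effect sums of squares for a balanced two-way design, no replication
ss_model = 3 * np.sum((y.mean(axis=1) - grand) ** 2)
ss_scen = 4 * np.sum((y.mean(axis=0) - grand) ** 2)
ss_total = np.sum((y - grand) ** 2)
ss_resid = ss_total - ss_model - ss_scen

for name, ss in [("model", ss_model), ("scenario", ss_scen),
                 ("residual", ss_resid)]:
    print(f"{name:9s} share of total variance: {ss / ss_total:.1%}")
```

    Repeating the decomposition per region and LULC type, as the study does, is what turns these shares into a map of uncertainty hotspots.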

  9. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    Science.gov (United States)

    Rivera, Diego; Rivas, Yessica; Godoy, Alex

    2015-02-01

    Hydrological models are simplified representations of natural processes and are subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model-architecture uncertainty on model outputs. Different sets of parameters can have equally robust goodness-of-fit indicators, a phenomenon known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyse the presence of equifinality in order to improve the identification of the relevant hydrological processes. The water balance model for the Chillan River exhibits equifinality at a first stage. However, it was possible to narrow the range of the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m3 s-1 after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite the criticisms levelled against the GLUE methodology, such as its lack of statistical formality, it proves a useful tool for assisting the modeller with the identification of critical parameters.
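
    The GLUE workflow can be sketched with a deliberately trivial stand-in model: sample the parameter space, score each set with an informal likelihood, keep the "behavioural" sets above a subjective threshold, and read uncertainty bounds off the behavioural ensemble. The toy model, parameter ranges, threshold (0.7) and synthetic data below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy rainfall-runoff model standing in for the lumped conceptual model:
# runoff = a * precipitation + b
def model(params, precip):
    a, b = params
    return a * precip + b

precip = rng.gamma(2.0, 5.0, size=96)                  # 8 years of monthly forcing
observed = model((0.6, 1.5), precip) + rng.normal(0, 1.0, size=96)

# 1. Monte Carlo sampling of the feasible parameter space
n = 5000
samples = np.column_stack([rng.uniform(0.0, 1.0, n),   # a
                           rng.uniform(0.0, 5.0, n)])  # b

# 2. Informal likelihood measure: Nash-Sutcliffe efficiency per parameter set
def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

scores = np.array([nse(model(p, precip), observed) for p in samples])

# 3. Behavioural parameter sets: all sets above a subjective threshold
behavioural = samples[scores > 0.7]

# 4. Uncertainty bounds on simulated runoff from the behavioural ensemble
sims = np.array([model(p, precip) for p in behavioural])
lower = np.percentile(sims, 5, axis=0)
upper = np.percentile(sims, 95, axis=0)
print(f"{len(behavioural)} behavioural sets, "
      f"mean 5-95% bound width {np.mean(upper - lower):.2f}")
```

    Fixing a well-identified parameter (as the study does with areal precipitation) shrinks the behavioural set and hence the bound width.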

  10. Uncertainty quantification and race car aerodynamics

    OpenAIRE

    Bradford, J; Montomoli, F; D'Ammaro, A

    2014-01-01

    Car aerodynamics are subject to a number of random variables which introduce uncertainty into the downforce performance. These can include, but are not limited to, pitch variations and ride height variations. Studying the effect of random variations in these parameters is important for accurately predicting car performance during the race. Despite their importance, the assessment of these variations is difficult and it...

  11. Evaluation of risk impact of changes to Completion Times addressing model and parameter uncertainties

    International Nuclear Information System (INIS)

    Martorell, S.; Martón, I.; Villamizar, M.; Sánchez, A.I.; Carlos, S.

    2014-01-01

    This paper presents an approach, and an example of its application, for evaluating the risk impact of changes to Completion Times within the Licensing Basis of a Nuclear Power Plant, based on the use of Probabilistic Risk Assessment and addressing the identification, treatment and analysis of uncertainties in an integrated manner. It allows full development of a three-tiered approach (Tiers 1-3) following the principles of risk-informed decision-making accounting for uncertainties, as proposed by many regulators. A Completion Time (CT) is the maximum outage time a safety-related item of equipment is allowed to be down, e.g. for corrective maintenance; it is established within the Limiting Conditions for Operation included in the Technical Specifications for operation of a Nuclear Power Plant. The case study focuses on a Completion Time change for the Accumulators System of a Nuclear Power Plant using a level 1 PRA, and considers several sources of model and parameter uncertainty. The results obtained show that the risk impact of the proposed CT change, including both types of epistemic uncertainty, is small compared with the current safety goals of concern to Tier 1. With respect to Tiers 2 and 3, however, the results show how the use of some traditional and uncertainty importance measures helps in identifying high-risk configurations that should be avoided in NPP technical specifications regardless of the CT duration (Tier 2), and other configurations that could form part of a configuration risk management program (Tier 3). - Highlights: • New approach for evaluation of risk impact of changes to Completion Times. • Integrated treatment and analysis of model and parameter uncertainties. • PSA-based application to support risk-informed decision-making. • Measures of importance for identification of risky configurations. • Management of important safety issues to accomplish safety goals

  12. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently, renewed interest has emerged in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form for it. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs
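
    For reference, the "uncertainty theorem" of the standard Hilbert space formulation is usually the Robertson inequality, with the position-momentum relation as its best-known special case:

```latex
\sigma_A \,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [\hat A, \hat B] \rangle\bigr|,
\qquad\text{so that}\qquad
[\hat x, \hat p] = i\hbar \;\Longrightarrow\; \sigma_x \,\sigma_p \;\ge\; \frac{\hbar}{2}.
```

    The paper's point is precisely that this state-dependent variance bound is only one of several inequivalent readings of Heisenberg's verbal principle.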

  13. Inflation, inflation uncertainty and output growth in the USA

    Science.gov (United States)

    Bhar, Ramprasad; Mallik, Girijasankar

    2010-12-01

    Employing a multivariate EGARCH-M model, this study investigates the effects of inflation uncertainty and growth uncertainty on inflation and output growth in the United States. Our results show that inflation uncertainty has a positive and significant effect on the level of inflation and a negative and significant effect on the output growth. However, output uncertainty has no significant effect on output growth or inflation. The oil price also has a positive and significant effect on inflation. These findings are robust and have been corroborated by use of an impulse response function. These results have important implications for inflation-targeting monetary policy, and the aim of stabilization policy in general.

  14. Public Perception of Uncertainties Within Climate Change Science.

    Science.gov (United States)

    Visschers, Vivianne H M

    2018-01-01

    Climate change is a complex, multifaceted problem involving various interacting systems and actors. Therefore, the intensities, locations, and timeframes of the consequences of climate change are hard to predict and cause uncertainties. Relatively little is known about how the public perceives this scientific uncertainty and how this relates to their concern about climate change. In this article, an online survey among 306 Swiss people is reported that investigated whether people differentiate between different types of uncertainty in climate change research. Also examined was the way in which the perception of uncertainty is related to people's concern about climate change, their trust in science, their knowledge about climate change, and their political attitude. The results of a principal component analysis showed that respondents differentiated between perceived ambiguity in climate research, measurement uncertainty, and uncertainty about the future impact of climate change. Using structural equation modeling, it was found that only perceived ambiguity was directly related to concern about climate change, whereas measurement uncertainty and future uncertainty were not. Trust in climate science was strongly associated with each type of uncertainty perception and was indirectly associated with concern about climate change. Also, more knowledge about climate change was related to less strong perceptions of each type of climate science uncertainty. Hence, it is suggested that to increase public concern about climate change, it may be especially important to consider the perceived ambiguity about climate research. Efforts that foster trust in climate science also appear highly worthwhile. © 2017 Society for Risk Analysis.
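
    The principal component step can be illustrated on synthetic survey data. The three-factor structure and item loadings below are assumptions for illustration; only the sample size of 306 is taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 306  # respondents, matching the study's sample size

# Three latent uncertainty perceptions (ambiguity, measurement, future),
# each measured by three survey items plus item-level noise
latent = rng.normal(size=(n, 3))
loadings = np.zeros((3, 9))
for f in range(3):
    loadings[f, 3 * f:3 * f + 3] = 1.0
items = latent @ loadings + rng.normal(0.0, 0.4, size=(n, 9))

# Principal component analysis via SVD of the centred item matrix
x = items - items.mean(axis=0)
_, s, _ = np.linalg.svd(x, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print(f"variance explained by first three components: {explained[:3].sum():.1%}")
```

    With a clean three-factor structure, three components absorb most of the item variance, which is the pattern that justifies interpreting three distinct uncertainty perceptions.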

  15. Uncertainty as Certainty

    Science.gov (United States)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  16. Orientation and uncertainties

    International Nuclear Information System (INIS)

    Peters, H.P.; Hennen, L.

    1990-01-01

    The authors report on the results of three representative surveys that made a closer inquiry into perceptions and valuations of information and information sources concerning Chernobyl. It turns out that the information sources are generally considered little trustworthy. This was generally attributable to the interpretation of the events being tied to attitudes on the atomic energy issue. The greatest credit was given to television broadcasting. The authors summarize their discourse as follows: there is good reason to interpret the widespread uncertainty after Chernobyl as proof of the fact that large parts of the population are prepared and willing to assume a critical stance towards information, and prefer to draw their information from various sources representing different positions. (orig.) [de

  17. DOD ELAP Lab Uncertainties

    Science.gov (United States)

    2012-03-01

    [Slide fragments] Testing laboratories - ISO/IEC 17025; inspection bodies - ISO/IEC 17020; reference material producers (RMPs) - ISO Guide 34. Organizations may certify to: ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US automotive), etc. DoD QSM 4.2 standard - ISO/IEC 17025:2005; each has uncertainty... Programs: IPV6, NLLAP, NEFAP; training programs. Certification bodies - ISO/IEC 17021, accreditation for management systems.

  18. Traceability and Measurement Uncertainty

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    This report is made as a part of the project 'Metro-E-Learn: European e-Learning in Manufacturing Metrology', an EU project under the programme SOCRATES MINERVA (ODL and ICT in Education), Contract No: 101434-CP-1-2002-1-DE-MINERVA, coordinated by Friedrich-Alexander-University Erlangen... The project partnership (composed of 7 partners in 5 countries, thus covering a real European spread in high-tech production technology) aims to develop and implement an advanced e-learning system that integrates contributions from quite different disciplines into a user-centred approach that strictly... Machine tool testing 9. The role of manufacturing metrology for QM 10. Inspection planning 11. Quality management of measurements incl. documentation 12. Advanced manufacturing measurement technology. The present report represents section 2 - Traceability and Measurement Uncertainty - of the e-learning...

  19. Decision making under uncertainty

    International Nuclear Information System (INIS)

    Cyert, R.M.

    1989-01-01

    This paper reports on ways of improving the reliability of products and systems, which this country must do if it is to survive as a first-rate industrial power. The use of statistical techniques has, since the 1920s, been viewed as one of the methods for testing quality and estimating the level of quality in a universe of output. Statistical quality control is not generally relevant to improving systems in an industry like yours, but the use of probability concepts certainly is. In addition, when it is recognized that part of the problem involves making decisions under uncertainty, it becomes clear that techniques such as sequential decision making and Bayesian analysis become major methodological approaches that must be utilized

  20. Sustainability and uncertainty

    DEFF Research Database (Denmark)

    Jensen, Karsten Klint

    2007-01-01

    The widely used concept of sustainability is seldom precisely defined, and its clarification involves making up one's mind about a range of difficult questions. One line of research (bottom-up) takes sustaining a system over time as its starting point and then infers prescriptions from... this requirement. Another line (top-down) takes as its starting point an economic interpretation of the Brundtland Commission's suggestion that the present generation's need-satisfaction should not compromise the need-satisfaction of future generations. It then measures sustainability at the level of society... a clarified ethical goal, disagreements can arise. At present we do not know what substitutions will be possible in the future. This uncertainty clearly affects the prescriptions that follow from the measure of sustainability. Consequently, decisions about how to make future agriculture sustainable...

  1. Assessing framing of uncertainties in water management practice

    NARCIS (Netherlands)

    Isendahl, N.; Dewulf, A.; Brugnach, M.; Francois, G.; Möllenkamp, S.; Pahl-Wostl, C.

    2009-01-01

    Dealing with uncertainties in water management is an important issue and is one which will only increase in light of global changes, particularly climate change. So far, uncertainties in water management have mostly been assessed from a scientific point of view, and in quantitative terms. In this

  2. Understanding the Role of Uncertainty in Jealousy Experience and Expression.

    Science.gov (United States)

    Afifi, Walid A.; Reichert, Tom

    1996-01-01

    Confirms the value of uncertainty for understanding jealousy. Finds that subjects were more likely to experience and less likely to directly express jealousy at high, versus low, levels of relational state uncertainty. Highlights the importance of differentiating jealousy experience from expression, and corroborates recent evidence showing a…

  3. Finite Project Life and Uncertainty Effects on Investment

    NARCIS (Netherlands)

    Gryglewicz, S.; Huisman, K.J.M.; Kort, P.M.

    2006-01-01

    This paper revisits an important result of the real options approach to investment under uncertainty, which states that increased uncertainty raises the value of waiting and thus decelerates investment. Typically, projects in this literature are assumed to be perpetual. However, in today's economy

  4. Uncertainty and endogenous technical change in climate policy models

    International Nuclear Information System (INIS)

    Baker, Erin; Shittu, Ekundayo

    2008-01-01

    Until recently endogenous technical change and uncertainty have been modeled separately in climate policy models. In this paper, we review the emerging literature that considers both these elements together. Taken as a whole the literature indicates that explicitly including uncertainty has important quantitative and qualitative impacts on optimal climate change technology policy. (author)

  5. Treasury bond volatility and uncertainty about monetary policy

    NARCIS (Netherlands)

    Arnold, I.J.M.; Vrugt, E.B.

    2010-01-01

    We show that dispersion-based uncertainty about the future course of monetary policy is the single most important determinant of Treasury bond volatility across all maturities. The link between Treasury bond volatility and uncertainty about macroeconomic variables is much stronger than for the more

  6. Investments in technology subject to uncertainty. Analysis and policy

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Lindgaard

    1997-01-01

    Investments in technology are today of such a magnitude that they matter. The paper addresses three important questions: first, in what sense technological uncertainty can be said to be a problem; second, strategies for diminishing technological uncertainties; and third, policy...

  7. Uncertainty evaluation methods for waste package performance assessment

    International Nuclear Information System (INIS)

    Wu, Y.T.; Nair, P.K.; Journel, A.G.; Abramson, L.R.

    1991-01-01

    This report identifies and investigates methodologies to deal with uncertainties in assessing high-level nuclear waste package performance. Four uncertainty evaluation methods (probability-distribution approach, bounding approach, expert judgment, and sensitivity analysis) are suggested as the elements of a methodology that, without either diminishing or enhancing the input uncertainties, can evaluate performance uncertainty. Such a methodology can also help identify critical inputs as a guide to reducing uncertainty so as to provide reasonable assurance that the risk objectives are met. This report examines the current qualitative waste containment regulation and shows how, in conjunction with the identified uncertainty evaluation methodology, a framework for a quantitative probability-based rule can be developed that takes account of the uncertainties. Current US Nuclear Regulatory Commission (NRC) regulation requires that the waste packages provide ''substantially complete containment'' (SCC) during the containment period. The term ''SCC'' is ambiguous and subject to interpretation. This report, together with an accompanying report that describes the technical considerations that must be addressed to satisfy high-level waste containment requirements, provides a basis for a third report to develop recommendations for regulatory uncertainty reduction in the ''containment'' requirement of 10 CFR Part 60. 25 refs., 3 figs., 2 tabs

  8. Facing uncertainty in ecosystem services-based resource management.

    Science.gov (United States)

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.
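
    A minimal sketch of mapping uncertainty in this spirit: propagate parameter uncertainty through a per-cell valuation model by Monte Carlo over a grid (a simplification of the paper's Bayesian-network-to-GIS coupling). The raster inputs, valuation model and priors below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy raster inputs for a mountain catchment: forest-cover fraction and
# slope in degrees on a 20 x 20 grid (stand-ins for real GIS layers)
cover = rng.uniform(0.05, 1.0, size=(20, 20))
slope = rng.uniform(0.0, 45.0, size=(20, 20))

# Uncertain valuation parameters, drawn from hypothetical priors:
# value per unit forest cover and a slope-sensitivity coefficient
n = 2000
v = rng.normal(100.0, 15.0, size=n)
k = rng.normal(0.02, 0.005, size=n)

# Monte Carlo propagation: service value per cell for each parameter draw
value = v[:, None, None] * cover * (1.0 + k[:, None, None] * slope)

mean_map = value.mean(axis=0)       # expected ecosystem-service value
std_map = value.std(axis=0)         # per-cell uncertainty
cv_map = std_map / mean_map         # relative uncertainty "hotspots"
print(f"max relative uncertainty across the map: {cv_map.max():.2f}")
```

    Plotting `std_map` or `cv_map` alongside `mean_map` is what reveals that the spatial pattern of service values can shift once uncertainty is considered.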

  9. Essays on model uncertainty in financial models

    NARCIS (Netherlands)

    Li, Jing

    2018-01-01

    This dissertation studies model uncertainty, particularly in financial models. It consists of two empirical chapters and one theoretical chapter. The first empirical chapter (Chapter 2) classifies model uncertainty into parameter uncertainty and misspecification uncertainty. It investigates the

  10. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Science.gov (United States)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
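
    Global sensitivity analysis of the kind mentioned above typically estimates first-order Sobol indices. A pick-and-freeze Monte Carlo estimator (Saltelli's formulation) is sketched below on a trivial test function with known indices; the scramjet model itself is of course not reproduced:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in model with known sensitivities: Y = X1 + 0.5*X2, Xi ~ N(0, 1),
# so Var(Y) = 1.25 and the first-order Sobol indices are S1 = 0.8, S2 = 0.2
def f(x):
    return x[:, 0] + 0.5 * x[:, 1]

n, d = 200_000, 2
A = rng.normal(size=(n, d))
B = rng.normal(size=(n, d))
yA, yB = f(A), f(B)
var_y = np.var(np.concatenate([yA, yB]))

# Pick-and-freeze first-order estimator: S_i = E[yB * (y_ABi - yA)] / Var(Y),
# where AB_i is the matrix A with column i swapped in from B
s = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    s.append(np.mean(yB * (f(ABi) - yA)) / var_y)
    print(f"S_{i + 1} ~ {s[-1]:.3f}")
```

    Parameters with indices near zero can then be frozen at nominal values, which is how the stochastic dimension gets reduced before building the polynomial chaos surrogate.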

  11. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    Science.gov (United States)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, computational cost, and large number of uncertain variables. In this study, first a sparse-collocation non-intrusive polynomial chaos approach, together with global non-linear sensitivity analysis, was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total-order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.

  12. A probabilistic approach to cost and duration uncertainties in environmental decisions

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1996-01-01

    Sandia National Laboratories has developed a method for analyzing life-cycle costs using probabilistic cost forecasting and utility theory to determine the most cost-effective alternatives for safe interim storage of radioactive materials. The method explicitly incorporates uncertainties in cost and storage duration by (1) treating uncertain component costs as random variables represented by probability distributions, (2) treating uncertain durations as chance nodes in a decision tree, and (3) using stochastic simulation tools to generate life-cycle cost forecasts for each storage alternative. The method applies utility functions to the forecasted costs to incorporate the decision maker's risk preferences, making it possible to compare alternatives on the basis of both cost and cost utility. Finally, the method is used to help identify key contributors to the uncertainty in forecasted costs to focus efforts aimed at reducing cost uncertainties. Where significant cost and duration uncertainties exist, and where programmatic decisions must be made despite these uncertainties, probabilistic forecasting techniques can yield important insights into decision alternatives, especially when used as part of a larger decision analysis framework and when properly balanced with deterministic analyses. Although the method is built around an interim storage example, it is potentially applicable to many other environmental decision problems.
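
    The forecasting recipe in this abstract — component costs as random variables, duration as a chance node, a utility function over the resulting forecast — can be sketched in a few lines. All distributions, costs, and the risk-tolerance parameter below are illustrative assumptions, not values from the Sandia study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Hypothetical storage alternative: uncertain component costs (random
    # variables) and an uncertain storage duration (a chance node).
    capital = rng.triangular(8.0, 10.0, 15.0, N)                    # $M, one-time
    annual = rng.lognormal(mean=np.log(1.2), sigma=0.2, size=N)     # $M per year
    duration = rng.choice([10, 20, 30], size=N, p=[0.3, 0.5, 0.2])  # years

    life_cycle_cost = capital + annual * duration

    # Exponential utility encodes the decision maker's risk aversion;
    # alternatives are then ranked by expected utility, not just mean cost.
    risk_tolerance = 20.0  # $M, an assumed preference parameter
    utility = -np.exp(life_cycle_cost / risk_tolerance)

    mean_cost = life_cycle_cost.mean()
    p90 = np.quantile(life_cycle_cost, 0.9)  # tail risk for decision support
    ```

    Repeating the simulation per storage alternative and comparing expected utilities (and tail percentiles) mirrors the decision-tree comparison the abstract describes.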

  13. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Energy Technology Data Exchange (ETDEWEB)

    Huan, Xun [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Geraci, Gianluca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vane, Zachary P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Lacaze, Guilhem [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Oefelein, Joseph C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  14. Additivity of entropic uncertainty relations

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2018-03-01

    We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
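
    For reference, the Maassen-Uffink bound that the paper generalizes, and the additivity statement as we read the abstract, can be written as follows (a sketch of the standard formulas, not the paper's precise formulation):

    ```latex
    % Maassen-Uffink entropic uncertainty relation for observables A, B
    % with eigenbases {|a_i>}, {|b_j>} and overlap c = max_{i,j} |<a_i|b_j>|:
    H(A) + H(B) \;\ge\; -2 \log c .

    % Additivity: for local measurements A_1 \otimes A_2 and B_1 \otimes B_2
    % on a bipartite system, the optimal linear bound is the sum of the
    % local bounds, so fully separable states already attain it:
    \min_{\rho_{12}} \big[ H(A_1 \otimes A_2) + H(B_1 \otimes B_2) \big]
      \;=\; \min_{\rho_1} \big[ H(A_1) + H(B_1) \big]
      \;+\; \min_{\rho_2} \big[ H(A_2) + H(B_2) \big].
    ```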

  15. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. The FIDUCEO project (www.fiduceo.eu) is
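
    The point that "error effects negligible in a single pixel may dominate uncertainty in the large-scale and long-term" can be made concrete with the standard propagation formula for the mean of n values. The split into an independent and a fully correlated error component, and the numbers, are illustrative assumptions for this sketch.

    ```python
    import numpy as np

    def uncertainty_of_mean(n, u_indep, u_corr):
        """Standard uncertainty of a mean of n values whose errors have an
        independent component u_indep and a fully correlated component u_corr."""
        # Independent effects average down as 1/sqrt(n); fully correlated
        # effects do not average down at all.
        return np.sqrt(u_indep**2 / n + u_corr**2)

    # Single pixel: the independent (noise-like) part dominates.
    u_pixel = uncertainty_of_mean(1, u_indep=0.5, u_corr=0.1)
    # Mean over 10,000 pixels: the correlated part now dominates,
    # even though it was negligible per pixel.
    u_area = uncertainty_of_mean(10_000, u_indep=0.5, u_corr=0.1)
    ```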

  16. Uncertainty assessment of a model for biological nitrogen and phosphorus removal: Application to a large wastewater treatment plant

    Science.gov (United States)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    In the last few years, the use of mathematical models of WasteWater Treatment Plant (WWTP) processes has become a common way to predict WWTP behaviour. However, mathematical models generally demand advanced input for their implementation that must be evaluated by an extensive data-gathering campaign, which cannot always be carried out. This fact, together with the intrinsic complexity of the model structure, leads to model results that may be very uncertain. Quantification of this uncertainty is imperative. However, despite its importance, only a few uncertainty studies have been carried out in the wastewater treatment field, and those studies included only some of the sources of model uncertainty. To advance the field, this paper presents the uncertainty assessment of a mathematical model simulating biological nitrogen and phosphorus removal. The uncertainty assessment was conducted according to the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which has rarely been applied in the wastewater field. The model was based on Activated Sludge Models 1 (ASM1) and 2 (ASM2). Among the different approaches available for uncertainty analysis, the GLUE methodology requires a large number of Monte Carlo simulations in which a random sampling of individual parameters drawn from probability distributions is used to determine a set of parameter values. Using this approach, model reliability was evaluated based on its capacity to globally limit the uncertainty. The method was applied to a large full-scale WWTP for which quantity and quality data were gathered. The analysis yielded useful insights for WWTP modelling, identifying the crucial aspects where the greatest uncertainty lies and where, therefore, more effort should be invested in both data gathering and modelling practice.
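
    The GLUE workflow the abstract outlines — sample parameters from priors, score each run with an informal likelihood, keep the "behavioural" sets, and report likelihood-weighted prediction bounds — can be sketched on a toy problem. The first-order decay model, synthetic data, and thresholds below are illustrative stand-ins, not ASM1/ASM2 or the plant data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic 'observations': first-order decay plus measurement noise.
    t = np.linspace(0.0, 10.0, 50)
    true_k = 0.35
    obs = 20.0 * np.exp(-true_k * t) + rng.normal(0.0, 0.5, t.size)

    def model(k):
        return 20.0 * np.exp(-k * t)

    # GLUE step 1: Monte Carlo sampling of the parameter from its prior.
    n = 5000
    k_samples = rng.uniform(0.05, 1.0, n)
    sims = np.array([model(k) for k in k_samples])

    # GLUE step 2: informal likelihood (here, Nash-Sutcliffe efficiency).
    sse = ((sims - obs) ** 2).sum(axis=1)
    nse = 1.0 - sse / ((obs - obs.mean()) ** 2).sum()

    # GLUE step 3: keep 'behavioural' sets above a threshold, weight them by
    # likelihood, and report prediction bounds instead of a single best run.
    behavioural = nse > 0.9
    weights = nse[behavioural] / nse[behavioural].sum()
    pred_mean = weights @ sims[behavioural]            # weighted prediction
    band_lo = np.percentile(sims[behavioural], 5, axis=0)
    band_hi = np.percentile(sims[behavioural], 95, axis=0)
    k_best = k_samples[np.argmax(nse)]
    ```

    The width of the 5th-95th percentile band is the GLUE measure of how well the model "globally limits the uncertainty" that the abstract refers to.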

  17. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib; Galassi, R. Malpica; Valorani, M.

    2016-01-01

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  18. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  19. The Uncertainty of Measurement Results

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)
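
    The component-by-component approach described here can be sketched as a small uncertainty budget combined by the usual root-sum-of-squares rule. The component names and values below are illustrative assumptions, not the paper's worked example.

    ```python
    import math

    # Hypothetical uncertainty budget for a formulation assay; each entry
    # is a relative standard uncertainty (illustrative values).
    components = {
        "balance calibration":       0.0010,
        "volumetric glassware":      0.0008,
        "repeatability":             0.0040,
        "reference standard purity": 0.0025,
    }

    # GUM rule: independent components combine as the root sum of squares.
    u_rel = math.sqrt(sum(u * u for u in components.values()))

    result = 98.2          # measured active-ingredient content, % (illustrative)
    u_c = result * u_rel   # combined standard uncertainty
    U = 2.0 * u_c          # expanded uncertainty, coverage factor k = 2
    ```

    Reporting the result as `result ± U` at k = 2 (roughly 95 % coverage) follows the convention the abstract's tables elaborate.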

  20. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper, an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I through the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis also offers the opportunity to obtain useful information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.

  1. Uncertainty quantification in resonance absorption

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2012-01-01

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances in 232Th. Simulation, quadrature and polynomial chaos methods are used, and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are pdfs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.
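
    The simulation branch of this study — sampling beta-distributed line widths and propagating them to an escape probability — can be sketched on a single illustrative resonance. The widths, the scaling constant, and the escape-probability formula below are toy assumptions, not the 21-resonance 232Th evaluation used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N = 50_000

    # Assumed mean widths and relative spreads (illustrative only).
    gn_mean, gn_rel = 2.0e-3, 0.10    # neutron width (eV), 10 % rel. spread
    gg_mean, gg_rel = 25.0e-3, 0.05   # radiation width (eV), 5 % rel. spread

    def scaled_beta(mean, rel, size):
        """Beta(4, 4) rescaled to the interval mean * (1 +/- 2*rel)."""
        lo, hi = mean * (1.0 - 2.0 * rel), mean * (1.0 + 2.0 * rel)
        return lo + (hi - lo) * rng.beta(4.0, 4.0, size)

    gn = scaled_beta(gn_mean, gn_rel, N)
    gg = scaled_beta(gg_mean, gg_rel, N)

    # Toy escape probability driven by the effective width gn*gg/(gn+gg);
    # the constant a lumps moderator properties and is purely illustrative.
    a = 100.0
    p = np.exp(-a * gn * gg / (gn + gg))

    p_mean, p_std = p.mean(), p.std()
    # Rough reactivity equivalent of the spread, in pcm (1 pcm = 1e-5 dk/k).
    pcm = 1.0e5 * p_std / p_mean
    ```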

  2. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
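
    The key idea of a single-level (single-loop) treatment of both uncertainty types can be shown in miniature: sample the epistemic quantity (here, an uncertain distribution mean inferred from sparse data) and the aleatory scatter in the same Monte Carlo pass. The normal distributions and the limit state below are toy assumptions, not the paper's Gaussian-process-based framework.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 200_000

    # Epistemic layer: sparse data leave the load's distribution mean itself
    # uncertain, so it is sampled rather than fixed.
    mu_load = rng.normal(10.0, 0.5, N)

    # Aleatory layer: natural scatter of the load about each sampled mean.
    # Drawing both layers together gives a single-loop Monte Carlo estimate,
    # in the spirit of the paper's auxiliary-variable formulation.
    load = rng.normal(mu_load, 1.0)

    capacity = 13.0                 # deterministic limit, illustrative
    pf = np.mean(load > capacity)   # P(g < 0) for limit state g = capacity - load
    ```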

  3. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine whether the standard is adequate for its intended use, or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined, or is determined incorrectly. For situations meeting the stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. The calculations are simplified by dividing the uncertainties into subgroups of absolute and relative uncertainties. These methods also incorporate the International Organization for Standardization (ISO) concepts for combining systematic and random uncertainties as published in its Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper.
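
    The shortcut rules of the kind the paper describes reduce, for independent inputs, to two quadrature formulas: absolute uncertainties combine for sums and differences, relative uncertainties for products and quotients. A minimal sketch (the mass/volume example and its values are illustrative, not from the paper):

    ```python
    import math

    # Shortcut rules for independent inputs:
    #   sums/differences   -> absolute uncertainties add in quadrature
    #   products/quotients -> relative uncertainties add in quadrature

    def u_of_sum(*absolute_u):
        return math.sqrt(sum(u * u for u in absolute_u))

    def u_rel_of_product(*relative_u):
        return math.sqrt(sum(r * r for r in relative_u))

    # Example: concentration c = m / V of a prepared standard; the whole
    # calculation needs nothing fancier than a calculator or spreadsheet.
    m, u_m = 100.0, 0.05     # mass (mg) and its standard uncertainty
    V, u_V = 1.000, 0.002    # volume (L) and its standard uncertainty

    c = m / V
    u_c = c * u_rel_of_product(u_m / m, u_V / V)
    ```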