WorldWideScience

Sample records for subjective uncertainty analysis

  1. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
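
    The rank-transform step mentioned above can be sketched quickly. Below is a minimal, hedged illustration in Python: Latin Hypercube samples are pushed through a toy stand-in model and a simple rank regression (a simplified stand-in for the rank transformation plus stepwise regression the record describes) estimates each input's importance. The model function, parameter ranges and sample size are illustrative assumptions, not taken from the report.

    ```python
    # Hedged sketch: Latin Hypercube Sampling plus rank regression for
    # input-importance estimates. The model and ranges are assumptions.
    import numpy as np
    from scipy.stats import qmc, rankdata

    def model(x):
        # Toy nonlinear model standing in for a large simulation code
        return x[:, 0] ** 2 + 3.0 * x[:, 1] + 0.1 * np.exp(x[:, 2])

    sampler = qmc.LatinHypercube(d=3, seed=42)
    u = sampler.random(n=200)                       # stratified samples in [0, 1)^3
    x = qmc.scale(u, l_bounds=[0, 0, 0], u_bounds=[1, 2, 3])
    y = model(x)

    # Rank-transform inputs and output, then regress ranks on ranks; the
    # standardized coefficients approximate each parameter's contribution.
    rx = np.column_stack([rankdata(x[:, j]) for j in range(x.shape[1])])
    ry = rankdata(y)
    rx_std = (rx - rx.mean(0)) / rx.std(0)
    ry_std = (ry - ry.mean()) / ry.std()
    coef, *_ = np.linalg.lstsq(rx_std, ry_std, rcond=None)
    print("standardized rank-regression coefficients:", coef)
    ```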

  2. FRACTURE MECHANICS UNCERTAINTY ANALYSIS IN THE RELIABILITY ASSESSMENT OF THE REACTOR PRESSURE VESSEL: (2D) SUBJECTED TO INTERNAL PRESSURE

    Directory of Open Access Journals (Sweden)

    Entin Hartini

    2016-06-01

    The reactor pressure vessel (RPV) is a pressure boundary in the PWR type reactor which serves to confine radioactive material during the chain reaction process. The integrity of the RPV must be guaranteed in both normal operation and accident conditions. In analyzing the integrity of the RPV, especially the behavior of cracks that could lead to a break of the vessel, a fracture mechanics approach should be taken. The uncertainty of the inputs used in the assessment, such as mechanical properties and the physical environment, means that a purely deterministic assessment is not sufficient; an uncertainty approach should therefore also be applied. The aim of this study is to analyze the uncertainty of fracture mechanics calculations in evaluating the reliability of the PWR's reactor pressure vessel. The random character of the input quantities was generated using probabilistic principles and theories. The fracture mechanics analysis is solved by the Finite Element Method (FEM) with MSC MARC software, while the uncertainty of the inputs is analyzed on the basis of probability density functions with Latin Hypercube Sampling (LHS) using a Python script. The output of MSC MARC is a J-integral value, which is converted into a stress intensity factor (SIF) for evaluating the reliability of the 2D RPV model. From the results of the calculation it can be concluded that the SIF obtained by the probabilistic method reaches the limit value of fracture toughness earlier than the SIF obtained by the deterministic method: the probabilistic method gives 105.240 MPa·m^0.5, while the deterministic method gives 100.876 MPa·m^0.5. Keywords: uncertainty analysis, fracture mechanics, LHS, FEM, reactor pressure vessels
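
    The J-integral to stress-intensity-factor conversion mentioned in the record follows the standard relation K = sqrt(J·E'). A minimal sketch, assuming illustrative plane-strain material values rather than the study's actual RPV data:

    ```python
    # Hedged sketch of the J-integral to stress-intensity-factor conversion
    # (K = sqrt(J * E'), plane strain). Material values are illustrative
    # assumptions, not the RPV data used in the study.
    import math

    E = 200e9        # Young's modulus [Pa]
    nu = 0.3         # Poisson's ratio
    J = 50e3         # J-integral from the FEM run [J/m^2]

    E_prime = E / (1.0 - nu ** 2)          # plane-strain effective modulus
    K = math.sqrt(J * E_prime)             # stress intensity factor [Pa*sqrt(m)]
    print(f"K = {K / 1e6:.1f} MPa*m^0.5")
    ```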

  3. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
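
    The propagation options listed above (series approximations versus Monte Carlo) can be contrasted on a toy model. A minimal sketch, assuming a made-up model y = a·exp(b) and illustrative input uncertainties, not anything from the AECL guide:

    ```python
    # Hedged sketch contrasting a first-order series approximation with
    # Monte Carlo sampling for uncertainty propagation through y = a*exp(b).
    import numpy as np

    mu_a, sd_a = 2.0, 0.1
    mu_b, sd_b = 0.5, 0.05
    f = lambda a, b: a * np.exp(b)

    # Series (first-order) propagation: var(y) ~ (df/da)^2 var(a) + (df/db)^2 var(b)
    dfda = np.exp(mu_b)
    dfdb = mu_a * np.exp(mu_b)
    sd_series = np.sqrt((dfda * sd_a) ** 2 + (dfdb * sd_b) ** 2)

    # Monte Carlo propagation of the same input uncertainties
    rng = np.random.default_rng(1)
    samples = f(rng.normal(mu_a, sd_a, 100_000), rng.normal(mu_b, sd_b, 100_000))
    print(f"series: {sd_series:.4f}   monte carlo: {samples.std():.4f}")
    ```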

  4. Stochastic Analysis of the Efficiency of a Wireless Power Transfer System Subject to Antenna Variability and Position Uncertainties.

    Science.gov (United States)

    Rossi, Marco; Stockman, Gert-Jan; Rogier, Hendrik; Vande Ginste, Dries

    2016-07-19

    The efficiency of a wireless power transfer (WPT) system in the radiative near-field is inevitably affected by the variability in the design parameters of the deployed antennas and by uncertainties in their mutual position. Therefore, we propose a stochastic analysis that combines the generalized polynomial chaos (gPC) theory with an efficient model for the interaction between devices in the radiative near-field. This framework enables us to investigate the impact of random effects on the power transfer efficiency (PTE) of a WPT system. More specifically, the WPT system under study consists of a transmitting horn antenna and a receiving textile antenna operating in the Industrial, Scientific and Medical (ISM) band at 2.45 GHz. First, we model the impact of the textile antenna's variability on the WPT system. Next, we include the position uncertainties of the antennas in the analysis in order to quantify the overall variations in the PTE. The analysis is carried out by means of polynomial-chaos-based macromodels, whereas a Monte Carlo simulation validates the complete technique. It is shown that the proposed approach is very accurate, more flexible and more efficient than a straightforward Monte Carlo analysis, with demonstrated speedup factors up to 2500.
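
    A hedged sketch of the polynomial-chaos macromodel idea, not the authors' actual electromagnetic solver: a cheap Hermite-polynomial surrogate is fitted to a few runs of a stand-in efficiency function of one assumed Gaussian variability parameter, then sampled in place of a large Monte Carlo campaign.

    ```python
    # Hedged sketch of a polynomial-chaos surrogate in the spirit of gPC
    # macromodels. The "efficiency" function is a stand-in for the actual
    # WPT field solver; the Gaussian input is an assumed parameterization.
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    def efficiency(xi):
        # placeholder for the expensive electromagnetic model, xi ~ N(0, 1)
        return 0.35 * np.exp(-0.5 * (0.4 * xi - 0.1) ** 2)

    rng = np.random.default_rng(0)
    xi_train = rng.standard_normal(50)              # few expensive model runs
    y_train = efficiency(xi_train)

    deg = 4
    A = hermevander(xi_train, deg)                  # probabilists' Hermite basis
    coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

    # Cheap surrogate evaluations replace a large Monte Carlo campaign
    xi_mc = rng.standard_normal(100_000)
    y_pce = hermevander(xi_mc, deg) @ coef
    print(f"mean PTE ~ {y_pce.mean():.4f}, std ~ {y_pce.std():.4f}")
    print(f"check against direct sampling: {efficiency(xi_mc).mean():.4f}")
    ```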

  5. Stochastic Analysis of the Efficiency of a Wireless Power Transfer System Subject to Antenna Variability and Position Uncertainties

    Directory of Open Access Journals (Sweden)

    Marco Rossi

    2016-07-01

    The efficiency of a wireless power transfer (WPT) system in the radiative near-field is inevitably affected by the variability in the design parameters of the deployed antennas and by uncertainties in their mutual position. Therefore, we propose a stochastic analysis that combines the generalized polynomial chaos (gPC) theory with an efficient model for the interaction between devices in the radiative near-field. This framework enables us to investigate the impact of random effects on the power transfer efficiency (PTE) of a WPT system. More specifically, the WPT system under study consists of a transmitting horn antenna and a receiving textile antenna operating in the Industrial, Scientific and Medical (ISM) band at 2.45 GHz. First, we model the impact of the textile antenna’s variability on the WPT system. Next, we include the position uncertainties of the antennas in the analysis in order to quantify the overall variations in the PTE. The analysis is carried out by means of polynomial-chaos-based macromodels, whereas a Monte Carlo simulation validates the complete technique. It is shown that the proposed approach is very accurate, more flexible and more efficient than a straightforward Monte Carlo analysis, with demonstrated speedup factors up to 2500.

  6. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  7. Hybrid processing of stochastic and subjective uncertainty data

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, J.A. [Sandia National Labs., Albuquerque, NM (United States); Ferson, S. [Applied Biomathematics, Setauket, NY (United States); Ginzburg, L. [State Univ. of New York, Stony Brook, NY (United States)

    1995-11-01

    Uncertainty analyses typically recognize separate stochastic and subjective sources of uncertainty, but do not systematically combine the two, although a large amount of data used in analyses is partly stochastic and partly subjective. We have developed methodology for mathematically combining stochastic and subjective data uncertainty, based on new "hybrid number" approaches. The methodology can be utilized in conjunction with various traditional techniques, such as PRA (probabilistic risk assessment) and risk analysis decision support. Hybrid numbers have been previously examined as a potential method to represent combinations of stochastic and subjective information, but mathematical processing has been impeded by the requirements inherent in the structure of the numbers; e.g., there was no known way to multiply hybrids. In this paper, we demonstrate methods for calculating with hybrid numbers that avoid these difficulties. By formulating a hybrid number as a probability distribution that is only fuzzily known, or alternatively as a random distribution of fuzzy numbers, methods are demonstrated for the full suite of arithmetic operations, permitting complex mathematical calculations. It is shown how information about relative subjectivity (the ratio of subjective to stochastic knowledge about a particular datum) can be incorporated. Techniques are also developed for conveying uncertainty information visually, so that the stochastic and subjective constituents of the uncertainty, as well as the ratio of knowledge about the two, are readily apparent. The techniques demonstrated have the capability to process uncertainty information for independent, uncorrelated data, and for some types of dependent and correlated data. Example applications are suggested, illustrative problems are worked, and graphical results are given.
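
    One way to read the "random distribution of fuzzy numbers" formulation is sketched below: each Monte Carlo draw fixes the stochastic part of a hybrid number, and the subjective part is carried as a triangular fuzzy number processed through interval arithmetic on its alpha-cuts. The distributions, spreads and the product operation are illustrative assumptions, not the paper's worked examples.

    ```python
    # Hedged sketch: hybrid numbers as random draws of fuzzy (triangular)
    # numbers, with fuzzy arithmetic done on alpha-cut intervals.
    import numpy as np

    def alpha_cut(center, spread, alpha):
        # Triangular fuzzy number centered at `center`: interval at level alpha
        half = spread * (1.0 - alpha)
        return center - half, center + half

    def interval_mul(a, b):
        candidates = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
        return min(candidates), max(candidates)

    rng = np.random.default_rng(3)
    alpha = 0.5
    results = []
    for _ in range(10_000):
        x_center = rng.normal(2.0, 0.2)     # stochastic part of hybrid number X
        y_center = rng.lognormal(0.0, 0.1)  # stochastic part of hybrid number Y
        x_cut = alpha_cut(x_center, 0.3, alpha)   # subjective (fuzzy) part
        y_cut = alpha_cut(y_center, 0.2, alpha)
        results.append(interval_mul(x_cut, y_cut))

    lo, hi = np.array(results).T
    print(f"alpha={alpha}: product interval spans roughly "
          f"[{np.percentile(lo, 5):.2f}, {np.percentile(hi, 95):.2f}]")
    ```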

  8. Analysis of Infiltration Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    R. McCurley

    2003-10-27

    The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the "Total System Performance Assessment-License Application Methods and Approach" (BSC 2002 [160146], Section 3.1), as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein; it is based on the use of the models developed in or for "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]). For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper-bound maps.

  9. Nanoparticles: Uncertainty Risk Analysis

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Hansen, Steffen Foss; Baun, Anders

    2012-01-01

    Scientific uncertainty plays a major role in assessing the potential environmental risks of nanoparticles. Moreover, there is uncertainty within fundamental data and information regarding the potential environmental and health risks of nanoparticles, hampering risk assessments based on standard approaches. To date, there have been a number of different approaches to assess uncertainty of environmental risks in general, and some have also been proposed in the case of nanoparticles and nanomaterials. In recent years, others have also proposed that broader assessments of uncertainty are needed in order to handle the complex potential risks of nanoparticles, including more descriptive characterizations of uncertainty. Some of these approaches are presented and discussed herein, and the potential strengths and limitations of these approaches are identified along with further challenges.

  10. Uncertainty Principles and Fourier Analysis

    Indian Academy of Sciences (India)

    meta-theorem in harmonic analysis that can be summarized as follows: A nonzero function and its Fourier transform cannot both be sharply localized." It is the last part of the paragraph that is the raison d'être for the mathematician's interest in uncertainty principles. Another way to express the meta uncertainty principle is: A.
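
    One standard quantitative form of the meta-theorem quoted above is the Heisenberg-Pauli-Weyl inequality; the statement below assumes the Fourier-transform convention indicated in the comment.

    ```latex
    % One standard quantitative form of the meta-theorem: the
    % Heisenberg-Pauli-Weyl inequality, stated with the convention
    % \hat{f}(\xi) = \int f(x) e^{-2\pi i x\xi}\,dx and \|f\|_2 = 1.
    \[
      \left(\int_{\mathbb{R}} (x-a)^2 \lvert f(x)\rvert^2 \, dx\right)
      \left(\int_{\mathbb{R}} (\xi-b)^2 \lvert \hat{f}(\xi)\rvert^2 \, d\xi\right)
      \;\ge\; \frac{1}{16\pi^2}
      \qquad \text{for all } a, b \in \mathbb{R},
    \]
    % with equality only for suitably translated and modulated Gaussians,
    % so f and \hat{f} cannot both be sharply concentrated.
    ```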

  11. Uncertainties in offsite consequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

  12. Dealing with Uncertainties A Guide to Error Analysis

    CERN Document Server

    Drosg, Manfred

    2009-01-01

    Dealing with Uncertainties is an innovative monograph that lays special emphasis on the deductive approach to uncertainties and on the shape of uncertainty distributions. This perspective has the potential for dealing with the uncertainty of a single data point and with sets of data that have different weights. It is shown that the inductive approach that is commonly used to estimate uncertainties is in fact not suitable for these two cases. The approach that is used to understand the nature of uncertainties is novel in that it is completely decoupled from measurements. Uncertainties which are the consequence of modern science provide a measure of confidence both in scientific data and in information in everyday life. Uncorrelated uncertainties and correlated uncertainties are fully covered and the weakness of using statistical weights in regression analysis is discussed. The text is abundantly illustrated with examples and includes more than 150 problems to help the reader master the subject.

  13. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    Science.gov (United States)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of transition from laminar to turbulent flow and in the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading which causes them to deform. Uncertainty associated with the deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model which accounts for the dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as to the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the

  14. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    A modified Monte-Carlo based method is presented which reduces the subjectivity inherent in typical uncertainty approaches (e.g. cut-off thresholds), while using tangible concepts and providing practical outcomes for practitioners. The method compares the model’s uncertainty bands to the uncertainty inherent...

  15. Who plays dice? Subjective uncertainty in deterministic quantum world

    Science.gov (United States)

    Carter, Brandon

    2006-11-01

    Einstein's 1905 recognition that light consists of discrete "quanta" inaugurated the duality (wave versus particle) paradox that was resolved 20 years later by Born's introduction of the probability interpretation on which modern quantum theory is based. Einstein's refusal to abandon the classical notion of deterministic evolution - despite the unqualified success of the new paradigm on a local scale - foreshadowed the restoration of determinism in the attempt to develop a global treatment applicable to cosmology by Everett, who failed however to provide a logically coherent treatment of subjective uncertainty at a local level. This drawback has recently been overcome in an extended formulation allowing deterministic description of a physical universe in which the uncertainty concerns only our own particular local situations, whose probability is prescribed by an appropriate micro-anthropic principle.

  16. Uncertainty analysis in seismic tomography

    Science.gov (United States)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    The velocity field obtained from seismic travel-time tomography depends on several factors, such as regularization, the inversion path and model parameterization. The result also depends strongly on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, in our analysis the uncertainty distribution for manual travel-time picking is asymmetric, which shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomographic code. We used data from geo-engineering and industrial-scale investigations, which were collected by our team from IG PAS.

  17. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  18. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos; Fantke, Peter

    2017-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows one to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes … of main sources of uncertainty and how to quantify them; (3) a presentation of approaches to calculate uncertainty for the final results (propagation); (4) a discussion of how to use uncertainty information and how to take it into account in the interpretation of the results; and finally (5) a discussion …

  19. Probability of flooding: An uncertainty analysis

    NARCIS (Netherlands)

    Slijkhuis, K.A.H.; Frijters, M.P.C.; Cooke, R.M.; Vrouwenvelder, A.C.W.M.

    1998-01-01

    In the Netherlands a new safety approach concerning the flood defences will probably be implemented in the near future. Therefore, an uncertainty analysis is currently being carried out to determine the uncertainty in the probability of flooding. The uncertainty of the probability of flooding could

  20. [A Concept Analysis of Uncertainty in Epilepsy].

    Science.gov (United States)

    Lee, Juna; Lee, Insook

    2017-08-01

    This concept analysis was done to clarify 'uncertainty in epilepsy'. Walker and Avant's methodology guided the analysis. In addition, the concept was compared with uncertainty in other health problems. 'Uncertainty in epilepsy' was defined as being in the condition as seen from the epilepsy experience where cues were difficult to understand because they changed, were in discord with past ones, or they had two or more contradictory values at the same time. Uncertainty in epilepsy is evolved from appraisal of the epilepsy experience. As a result, uncertainty leads epilepsy patients, their family or health care providers to impaired functioning and proactive/passive coping behavior. Epilepsy patients with uncertainty need to be supported by nursing strategies for proactive, rational coping behavior. This achievement has implications for interventions aimed at changing perception of epilepsy patients, their families or health care providers who must deal with uncertainty.

  1. Analysis of structures under uncertainties

    OpenAIRE

    da Cunha, Fabio; Chaves, Charlos; Degenhardt, Richard; Araujo, Franciso Cekio

    2011-01-01

    Theories of uncertainty are applied to a large range of fields, and they are of special interest in engineering because of safety and risk concerns related to human lives. This concern is especially high in aerospace applications, where safety is truly the most critical requirement. The most common approach to dealing with uncertainties due to lack of knowledge and inherent variability in aerospace structures is the application of safety factors, which are design margins against failure. Safe...

  2. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  3. Approach to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  4. Risk analysis methods of the water resources system under uncertainty

    Directory of Open Access Journals (Sweden)

    Zeying GUI, Chenglong ZHANG, Mo Li, Ping GUO

    2015-09-01

    The main characteristic of the water resources system (WRS) is its great complexity and uncertainty, which makes it highly desirable to carry out a risk analysis of the WRS. The natural environment, socio-economic conditions, and the limitations of human cognitive ability are all possible sources of the uncertainties that need to be taken into account in the risk analysis process. In this paper the inherent stochastic uncertainty and the cognitive subjective uncertainty of the WRS are discussed first, from both objective and subjective perspectives. Then the quantitative characterization methods of risk analysis are introduced, including three criteria (reliability, resiliency and vulnerability) and five basic optimization models (the expected risk value model, the conditional value at risk model, the chance-constrained risk model, the model minimizing the probability of risk events, and the multi-objective optimization model). Finally, this paper focuses on the various methods of risk analysis under uncertainty, which are summarized as random, fuzzy and mixed methods. A more comprehensive risk analysis methodology for the WRS is proposed based on a comparison of the advantages, disadvantages and applicable conditions of these three methods. This paper provides decision support for risk analysis for researchers, policy makers and stakeholders of the WRS.
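
    The three criteria named above (reliability, resiliency, vulnerability) are commonly computed from a simulated supply series in the style of Hashimoto-type indices. A minimal hedged sketch with a synthetic series and an assumed demand threshold, not data from the paper:

    ```python
    # Hedged sketch of reliability, resiliency and vulnerability for a
    # synthetic water-supply series; the series and threshold are assumptions.
    import numpy as np

    rng = np.random.default_rng(7)
    supply = rng.normal(100.0, 15.0, size=365)     # simulated daily deliveries
    demand = 95.0                                  # assumed daily requirement
    failure = supply < demand

    reliability = 1.0 - failure.mean()             # fraction of satisfactory days

    # Resiliency: probability that a failure day is followed by recovery
    recoveries = np.sum(failure[:-1] & ~failure[1:])
    resiliency = recoveries / max(failure[:-1].sum(), 1)

    # Vulnerability: average severity (shortfall) during failure days
    vulnerability = np.mean(demand - supply[failure]) if failure.any() else 0.0

    print(f"reliability={reliability:.2f}, resiliency={resiliency:.2f}, "
          f"vulnerability={vulnerability:.1f}")
    ```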

  5. Robustness analysis for real parametric uncertainty

    Science.gov (United States)

    Sideris, Athanasios

    1989-01-01

    Some key results from the literature in the area of robustness analysis for linear feedback systems with structured model uncertainty are reviewed, and some new results are given. Model uncertainty is described as a combination of real uncertain parameters and norm-bounded unmodeled dynamics. Here the focus is on the case of parametric uncertainty. An elementary and unified derivation of the celebrated theorem of Kharitonov and of the Edge Theorem is presented. Next, an algorithmic approach for robustness analysis in the cases of multilinear and polynomic parametric uncertainty (i.e., the closed-loop characteristic polynomial depends multilinearly and polynomially, respectively, on the parameters) is given. The latter cases are the most important from practical considerations. Some novel modifications of this algorithm, which result in a procedure with polynomial-time behavior in the number of uncertain parameters, are outlined. Finally, it is shown how the more general problem of robustness analysis for combined parametric and dynamic (i.e., unmodeled dynamics) uncertainty can be reduced to the case of polynomic parametric uncertainty, and thus be solved by means of the algorithm.
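
    Kharitonov's theorem, reviewed above, reduces robust Hurwitz stability of an interval polynomial with independent coefficient intervals to checking four fixed polynomials. A minimal sketch with made-up coefficient intervals:

    ```python
    # Hedged sketch of a Kharitonov-type robustness check: an interval
    # polynomial is robustly Hurwitz iff its four Kharitonov polynomials are.
    # The coefficient intervals below are illustrative assumptions.
    import numpy as np

    lower = np.array([6.0, 10.0, 5.5, 1.0])   # coefficient lower bounds, powers s^0..s^3
    upper = np.array([7.0, 12.0, 6.5, 1.0])   # coefficient upper bounds

    def kharitonov_polys(lo, hi):
        # The four Kharitonov coefficient patterns (repeating with period 4)
        patterns = ["llhh", "hhll", "lhhl", "hllh"]
        polys = []
        for pat in patterns:
            coeffs = [lo[i] if pat[i % 4] == "l" else hi[i] for i in range(len(lo))]
            polys.append(np.array(coeffs))
        return polys

    def is_hurwitz(ascending_coeffs):
        roots = np.roots(ascending_coeffs[::-1])   # np.roots wants descending powers
        return np.all(roots.real < 0)

    robust = all(is_hurwitz(k) for k in kharitonov_polys(lower, upper))
    print("robustly stable:", robust)
    ```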

  6. Bounding the Failure Probability Range of Polynomial Systems Subject to P-box Uncertainties

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2012-01-01

    This paper proposes a reliability analysis framework for systems subject to multiple design requirements that depend polynomially on the uncertainty. Uncertainty is prescribed by probability boxes, also known as p-boxes, whose distribution functions have free or fixed functional forms. An approach based on the Bernstein expansion of polynomials and optimization is proposed. In particular, we search for the elements of a multi-dimensional p-box that minimize (i.e., the best-case) and maximize (i.e., the worst-case) the probability of inner and outer bounding sets of the failure domain. This technique yields intervals that bound the range of failure probabilities. The offset between this bounding interval and the actual failure probability range can be made arbitrarily tight with additional computational effort.
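
    The Bernstein-expansion idea can be illustrated in a few lines: the Bernstein coefficients of a polynomial on [0, 1] enclose its range, which is the building block used to bound the failure domain. A hedged sketch with an illustrative polynomial, not the paper's requirement functions:

    ```python
    # Hedged sketch: Bernstein coefficients give guaranteed bounds on a
    # polynomial's range over [0, 1]. The example polynomial is an assumption.
    import numpy as np
    from math import comb

    def bernstein_coefficients(a):
        """a[i] is the coefficient of x**i; returns Bernstein coefficients on [0, 1]."""
        n = len(a) - 1
        return np.array([
            sum(comb(k, i) / comb(n, i) * a[i] for i in range(k + 1))
            for k in range(n + 1)
        ])

    a = np.array([0.5, -2.0, 3.0, -1.0])       # p(x) = 0.5 - 2x + 3x^2 - x^3
    b = bernstein_coefficients(a)
    print(f"guaranteed enclosure of p([0,1]): [{b.min():.3f}, {b.max():.3f}]")

    # Sanity check: the true range lies inside the enclosure
    x = np.linspace(0.0, 1.0, 10_001)
    p = np.polyval(a[::-1], x)
    print(f"sampled range:                    [{p.min():.3f}, {p.max():.3f}]")
    ```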

  7. Uncertainty Analysis in the Noise Parameters Estimation

    Directory of Open Access Journals (Sweden)

    Pawlik P.

    2012-07-01

    The paper presents a new approach to uncertainty estimation in modelling acoustic hazards by means of interval arithmetic. In noise parameter estimation, the selection of parameters specifying acoustic wave propagation in an open space, as well as of parameters required in the form of average values, often constitutes a difficult problem. In such cases it is necessary to determine the variance and, strictly related to it, the uncertainty of the model parameters. The interval arithmetic formalism allows the input data uncertainties to be estimated without determining their probability distributions, which is required by other methods of uncertainty assessment. A further problem in acoustic hazard estimation is the lack of exact knowledge of the input parameters. Accordingly, the modelling uncertainty was analyzed as a function of the inaccuracy of the model parameters. To achieve this aim the interval arithmetic formalism, which represents a value and its uncertainty in the form of an interval, was applied. The proposed approach is illustrated by the application of the Dutch RMR SRM method, recommended by the European Union Directive 2002/49/EC, to railway noise modelling.
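
    The interval-arithmetic idea is easy to sketch: each uncertain model parameter is carried as a [lower, upper] interval, and the output interval is guaranteed to contain every possible outcome. The simple point-source attenuation formula and the numbers below are illustrative assumptions, not the RMR/SRM model used in the paper:

    ```python
    # Hedged sketch: model parameters carried as intervals instead of single
    # values, so the result is itself an interval containing all outcomes.
    import math

    def interval_sub(a, b):
        return a[0] - b[1], a[1] - b[0]

    def interval_log_term(r):
        # 20*log10(r) is increasing in r, so endpoints map to endpoints
        return 20.0 * math.log10(r[0]), 20.0 * math.log10(r[1])

    Lw = (98.0, 102.0)            # source sound power level, dB (uncertain)
    r = (45.0, 55.0)              # source-receiver distance, m (uncertain)

    attenuation = interval_log_term(r)
    Lp = interval_sub(interval_sub(Lw, attenuation), (11.0, 11.0))
    print(f"receiver level bounded by [{Lp[0]:.1f}, {Lp[1]:.1f}] dB")
    ```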

  8. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing more important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients occurring in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part or did not differentiate between verification and validation. The traditional approach to uncertainty quantification is based on a "black box" approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The "black box" method mixes numerical errors with all other uncertainties, and it is also not efficient for performing sensitivity analysis. Contrary to the "black box" method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to help uncertainty quantification. By including the time step, and potentially the spatial step, as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time-step and spatial-step sensitivity information reflects global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification. By knowing the relative sensitivity of time and space steps with other interested physical
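
    A minimal sketch of forward sensitivity analysis on a toy decay equation, assuming a single uncertain rate parameter: the sensitivity dy/dk is integrated alongside the state rather than recovered by re-running the code with perturbed inputs. This is a stand-in for the reactor-systems codes discussed above, not their implementation:

    ```python
    # Hedged sketch of forward sensitivity analysis: the sensitivity equation
    # is solved jointly with the state equation for a toy model dy/dt = -k*y.
    import numpy as np
    from scipy.integrate import solve_ivp

    k = 0.8                                    # uncertain rate parameter

    def augmented_rhs(t, z):
        y, s = z                               # state and its sensitivity dy/dk
        dy = -k * y
        ds = -y - k * s                        # d/dt (dy/dk) = df/dy * s + df/dk
        return [dy, ds]

    sol = solve_ivp(augmented_rhs, (0.0, 5.0), [1.0, 0.0])
    y_end, s_end = sol.y[:, -1]
    print(f"y(5) = {y_end:.4f}, dy/dk at t=5 = {s_end:.4f}")
    print(f"analytic check: {np.exp(-k*5):.4f}, {-5*np.exp(-k*5):.4f}")
    ```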

  9. Uncertainty Analysis for a Jet Flap Airfoil

    Science.gov (United States)

    Green, Lawrence L.; Cruz, Josue

    2006-01-01

    An analysis of variance (ANOVA) study was performed to quantify the potential uncertainties of lift and pitching moment coefficient calculations from a computational fluid dynamics code, relative to an experiment, for a jet flap airfoil configuration. Uncertainties due to a number of factors including grid density, angle of attack and jet flap blowing coefficient were examined. The ANOVA software produced a numerical model of the input coefficient data, as functions of the selected factors, to a user-specified order (linear, 2-factor interaction, quadratic, or cubic). Residuals between the model and actual data were also produced at each of the input conditions, and uncertainty confidence intervals (in the form of Least Significant Differences or LSD) for experimental, computational, and combined experimental / computational data sets were computed. The LSD bars indicate the smallest resolvable differences in the functional values (lift or pitching moment coefficient) attributable solely to changes in independent variable, given just the input data points from selected data sets. The software also provided a collection of diagnostics which evaluate the suitability of the input data set for use within the ANOVA process, and which examine the behavior of the resultant data, possibly suggesting transformations which should be applied to the data to reduce the LSD. The results illustrate some of the key features of, and results from, the uncertainty analysis studies, including the use of both numerical (continuous) and categorical (discrete) factors, the effects of the number and range of the input data points, and the effects of the number of factors considered simultaneously.
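
    The factor-model fitting at the heart of such an ANOVA study can be sketched with standard statistical tooling. The small synthetic data set below (grid level and blowing coefficient as factors, lift coefficient as response) is an illustrative assumption, not the study's CFD or experimental data:

    ```python
    # Hedged sketch of a two-factor ANOVA with interaction on synthetic
    # lift-coefficient data (continuous and categorical factors mixed).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(5)
    grid = np.repeat(["coarse", "medium", "fine"], 8)
    c_mu = np.tile(np.repeat([0.00, 0.05, 0.10, 0.15], 2), 3)   # blowing coefficient
    cl = (0.6 + 2.0 * c_mu
          + np.where(grid == "coarse", -0.02, 0.0)
          + rng.normal(0.0, 0.01, grid.size))                   # synthetic lift coefficient

    df = pd.DataFrame({"grid": grid, "c_mu": c_mu, "cl": cl})
    fit = smf.ols("cl ~ C(grid) * c_mu", data=df).fit()         # linear model with interaction
    print(anova_lm(fit, typ=2))                                 # which factors matter
    ```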

  10. Error Analysis of CM Data Products Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  11. Parameter Uncertainty for Repository Thermal Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Greenberg, Harris [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dupont, Mark [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-01

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  12. Urban drainage models simplifying uncertainty analysis for practitioners

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    A modified Monte-Carlo based method is presented that reduces the subjectivity inherent in typical uncertainty approaches (e.g. cut-off thresholds), while using tangible concepts and providing practical outcomes for practitioners. The method compares the model's uncertainty bands to the uncertainty inherent...

  13. LCA data quality: sensitivity and uncertainty analysis.

    Science.gov (United States)

    Guo, M; Murphy, R J

    2012-10-01

    Life cycle assessment (LCA) data quality issues were investigated by using case studies on products from starch-polyvinyl alcohol based biopolymers and petrochemical alternatives. The time horizon chosen for the characterization models was shown to be an important sensitive parameter for the environmental profiles of all the polymers. In the global warming potential and the toxicity potential categories the comparison between biopolymers and petrochemical counterparts altered as the time horizon extended from 20 years to infinite time. These case studies demonstrated that the use of a single time horizon provides only one perspective on the LCA outcomes, which could introduce an inadvertent bias into LCA outcomes, especially in toxicity impact categories; dynamic LCA characterization models with varying time horizons are therefore recommended as a measure of the robustness of LCAs, especially comparative assessments. This study also presents an approach to integrate statistical methods into LCA models for analyzing uncertainty in industrial and computer-simulated datasets. We calibrated probabilities for the LCA outcomes for biopolymer products arising from uncertainty in the inventory and from data variation characteristics; this has enabled us to assign confidence to the LCIA outcomes in specific impact categories for the biopolymer vs. petrochemical polymer comparisons undertaken. Uncertainty combined with the sensitivity analysis carried out in this study has led to a transparent increase in confidence in the LCA findings. We conclude that LCAs lacking explicit interpretation of the degree of uncertainty and sensitivities are of limited value as robust evidence for decision making or comparative assertions. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  15. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  16. Propagation of nuclear data Uncertainties for PWR core analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cabellos, O.; Castro, E.; Ahnert, C.; Holgado, C. [Dept. of Nuclear Engineering, Universidad Politecnica de Madrid, Madrid (Spain)

    2014-06-15

    An uncertainty propagation methodology based on the Monte Carlo method is applied to PWR nuclear design analysis to assess the impact of nuclear data uncertainties. The importance of the nuclear data uncertainties for 235,238U, 239Pu, and the thermal scattering library for hydrogen in water is analyzed. This uncertainty analysis is compared with the design and acceptance criteria to assure the adequacy of bounding estimates in safety margins.

  17. PROPAGATION OF NUCLEAR DATA UNCERTAINTIES FOR PWR CORE ANALYSIS

    Directory of Open Access Journals (Sweden)

    O. CABELLOS

    2014-06-01

    An uncertainty propagation methodology based on the Monte Carlo method is applied to PWR nuclear design analysis to assess the impact of nuclear data uncertainties. The importance of the nuclear data uncertainties for 235,238U, 239Pu, and the thermal scattering library for hydrogen in water is analyzed. This uncertainty analysis is compared with the design and acceptance criteria to assure the adequacy of bounding estimates in safety margins.

  18. The Essential Uncertainty of Thinking: Education and Subject in John Dewey

    Science.gov (United States)

    D'Agnese, Vasco

    2017-01-01

    In this paper, I analyse the Deweyan account of thinking and subject and discuss the educational consequences that follow from such an account. I argue that despite the grouping of thinking and reflective thought that has largely appeared in the interpretation of Deweyan work, Dewey discloses an inescapable uncertainty at the core of human…

  19. Representation of analysis results involving aleatory and epistemic uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean (ProStat, Mesa, AZ); Helton, Jon Craig (Arizona State University, Tempe, AZ); Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
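
    The "family of CDFs" picture described above is often produced with a double-loop (nested) sampling scheme: the outer loop samples an epistemically uncertain quantity, the inner loop propagates aleatory variability. A hedged sketch with a toy response and assumed ranges, not the report's formalism for possibility or evidence theory:

    ```python
    # Hedged sketch of double-loop sampling: each epistemic value produces one
    # aleatory exceedance probability, and the family of results spans a range.
    import numpy as np

    rng = np.random.default_rng(11)
    epistemic_means = np.linspace(0.9, 1.1, 21)      # interval of plausible means

    threshold = 2.5
    exceedance = []
    for mu in epistemic_means:                       # outer (epistemic) loop
        load = rng.normal(mu, 0.2, size=20_000)      # inner (aleatory) loop
        response = load ** 2 + 0.5 * load            # toy system response
        exceedance.append(np.mean(response > threshold))

    print(f"P(response > {threshold}) lies between "
          f"{min(exceedance):.3f} and {max(exceedance):.3f} over the epistemic family")
    ```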

  20. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    Energy Technology Data Exchange (ETDEWEB)

    Valdez, Lucas M. [Los Alamos National Laboratory

    2012-07-26

    This paper is intended to provide guidance on how to prepare an uncertainty analysis of a dimensional inspection process through the use of an uncertainty budget analysis. The uncertainty analysis follows the same methodology as the ISO GUM standard for calibration and testing. A specific distinction is made between how Type A and Type B uncertainty analyses are used in general and in specific processes. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimates for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
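
    A minimal GUM-style budget of the kind described above combines a Type A component (statistics of repeated readings) with a Type B component (an instrument specification treated as a rectangular distribution) in quadrature and expands with a coverage factor. The readings and specification below are illustrative assumptions:

    ```python
    # Hedged sketch of a minimal GUM-style uncertainty budget: Type A and
    # Type B components combined in quadrature and expanded with k = 2.
    import numpy as np

    readings = np.array([25.012, 25.009, 25.014, 25.011, 25.010, 25.013])  # mm

    u_typeA = readings.std(ddof=1) / np.sqrt(readings.size)   # std. uncertainty of the mean
    half_width = 0.002                                         # mm, from instrument spec
    u_typeB = half_width / np.sqrt(3.0)                        # rectangular distribution

    u_combined = np.sqrt(u_typeA ** 2 + u_typeB ** 2)
    U_expanded = 2.0 * u_combined                              # coverage factor k = 2
    print(f"result: {readings.mean():.4f} mm +/- {U_expanded:.4f} mm (k=2)")
    ```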

  1. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    Science.gov (United States)

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  2. mu analysis with real parametric uncertainty

    Science.gov (United States)

    Young, Peter M.; Newlin, Matthew P.; Doyle, John C.

    1991-01-01

    The authors give a broad overview, from a LFT (linear fractional transformation)/mu perspective, of some of the theoretical and practical issues associated with robustness in the presence of real parametric uncertainty, with a focus on computation. Recent results on the properties of mu in the mixed case are reviewed, including issues of NP completeness, continuity, computation of bounds, the equivalence of mu and its bounds, and some direct comparisons with Kharitonov-type analysis methods. In addition, some advances in the computational aspects of the problem, including a novel branch and bound algorithm, are briefly presented together with numerical results. The results suggest that while the mixed mu problem may have inherently combinatoric worst-case behavior, practical algorithms with modest computational requirements can be developed for problems of medium size (less than 100 parameters) that are of engineering interest.

  3. An uncertainty analysis of wildfire modeling [Chapter 13]

    Science.gov (United States)

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  4. Uncertainty analysis of doses from ingestion of plutonium and americium.

    Science.gov (United States)

    Puncher, M; Harrison, J D

    2012-02-01

    Uncertainty analyses have been performed on the biokinetic model for americium currently used by the International Commission on Radiological Protection (ICRP), and the model for plutonium recently derived by Leggett, considering acute intakes by ingestion by adult members of the public. The analyses calculated distributions of doses per unit intake. Those parameters having the greatest impact on prospective doses were identified by sensitivity analysis; the most important were the fraction absorbed from the alimentary tract, f(1), and rates of uptake from blood to bone surfaces. Probability distributions were selected based on the observed distribution of plutonium and americium in human subjects where possible; the distributions for f(1) reflected uncertainty on the average value of this parameter for non-specified plutonium and americium compounds ingested by adult members of the public. The calculated distributions of effective doses for ingested (239)Pu and (241)Am were well described by log-normal distributions, with doses varying by around a factor of 3 above and below the central values; the distributions contain the current ICRP Publication 67 dose coefficients for ingestion of (239)Pu and (241)Am by adult members of the public. Uncertainty on f(1) values had the greatest impact on doses, particularly effective dose. It is concluded that: (1) more precise data on f(1) values would have a greater effect in reducing uncertainties on doses from ingested (239)Pu and (241)Am, than reducing uncertainty on other model parameter values and (2) the results support the dose coefficients (Sv Bq(-1) intake) derived by ICRP for ingestion of (239)Pu and (241)Am by adult members of the public.

  5. Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff

    OpenAIRE

    A. P. Jacquin

    2010-01-01

    This study presents the analysis of predictive uncertainty of a conceptual type snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow dominated catchment in the Chilean Andes is used as a case study. Predictive uncertainty arising from para...

  6. Methodological considerations with data uncertainty in road safety analysis.

    Science.gov (United States)

    Schlögl, Matthias; Stütz, Rainer

    2017-02-16

    The analysis of potential influencing factors that affect the likelihood of road accident occurrence has been of major interest for safety researchers throughout recent decades. Even though steady methodological progress was made over the years, several impediments pertaining to the statistical analysis of crash data remain. While issues related to methodological approaches have been subject to constructive discussion, uncertainties inherent to the most fundamental part of any analysis have been widely neglected: data. This paper scrutinizes data from various sources that are commonly used in road safety studies with respect to their actual suitability for applications in this area. Issues related to spatial and temporal aspects of data uncertainty are pointed out and their implications for road safety analysis are discussed in detail. These general methodological considerations are illustrated with example data from Austria, providing suggestions and methods for how to overcome these obstacles. Considering these aspects is of major importance for expediting further advances in road safety data analysis and thus for increasing road safety. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Loss, uncertainty, or acceptance: subjective experience of changes to fertility after breast cancer.

    Science.gov (United States)

    Perz, J; Ussher, J; Gilbert, E

    2014-07-01

    This qualitative study examines the subjective experience of infertility in a large sample of Australian women with breast cancer. Participants were 1830 women, average age 54, who responded to an email invitation to complete an online survey on sexual well-being and fertility concerns after breast cancer. 24.6% (n = 452) reported that cancer had affected their fertility; 21.3% (n = 391) did not know their fertility status. In thematic analysis of open-ended responses provided by 381 women about changes to fertility status, reactions to infertility, and experiences of information and interventions to assist fertility, five themes were identified: 'Negative responses to infertility and early menopause'; 'Sexual changes associated with menopause and infertility'; 'Uncertainty and anxiety about fertility status'; 'Information and fertility preservation'; 'Acceptance of the end of fertility'. These findings confirm previous reports that infertility and premature menopause are a significant cause of anxiety for many women with breast cancer. However, some women closer to natural menopause, or who had completed their families, reported acceptance of changed fertility status. Accounts of deficits in information provision and fertility counselling suggest an urgent need for accessible and comprehensive information about fertility and cancer to be developed and evaluated, as well as education and training of health professionals in addressing fertility concerns following cancer. © 2013 John Wiley & Sons Ltd.

  8. Uncertainties.

    Science.gov (United States)

    Dalla Chiara, Maria Luisa

    2010-09-01

    In contemporary science uncertainty is often represented as an intrinsic feature of natural and of human phenomena. As an example we need only think of two important conceptual revolutions that occurred in physics and logic during the first half of the twentieth century: (1) the discovery of Heisenberg's uncertainty principle in quantum mechanics; (2) the emergence of many-valued logical reasoning, which gave rise to so-called 'fuzzy thinking'. I discuss the possibility of applying the notions of uncertainty, developed in the framework of quantum mechanics, quantum information and fuzzy logics, to some problems of political and social sciences.

  9. Cultural-psychological and clinical perspectives of research on phenomena of subjective uncertainty and ambiguity

    Directory of Open Access Journals (Sweden)

    Sokolova, Elena T.

    2013-06-01

    The article analyzes certain socio-cultural and personal predispositions that determine the modern diversity of manifestations of subjective uncertainty and ambiguity. It stresses that the creation of a ‘realistic’ clinical psychology (in the sense of A.R. Luria) requires retracing the relations between the resourceful and the psychopathological aspects of the ambiguity phenomenon and the cultural environment, with its destructive ideals and mythologems, manipulative media technologies and all-pervading idea of ‘deconstruction’. Methods for modeling the experience of ambiguity in experimental settings, in pathopsychological examination and in projective psychological diagnostics are compared. Arguments are put forward for interpreting deficient manifestations of subjective uncertainty as a criterion for diagnosing the severity of personality disorder.

  10. Dealing with Uncertainties A Guide to Error Analysis

    CERN Document Server

    Drosg, Manfred

    2007-01-01

    Dealing with Uncertainties proposes and explains a new approach for the analysis of uncertainties. Firstly, it is shown that uncertainties are the consequence of modern science rather than of measurements. Secondly, it stresses the importance of the deductive approach to uncertainties. This perspective has the potential to deal with the uncertainty of a single data point and with data in a set having differing weights; neither case can be handled by the inductive approach that is usually taken. This innovative monograph also fully covers both uncorrelated and correlated uncertainties. The weakness of using statistical weights in regression analysis is discussed. Abundant examples are given for correlation in and between data sets and for the feedback of uncertainties on experiment design.

  11. Uncertainty Analysis of Historical Hurricane Data

    Science.gov (United States)

    Green, Lawrence L.

    2007-01-01

    An analysis of variance (ANOVA) study was conducted for historical hurricane data dating back to 1851, obtained from the U.S. Department of Commerce National Oceanic and Atmospheric Administration (NOAA). The data set was chosen because it is a large, publicly available collection of information exhibiting great variability, which has made forecasting future states from current and previous states difficult. The availability of substantial, high-fidelity validation data, however, made for an excellent uncertainty assessment study. Several factors (independent variables) that could potentially influence the track and intensity of the storms were identified from the data set. The values of these factors, along with the values of the responses of interest (dependent variables), were extracted from the database and provided to a commercial software package for processing via the ANOVA technique. The primary goal of the study was to document the ANOVA modeling uncertainty and predictive errors in making predictions about hurricane location and intensity 24 to 120 hours beyond known conditions, as reported by the data set. A secondary goal was to expose the ANOVA technique to a broader community within NASA. The independent factors considered to influence the hurricane track included the current and starting longitudes and latitudes (measured in degrees), the current and starting maximum sustained wind speeds (measured in knots), the storm starting date, its duration since first appearance, and the year fraction of each reading, the last three all measured in years. The year fraction and starting date were included in order to attempt to account for long duration cyclic behaviors, such as seasonal weather patterns, and years in which the sea or atmosphere were unusually warm or cold. The effect of short duration weather patterns and ocean conditions could not be examined with the current data set. The responses analyzed were the storm
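    A minimal sketch of this kind of ANOVA screening is shown below. The column names and the synthetic data merely stand in for the NOAA best-track records, which are not reproduced here, so the factors and the 24-hour response are assumptions made purely for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in for the historical hurricane records; column names are assumed.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "lat0": rng.uniform(10, 35, n),         # current latitude, degrees
    "lon0": rng.uniform(-90, -20, n),       # current longitude, degrees
    "wind0": rng.uniform(30, 140, n),       # current max sustained wind, knots
    "year_frac": rng.uniform(0.5, 0.95, n), # fraction of the year
})
# Toy response: latitude 24 h later, with noise standing in for unmodelled physics.
df["lat24"] = df["lat0"] + 0.02 * df["wind0"] + rng.normal(0, 1.5, n)

model = smf.ols("lat24 ~ lat0 + lon0 + wind0 + year_frac", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))       # variance attributed to each factor
print("residual std (a proxy for predictive error):", round(model.resid.std(), 2))
```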

  12. Sustainable Process Design under uncertainty analysis: targeting environmental indicators

    DEFF Research Database (Denmark)

    L. Gargalo, Carina; Gani, Rafiqul

    2015-01-01

    This study focuses on uncertainty analysis of environmental indicators used to support sustainable process design efforts. To this end, the Life Cycle Assessment methodology is extended with a comprehensive uncertainty analysis to propagate the uncertainties in input LCA data to the environmental indicators. The resulting uncertainties in the environmental indicators are then represented by empirical cumulative distribution functions, which provide a probabilistic basis for the interpretation of the indicators. In order to highlight the main features of the extended LCA, the production of biodiesel from algae biomass is used as a case study. The results indicate that there are considerable uncertainties in the calculated environmental indicators, as revealed by the CDFs. The underlying sources of these uncertainties are indeed the significant variation in the databases used for the LCA analysis...
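    The propagation step described above can be illustrated with a small Monte Carlo sketch. The three inputs, their ranges and the one-line indicator below are invented for illustration and are not the study's LCA inventory or results.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Illustrative uncertain LCA inputs (assumed ranges, not real inventory data).
yield_algae    = rng.triangular(20.0, 25.0, 32.0, n)   # t biomass / ha / yr
energy_input   = rng.normal(9.0, 1.5, n)               # GJ / t biodiesel
ef_electricity = rng.uniform(0.4, 0.7, n)              # kg CO2-eq / kWh

# Toy global-warming indicator per tonne of biodiesel (kg CO2-eq).
gwp = energy_input * 277.8 * ef_electricity + 2000.0 / yield_algae

# Empirical cumulative distribution function (ECDF) of the indicator.
x = np.sort(gwp)                   # ECDF support
cdf = np.arange(1, n + 1) / n      # ECDF values at the sorted points
print("ECDF spans %.0f to %.0f kg CO2-eq/t" % (x[0], x[-1]))
print("median (where ECDF = 0.5): %.0f" % x[np.searchsorted(cdf, 0.5)])
print("P(GWP <= 1500 kg CO2-eq/t): %.2f" % np.mean(gwp <= 1500.0))
```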

  13. Uncertainty analysis in integrated assessment: the users' perspective

    NARCIS (Netherlands)

    Gabbert, S.G.M.; Ittersum, van M.K.; Kroeze, C.; Stalpers, S.I.P.; Ewert, F.; Alkan Olsson, J.

    2010-01-01

    Integrated Assessment (IA) models aim at providing information- and decision-support to complex problems. This paper argues that uncertainty analysis in IA models should be user-driven in order to strengthen science–policy interaction. We suggest an approach to uncertainty analysis that starts with

  14. Uncertainty analysis of energy consumption in dwellings

    Energy Technology Data Exchange (ETDEWEB)

    Pettersen, Trine Dyrstad

    1997-12-31

    This thesis presents a comprehensive study of an energy estimation model that can be used to examine the uncertainty of predicted energy consumption in a dwelling. The variation and uncertainty of input parameters due to the outdoor climate, the building construction and the inhabitants are studied as a basis for further energy evaluations. The variations in energy consumption observed among nominally similar dwellings are also investigated in order to verify the simulated energy consumption. The main topics are (1) a study of expected variations and uncertainties in both input parameters used in energy consumption calculations and the energy consumption in the dwelling, (2) the development and evaluation of a simplified energy calculation model that considers uncertainties due to the input parameters, (3) an evaluation of the influence of the uncertain parameters on the total variation so that the most important parameters can be identified, and (4) the recommendation of a simplified procedure for treating uncertainties or possible deviations from average conditions. 90 refs., 182 figs., 73 tabs.

  15. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that ``The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value.`` Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  16. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  17. On the logistic equation subject to uncertainties in the environmental carrying capacity and initial population density

    Science.gov (United States)

    Dorini, F. A.; Cecconello, M. S.; Dorini, L. B.

    2016-04-01

    It is recognized that handling uncertainty is essential to obtain more reliable results in modeling and computer simulation. This paper aims to discuss the logistic equation subject to uncertainties in two parameters: the environmental carrying capacity, K, and the initial population density, N0. We first provide the closed-form results for the first probability density function of time-population density, N(t), and its inflection point, t*. We then use the Maximum Entropy Principle to determine both K and N0 density functions, treating such parameters as independent random variables and considering fluctuations of their values for a situation that commonly occurs in practice. Finally, closed-form results for the density functions and statistical moments of N(t), for a fixed t > 0, and of t* are provided, considering the uniform distribution case. We carried out numerical experiments to validate the theoretical results and compared them against that obtained using Monte Carlo simulation.
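    A Monte Carlo check of results of this kind is easy to sketch. The growth rate and the uniform supports below are arbitrary illustrative values, not those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
r = 0.5                                   # intrinsic growth rate (fixed, illustrative)

# Independent uniform uncertainties on carrying capacity K and initial density N0.
K  = rng.uniform(80.0, 120.0, n)
N0 = rng.uniform(5.0, 15.0, n)

def logistic(t, r, K, N0):
    """Closed-form solution of dN/dt = r N (1 - N/K)."""
    e = np.exp(r * t)
    return K * N0 * e / (K + N0 * (e - 1.0))

t = 5.0
N_t = logistic(t, r, K, N0)
t_star = np.log((K - N0) / N0) / r        # inflection point, where N = K/2

print("E[N(5)] = %.2f, Var[N(5)] = %.2f" % (N_t.mean(), N_t.var()))
print("E[t*]   = %.2f, 90%% interval (%.2f, %.2f)"
      % (t_star.mean(), *np.percentile(t_star, [5, 95])))
```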

  18. Uncertainty Analysis in Space Radiation Protection

    Science.gov (United States)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits; therefore the uncertainties in risk projections become a major safety concern, and methodologies used for ground-based work are not deemed to be sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDFs) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age and gender specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because only a limited role is expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improving radiation safety for space missions.

  19. Contributions to Physics-Based Aeroservoelastic Uncertainty Analysis

    Science.gov (United States)

    Wu, Sang

    The thesis presents the development of a new fully-integrated, MATLAB based simulation capability for aeroservoelastic (ASE) uncertainty analysis that accounts for uncertainties in all disciplines as well as discipline interactions. This new capability allows probabilistic studies of complex configurations at a scope and depth not previously possible. Several statistical tools and methods have been integrated into the capability to guide tasks such as parameter prioritization, uncertainty reduction, and risk mitigation. (Abstract shortened by ProQuest.)

  20. Uncertainty Analysis of the Estimated Risk in Formal Safety Assessment

    Directory of Open Access Journals (Sweden)

    Molin Sun

    2018-01-01

    An uncertainty analysis is required to be carried out in formal safety assessment (FSA) by the International Maritime Organization. The purpose of this article is to introduce the uncertainty analysis technique into the FSA process. Based on the uncertainty identification of input parameters, probability and possibility distributions are used to model the aleatory and epistemic uncertainties, respectively. An approach which combines the Monte Carlo random sampling of probability distribution functions with the α-cuts for fuzzy calculus is proposed to propagate the uncertainties. One output of the FSA process is societal risk (SR), which can be evaluated in the two-dimensional frequency–fatality (FN) diagram. Thus, the confidence-level-based SR is presented to represent the uncertainty of SR in two dimensions. In addition, a method for time window selection is proposed to estimate the magnitude of uncertainties, which is an important aspect of modeling uncertainties. Finally, a case study is carried out on an FSA study on cruise ships. The results show that the uncertainty analysis of SR generates a two-dimensional area for a certain degree of confidence in the FN diagram rather than a single FN curve, which provides more information to authorities to produce effective risk control measures.
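    The hybrid propagation described above can be sketched roughly as follows. The accident-frequency distribution, the triangular fuzzy number and the one-line risk model are invented for illustration and are not the article's FSA inputs.

```python
import numpy as np

rng = np.random.default_rng(3)
n_mc = 5_000

# Aleatory input sampled by Monte Carlo: accident frequency per ship-year (assumed lognormal).
freq = rng.lognormal(mean=np.log(2e-3), sigma=0.5, size=n_mc)

# Epistemic input as a triangular fuzzy number (a, m, b): fatalities per accident (assumed).
a, m, b = 1.0, 5.0, 20.0

def alpha_cut(alpha):
    """Interval of the triangular fuzzy number at membership level alpha."""
    return a + alpha * (m - a), b - alpha * (b - m)

# For each alpha-cut, interval arithmetic on the monotone model SR = freq * fatalities.
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(alpha)
    sr_lo, sr_hi = freq * lo, freq * hi      # lower/upper bound samples of societal risk
    print("alpha=%.1f: mean SR in [%.2e, %.2e] fatalities per ship-year"
          % (alpha, sr_lo.mean(), sr_hi.mean()))
```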

  1. New Programming Environments for Uncertainty Analysis

    Science.gov (United States)

    Hill, M. C.; Poeter, E. P.; Banta, E. R.; Christensen, S.; Cooley, R. L.; Ely, D. M.; Babendreier, J.; Leavesley, G.; Tonkin, M.; Julich, R.

    2005-12-01

    We live in a world of faster computers, better GUIs and visualization technology, increasing international cooperation made possible by new digital infrastructure, new agreements between US federal agencies (such as ISCMEM), new European Union programs (such as Harmoniqua), and greater collaboration between US university scientists through CUAHSI. These changes provide new resources for tackling the difficult job of quantifying how well our models perform. This talk introduces new programming environments that take advantage of these new developments and will change the paradigm of how we develop methods for uncertainty evaluation. For example, the programming environments provided by the COSU API, the JUPITER API, and the Sensitivity/Optimization Toolbox provide enormous opportunities for faster and more meaningful evaluation of uncertainties. Instead of waiting years for ideas and theories to be compared in the complex circumstances of interest to resource managers, these new programming environments will expedite the process. In the new paradigm, unproductive ideas and theories will be revealed more quickly, and productive ideas and theories will more quickly be used to address our increasingly difficult water resources problems. As examples, two ideas in JUPITER API applications are presented: uncertainty correction factors that account for system complexities not represented in models, and PPR and OPR statistics used to identify new data needed to reduce prediction uncertainty.

  2. Reservoir Sedimentation Based on Uncertainty Analysis

    Directory of Open Access Journals (Sweden)

    Farhad Imanshoar

    2014-01-01

    Reservoir sedimentation can result in loss of much needed reservoir storage capacity, reducing the useful life of dams. Thus, sufficient sediment storage capacity should be provided at the reservoir design stage to ensure that sediment accumulation will not impair the functioning of the reservoir during the useful operational-economic life of the project. However, an important issue to consider when estimating reservoir sedimentation and accumulation is the uncertainty involved in reservoir sedimentation. In this paper, the basic factors influencing the density of sediments deposited in reservoirs are discussed, and the uncertainties in reservoir sedimentation are determined using the Delta method. Further, Kenny Reservoir in the White River Basin in northwestern Colorado was selected to determine the density of deposits in the reservoir and the coefficient of variation. The results of this investigation indicate that, by using the Delta method in the case of Kenny Reservoir, the uncertainty regarding accumulated sediment density, expressed by the coefficient of variation for a period of 50 years of reservoir operation, could be reduced to about 10%. The results suggest that the Delta method is a practical approach for dead-storage planning that accounts for the uncertainties associated with reservoir sedimentation.
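    The Delta method itself is just first-order propagation of input variances through the density relation. The sketch below uses a Lara-Pemberton-type consolidation formula with placeholder means and standard deviations; the numbers are not Kenny Reservoir data.

```python
import numpy as np

# Illustrative Lara-Pemberton-type relation for average deposit density after T years:
#   W(T) = W0 + 0.4343 * K * (T/(T-1) * ln(T) - 1)
# with W0 the initial unit weight and K a consolidation coefficient.
T = 50.0
def density(W0, K):
    return W0 + 0.4343 * K * (T / (T - 1.0) * np.log(T) - 1.0)

# Assumed means and standard deviations of the uncertain parameters (placeholders).
mu = {"W0": 1041.0, "K": 91.0}     # kg/m^3
sd = {"W0": 50.0,   "K": 15.0}

# Delta method: Var(W) ~= (dW/dW0)^2 Var(W0) + (dW/dK)^2 Var(K), inputs independent.
dW_dW0 = 1.0
dW_dK = 0.4343 * (T / (T - 1.0) * np.log(T) - 1.0)
mean_W = density(mu["W0"], mu["K"])
var_W = dW_dW0**2 * sd["W0"]**2 + dW_dK**2 * sd["K"]**2

cv = np.sqrt(var_W) / mean_W
print("mean density %.0f kg/m^3, coefficient of variation %.1f%%" % (mean_W, 100 * cv))
```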

  3. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  4. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  5. Uncertainty Analysis of Air Radiation for Lunar Return Shock Layers

    Science.gov (United States)

    Kleb, Bil; Johnston, Christopher O.

    2008-01-01

    By leveraging a new uncertainty markup technique, two risk analysis methods are used to compute the uncertainty of lunar-return shock layer radiation predicted by the High temperature Aerothermodynamic Radiation Algorithm (HARA). The effects of epistemic uncertainty, or uncertainty due to a lack of knowledge, is considered for the following modeling parameters: atomic line oscillator strengths, atomic line Stark broadening widths, atomic photoionization cross sections, negative ion photodetachment cross sections, molecular bands oscillator strengths, and electron impact excitation rates. First, a simplified shock layer problem consisting of two constant-property equilibrium layers is considered. The results of this simplified problem show that the atomic nitrogen oscillator strengths and Stark broadening widths in both the vacuum ultraviolet and infrared spectral regions, along with the negative ion continuum, are the dominant uncertainty contributors. Next, three variable property stagnation-line shock layer cases are analyzed: a typical lunar return case and two Fire II cases. For the near-equilibrium lunar return and Fire 1643-second cases, the resulting uncertainties are very similar to the simplified case. Conversely, the relatively nonequilibrium 1636-second case shows significantly larger influence from electron impact excitation rates of both atoms and molecules. For all cases, the total uncertainty in radiative heat flux to the wall due to epistemic uncertainty in modeling parameters is 30% as opposed to the erroneously-small uncertainty levels (plus or minus 6%) found when treating model parameter uncertainties as aleatory (due to chance) instead of epistemic (due to lack of knowledge).

  6. Uncertainty Analysis of Consequence Management (CM) Data Products.

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Fournier, Sean Donovan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schetnan, Richard Reed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Simpson, Matthew D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Okada, Colin E. [Remote Sensing Lab. (RSL), Nellis AFB, Las Vegas, NV (United States); Bingham, Avery A. [Remote Sensing Lab. (RSL), Nellis AFB, Las Vegas, NV (United States)

    2018-01-01

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.

  7. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, which included pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while kinf decreases. This increase in the uncertainty is due to the increased sensitivity of the largest contributor to the uncertainty, namely the neutron capture reaction 238U(n, γ), caused by Doppler broadening. In addition, three types of fuel material compositions (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in uncertainty in kinf was observed for the case of MOX fuel. The increase in uncertainty of kinf in MOX fuel was nearly twice the corresponding value in UOX fuel. The reactions of 238U, mainly inelastic scattering (n, n′), contributed the most to the uncertainties in the MOX fuel, shifting the neutron spectrum to higher energy compared to the UOX fuel.

  8. Reservoir Sedimentation Based on Uncertainty Analysis

    OpenAIRE

    Farhad Imanshoar; Afshin Jahangirzadeh; Hossein Basser; Shatirah Akib; Babak Kamali; Tabatabaei, Mohammad Reza M.; Masoud Kakouei

    2013-01-01

    Reservoir sedimentation can result in loss of much needed reservoir storage capacity, reducing the useful life of dams. Thus, sufficient sediment storage capacity should be provided for the reservoir design stage to ensure that sediment accumulation will not impair the functioning of the reservoir during the useful operational-economic life of the project. However, an important issue to consider when estimating reservoir sedimentation and accumulation is the uncertainty involved in reservoir ...

  9. Statistical analysis of hydroclimatic time series: Uncertainty and insights

    Science.gov (United States)

    Koutsoyiannis, Demetris; Montanari, Alberto

    2007-05-01

    Today, hydrologic research and modeling depends largely on climatological inputs, whose physical and statistical behavior are the subject of many debates in the scientific community. A relevant ongoing discussion is focused on long-term persistence (LTP), a natural behavior identified in several studies of instrumental and proxy hydroclimatic time series, which, nevertheless, is neglected in some climatological studies. LTP may reflect a long-term variability of several factors and thus can support a more complete physical understanding and uncertainty characterization of climate. The implications of LTP in hydroclimatic research, especially in statistical questions and problems, may be substantial but appear to be not fully understood or recognized. To offer insights on these implications, we demonstrate by using analytical methods that the characteristics of temperature series, which appear to be compatible with the LTP hypothesis, imply a dramatic increase of uncertainty in statistical estimation and reduction of significance in statistical testing, in comparison with classical statistics. Therefore we maintain that statistical analysis in hydroclimatic research should be revisited in order not to derive misleading results and simultaneously that merely statistical arguments do not suffice to verify or falsify the LTP (or another) climatic hypothesis.
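    The inflation of estimation uncertainty under LTP can be made concrete with the standard result for a stationary Hurst-Kolmogorov process: the standard error of the sample mean is sigma * n**(H - 1) rather than the classical sigma / sqrt(n). The short sketch below only illustrates that scaling with arbitrary numbers; it is not a reproduction of the paper's analysis.

```python
import numpy as np

def se_mean_classical(sigma, n):
    """Standard error of the sample mean for independent observations."""
    return sigma / np.sqrt(n)

def se_mean_ltp(sigma, n, H):
    """Standard error of the sample mean for a Hurst-Kolmogorov (LTP) process
    with Hurst coefficient H: sigma * n**(H - 1); reduces to the classical
    result when H = 0.5."""
    return sigma * n ** (H - 1.0)

sigma, n = 0.5, 100          # e.g. 100 annual temperature anomalies (illustrative values)
for H in (0.5, 0.7, 0.9):
    ratio = se_mean_ltp(sigma, n, H) / se_mean_classical(sigma, n)
    print("H=%.1f: se of mean = %.3f (x%.1f vs classical)" % (H, se_mean_ltp(sigma, n, H), ratio))
```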

  10. Affordability Tradeoffs Under Uncertainty Using Epoch-Era Analysis

    Science.gov (United States)

    2013-09-30

    large numbers of eras can be constructed through combinatorics, probabilistic state transitions, and other means. Process 9 focuses on the analysis ... (Affordability Tradeoffs Under Uncertainty Using Epoch-Era Analysis, 30 September 2013, Dr. Donna H. Rhodes).

  11. Altered subjective reward valuation among drug-deprived heavy marijuana users: Aversion to uncertainty

    Science.gov (United States)

    Hefner, Kathryn R.; Starr, Mark. J.; Curtin, John. J.

    2015-01-01

    Marijuana is the most commonly used illicit drug in the United States and its use is rising. Nonetheless, scientific efforts to clarify the risk for addiction and other harm associated with marijuana use have been lacking. Maladaptive decision-making is a cardinal feature of addiction that is likely to emerge in heavy users. In particular, distorted subjective reward valuation related to homeostatic or allostatic processes has been implicated for many drugs of abuse. Selective changes in responses to uncertainty have been observed in response to intoxication and deprivation from various drugs of abuse. To assess for these potential neuroadaptive changes in reward valuation associated with marijuana deprivation, we examined the subjective value of uncertain and certain rewards among deprived and non-deprived heavy marijuana users in a behavioral economics decision-making task. Deprived users displayed reduced valuation of uncertain rewards, particularly when these rewards were more objectively valuable. This uncertainty aversion increased with increasing quantity of marijuana use. These results suggest comparable decision-making vulnerability from marijuana use as other drugs of abuse, and highlights targets for intervention. PMID:26595464

  12. Uncertainty about probability: a decision analysis perspective

    Energy Technology Data Exchange (ETDEWEB)

    Howard, R.A.

    1988-03-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, it is further found that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, contrary to the usual intuition. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
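    A minimal coin-tossing sketch of these two points (assign the mean of the distribution over the definitive number, and observing an event raises the probability of the next one) is given below; the Beta prior is an assumption chosen only for illustration.

```python
from fractions import Fraction

# Uncertainty about the "definitive" long-run frequency p modelled with a Beta(a, b) prior.
a, b = 50, 50                      # a nearly-fair coin, but not known with certainty

p_head = Fraction(a, a + b)        # probability to assign = prior mean = 0.5
print("P(heads on toss 1) =", float(p_head))

# After observing one head, the distribution over p updates to Beta(a + 1, b),
# so the probability assigned to the next toss increases.
p_head_given_head = Fraction(a + 1, a + b + 1)
print("P(heads on toss 2 | head on toss 1) =", float(p_head_given_head))   # ~0.505 > 0.5
```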

  13. A Stochastic Collocation Algorithm for Uncertainty Analysis

    Science.gov (United States)

    Mathelin, Lionel; Hussaini, M. Yousuff; Zang, Thomas A. (Technical Monitor)

    2003-01-01

    This report describes a stochastic collocation method to adequately handle a physically intrinsic uncertainty in the variables of a numerical simulation. For instance, while the standard Galerkin approach to Polynomial Chaos requires multi-dimensional summations over the stochastic basis functions, the stochastic collocation method makes it possible to collapse those summations into a single one-dimensional summation. This report furnishes the essential algorithmic details of the new stochastic collocation method and provides, as a numerical example, the solution of the Riemann problem with the stochastic collocation method used for the discretization of the stochastic parameters.
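    The collocation idea, evaluating the model only at quadrature nodes and recombining with weights, can be sketched in one dimension as follows. The model function and the single Gaussian parameter are invented for illustration; this is not the report's multi-dimensional algorithm.

```python
import numpy as np

# Model output as a function of an uncertain parameter xi ~ N(0, 1) (illustrative).
def model(xi):
    return np.exp(0.3 * xi) + xi ** 2

# Collocation: evaluate the model only at Gauss-Hermite nodes and combine with weights.
nodes, weights = np.polynomial.hermite_e.hermegauss(8)   # probabilists' Hermite rule
weights = weights / np.sqrt(2.0 * np.pi)                  # normalise to the N(0, 1) measure

vals = model(nodes)
mean = np.sum(weights * vals)
var = np.sum(weights * (vals - mean) ** 2)
print("collocation mean %.4f, variance %.4f" % (mean, var))

# Monte Carlo check of the collocation estimates.
xi = np.random.default_rng(0).standard_normal(1_000_000)
mc = model(xi)
print("Monte Carlo  mean %.4f, variance %.4f" % (mc.mean(), mc.var()))
```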

  14. An educational model for ensemble streamflow simulation and uncertainty analysis

    National Research Council Canada - National Science Library

    AghaKouchak, A; Nakhjiri, N; Habib, E

    2013-01-01

    ...) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity...

  15. Uncertainty Analysis of Knowledge Reductions in Rough Sets

    Directory of Open Access Journals (Sweden)

    Ying Wang

    2014-01-01

    Uncertainty analysis is a vital issue in intelligent information processing, especially in the age of big data. Rough set theory has attracted much attention in this field since it was proposed. Relative reduction is an important problem in rough set theory. Different relative reductions have been investigated for preserving some specific classification abilities in various applications. This paper examines the uncertainty analysis of five different relative reductions in four aspects, that is, the relationship between reducts, boundary region granularity, rule variance, and uncertainty measure, according to a constructed decision table.

  16. Uncertainty analysis of knowledge reductions in rough sets.

    Science.gov (United States)

    Wang, Ying; Zhang, Nan

    2014-01-01

    Uncertainty analysis is a vital issue in intelligent information processing, especially in the age of big data. Rough set theory has attracted much attention in this field since it was proposed. Relative reduction is an important problem in rough set theory. Different relative reductions have been investigated for preserving some specific classification abilities in various applications. This paper examines the uncertainty analysis of five different relative reductions in four aspects, that is, the relationship between reducts, boundary region granularity, rule variance, and uncertainty measure, according to a constructed decision table.

  17. Assessment and uncertainty analysis of groundwater risk.

    Science.gov (United States)

    Li, Fawen; Zhu, Jingzhao; Deng, Xiyuan; Zhao, Yong; Li, Shaofei

    2018-01-01

    Groundwater with relatively stable quantity and quality is commonly used by human beings. However, with the over-mining of groundwater, problems such as groundwater funnels (cones of depression), land subsidence and salt water intrusion have emerged. In order to avoid further deterioration of hydrogeological problems in over-mining regions, it is necessary to assess groundwater risk. In this paper, the risks of shallow and deep groundwater in the water intake area of the South-to-North Water Transfer Project in Tianjin, China, were evaluated. Firstly, two sets of four-level evaluation index systems were constructed based on the different characteristics of shallow and deep groundwater. Secondly, based on the normalized factor values and the synthetic weights, the risk values of shallow and deep groundwater were calculated. Lastly, the uncertainty of the groundwater risk assessment was analyzed by the indicator kriging method. The results meet decision makers' demand for risk information and overcome the limitation of previous risk assessments, whose results were expressed as deterministic point estimates that ignore the uncertainty of the assessment. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM (FOR MICRO- COMPUTERS)

    Science.gov (United States)

    Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...

  19. Uncertainty Analysis of Knowledge Reductions in Rough Sets

    OpenAIRE

    Ying Wang; Nan Zhang

    2014-01-01

    Uncertainty analysis is a vital issue in intelligent information processing, especially in the age of big data. Rough set theory has attracted much attention to this field since it was proposed. Relative reduction is an important problem of rough set theory. Different relative reductions have been investigated for preserving some specific classification abilities in various applications. This paper examines the uncertainty analysis of five different relative reductions in four aspects, that i...

  20. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.
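    The flavor of the Bernstein-expansion step can be conveyed with a one-dimensional sketch: the Bernstein coefficients of a polynomial on [0, 1] enclose its range, so a fully negative enclosure certifies a requirement g < 0 over the whole box. The polynomial below is a hypothetical requirement function, not one from the paper, and the paper's actual algorithm works with multivariate polynomials and subset sizing rather than this simple check.

```python
from math import comb

def bernstein_bounds(coeffs):
    """Enclosure of p(x) = sum_j coeffs[j] * x**j over [0, 1]: the range of p is
    contained in [min(b), max(b)], where b are the Bernstein coefficients."""
    n = len(coeffs) - 1
    b = [sum(comb(k, j) / comb(n, j) * coeffs[j] for j in range(k + 1))
         for k in range(n + 1)]
    return min(b), max(b)

# Hypothetical requirement function g(d) = -0.5 + d - 3 d^2 + d^3 on the unit interval.
lo, hi = bernstein_bounds([-0.5, 1.0, -3.0, 1.0])
print("g([0, 1]) is contained in [%.3f, %.3f]" % (lo, hi))     # entirely negative here
print("requirement g < 0 certified over the whole box:", hi < 0.0)
```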

  1. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  2. Uncertainty analysis of geothermal energy economics

    Science.gov (United States)

    Sener, Adil Caner

    This dissertation research endeavors to explore geothermal energy economics by assessing and quantifying the uncertainties associated with the nature of geothermal energy and energy investments overall. The study introduces a stochastic geothermal cost model and a valuation approach for different geothermal power plant development scenarios. The Monte Carlo simulation technique is employed to obtain probability distributions of geothermal energy development costs and project net present values. In the study, a stochastic cost model with an incorporated dependence structure is defined and compared with a model in which the random variables are treated as independent inputs. One of the goals of the study is to shed light on the long-standing problem of modeling dependence between random input variables; here, the dependence between random input variables is modeled by employing the method of copulas. The study focuses on four main types of geothermal power generation technologies and introduces a stochastic levelized cost model for each technology. Moreover, the levelized costs of natural gas combined cycle and coal-fired power plants are also compared with those of geothermal power plants. The input data used in the model rely on cost data recently reported by government agencies and non-profit organizations, such as the Department of Energy, the National Laboratories, the California Energy Commission and the Geothermal Energy Association. The second part of the study introduces the stochastic discounted cash flow valuation model for the geothermal technologies analyzed in the first phase. In this phase of the study, the Integrated Planning Model (IPM) software was used to forecast the revenue streams of geothermal assets under different price and regulation scenarios. These results are then combined to create a stochastic revenue forecast of the power plants. The uncertainties in gas prices and environmental regulations will be modeled and their potential impacts will be
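    The copula-based dependence modelling mentioned above can be sketched as follows. The marginals, the rank correlation and the one-line levelized-cost formula are placeholders chosen for illustration, not the dissertation's cost data or model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n = 50_000

# Assumed rank correlation between two uncertain cost components.
rho = 0.6
cov = [[1.0, rho], [rho, 1.0]]

# Gaussian copula: correlated standard normals -> uniforms -> arbitrary marginals.
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
u = stats.norm.cdf(z)
drilling_cost = stats.lognorm(s=0.4, scale=400.0).ppf(u[:, 0])            # $/kW, placeholder
plant_cost = stats.triang(c=0.4, loc=1500.0, scale=1500.0).ppf(u[:, 1])   # $/kW, placeholder

# Toy levelized cost: fixed charge rate 9%, ~8000 kWh/kW-yr, expressed in cents/kWh.
lcoe = 0.09 * (drilling_cost + plant_cost) / 8000.0 * 100.0
print("median LCOE %.1f cents/kWh, 90%% interval (%.1f, %.1f)"
      % (np.median(lcoe), *np.percentile(lcoe, [5, 95])))
print("achieved Spearman correlation: %.2f" % stats.spearmanr(drilling_cost, plant_cost)[0])
```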

  3. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands); Grupa, J.B. [Netherlands Energy Research Foundation (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

  4. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

  5. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  6. Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff

    Directory of Open Access Journals (Sweden)

    A. P. Jacquin

    2010-08-01

    This study presents the analysis of predictive uncertainty of a conceptual type snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow dominated catchment in the Chilean Andes is used as a case study. Predictive uncertainty arising from parameter uncertainties of the watershed model is assessed. Model performance is evaluated according to several criteria in order to define the possibility distribution of the parameter vector. The plausibility of the simulated glacier mass balance and snow cover is used to further constrain the model representations. Possibility distributions of the discharge estimates and prediction uncertainty bounds are subsequently derived. The results of the study indicate that the use of additional information allows a reduction of predictive uncertainty. In particular, the assessment of the simulated glacier mass balance and snow cover helps to reduce the width of the uncertainty bounds without a significant increment in the number of unbounded observations.

  7. Sensitivity of Subjective Decisions in the GLUE Methodology for Quantifying the Uncertainty in the Flood Inundation Map for Seymour Reach in Indiana, USA

    Directory of Open Access Journals (Sweden)

    Younghun Jung

    2014-07-01

    Generalized likelihood uncertainty estimation (GLUE) is one of the widely-used methods for quantifying uncertainty in flood inundation mapping. However, the subjective nature of its application, involving the definition of the likelihood measure and the criteria for defining acceptable versus unacceptable models, can lead to different results in quantifying uncertainty bounds. The objective of this paper is to perform a sensitivity analysis of the effect of the choice of likelihood measures and cut-off thresholds used in selecting behavioral and non-behavioral models in the GLUE methodology. Using a dataset for a reach along the White River in Seymour, Indiana, multiple prior distributions, likelihood measures and cut-off thresholds are used to investigate the role of subjective decisions in applying the GLUE methodology for uncertainty quantification related to topography, streamflow and Manning's n. Results from this study show that a normal pdf produces a narrower uncertainty bound than a uniform pdf for an uncertain variable. Similarly, a likelihood measure based on water surface elevations is found to be less affected than likelihood measures based on flood inundation area and width. Although the findings from this study are limited by the use of a single test case, this paper provides a framework that can be utilized to gain a better understanding of the uncertainty involved in applying the GLUE methodology in flood inundation mapping.
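    A stripped-down version of such a sensitivity check is sketched below. The synthetic observation, the toy stage model and the inverse-squared-error likelihood are assumptions for illustration only, not the paper's hydraulic model or likelihood definitions.

```python
import numpy as np

rng = np.random.default_rng(5)
n_sets = 5_000

# Synthetic "observed" stage and a toy rating-curve model h = a * Q**b with
# parameters drawn from uniform priors (all values are illustrative).
Q_obs, h_obs = 850.0, 4.2
a = rng.uniform(0.05, 0.5, n_sets)
b = rng.uniform(0.4, 0.8, n_sets)
h_sim = a * Q_obs ** b

# Likelihood measure: inverse squared error (small epsilon avoids division by zero).
likelihood = 1.0 / ((h_sim - h_obs) ** 2 + 1e-9)

for q_cutoff in (0.5, 0.8, 0.95):          # cut-off threshold for behavioural sets
    threshold = np.quantile(likelihood, q_cutoff)
    keep = likelihood >= threshold
    w = likelihood[keep] / likelihood[keep].sum()
    order = np.argsort(h_sim[keep])
    cdf = np.cumsum(w[order])
    lo = h_sim[keep][order][np.searchsorted(cdf, 0.05)]
    hi = h_sim[keep][order][np.searchsorted(cdf, 0.95)]
    print("cut-off quantile %.2f -> 5-95%% stage bounds: %.2f-%.2f m" % (q_cutoff, lo, hi))
```

    Raising the cut-off typically narrows the resulting bounds, which is exactly the kind of sensitivity to subjective choices that the paper quantifies.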

  8. The Model Optimization, Uncertainty, and SEnsitivity analysis (MOUSE) toolbox: overview and application

    Science.gov (United States)

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  9. Overview and application of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) toolbox

    Science.gov (United States)

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  10. Uncertainty Instability Risk Analysis of High Concrete Arch Dam Abutments

    Directory of Open Access Journals (Sweden)

    Xin Cao

    2017-01-01

    The uncertainties associated with concrete arch dams rise with increasing dam height. Given the uncertainties associated with the influencing factors, the stability of high arch dam abutments was studied as a fuzzy random event. In addition, given the randomness and fuzziness of the calculation parameters as well as of the failure criterion, hazard point and hazard surface uncertainty instability risk ratio models were proposed for high arch dam abutments on the basis of credibility theory. The uncertainty instability failure criterion was derived through analysis of the progressive instability failure process on the basis of Shannon's entropy theory. The uncertainties associated with the influencing factors were quantified by probability or possibility distribution assignments. Gaussian random theory was used to generate random realizations for influencing factors with spatial variability. An uncertainty stability analysis method was proposed by combining finite element analysis and the limit equilibrium method. The instability risk ratio was calculated using the Monte Carlo simulation method and fuzzy random postprocessing. The results corroborate that the modeling approach is sound and that the calculation method is feasible.

  11. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  12. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in these methods demonstrated no effect of culture media and plate-count techniques on the estimated microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods.
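
    The abstract credits a factorial ANOVA with separating the significant factors (microorganism, product, analyst) from the non-significant ones (medium, plate-count technique) and with estimating their interactions. A minimal two-factor sketch of that kind of analysis is shown below; the column names and synthetic counts are assumptions, and statsmodels is only one of several packages that can fit such a model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Synthetic colony counts for a two-factor design (organism x culture medium)
organisms = ["S.aureus", "E.coli", "C.albicans"]
media = ["TSA", "SDA"]
rows = []
for org in organisms:
    for med in media:
        base = {"S.aureus": 55, "E.coli": 70, "C.albicans": 40}[org]  # organism effect
        counts = rng.poisson(base, size=6)                            # medium: no effect
        rows += [{"organism": org, "medium": med, "count": c} for c in counts]
df = pd.DataFrame(rows)

# Two-way ANOVA with interaction term
model = smf.ols("count ~ C(organism) * C(medium)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```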

  13. Adaptive framework for uncertainty analysis in electromagnetic field measurements.

    Science.gov (United States)

    Prieto, Javier; Alonso, Alonso A; de la Rosa, Ramón; Carrera, Albano

    2015-04-01

    Misinterpretation of uncertainty in the measurement of the electromagnetic field (EMF) strength may lead to an underestimation of exposure risk or an overestimation of required measurements. The Guide to the Expression of Uncertainty in Measurement (GUM) has been internationally adopted as a de facto standard for uncertainty assessment. However, analyses under such an approach commonly assume unrealistic static models or neglect relevant prior information, resulting in non-robust uncertainties. This study proposes a principled and systematic framework for uncertainty analysis that fuses information from current measurements and prior knowledge. Such a framework dynamically adapts to data by exploiting a likelihood function based on kernel mixtures and incorporates flexible choices of prior information by applying importance sampling. The validity of the proposed techniques is assessed from measurements performed with a broadband radiation meter and an isotropic field probe. The developed framework significantly outperforms the GUM approach, achieving a reduction of 28% in measurement uncertainty.

  14. Uncertainty analysis and visualization of diffusion tensor images

    Science.gov (United States)

    Jiao, Fangxiang

    Diffusion magnetic resonance imaging (dMRI) has become a popular technique for detecting brain white matter structure. However, imaging noise, imaging artifacts and modeling techniques create many uncertainties, which may generate misleading information for further analysis or applications such as surgical planning. Therefore, how to analyze, effectively visualize, and reduce these uncertainties has become a very important research question. In this dissertation, we present both rank-k decomposition and direct decomposition approaches based on spherical deconvolution to decompose the fiber directions more accurately for high angular resolution diffusion imaging (HARDI) data, which will reduce the uncertainties of the fiber directions. By applying volume rendering techniques to an ensemble of 3D orientation distribution function (ODF) glyphs, which we call SIP functions of diffusion shapes, one can elucidate the complex heteroscedastic structural variation in these local diffusion shapes. Furthermore, we quantify the extent of this variation by measuring the fraction of the volume of these shapes that is consistent across all noise levels, which we call the certain volume ratio. To better understand the uncertainties in white matter fiber tracks, we propose three metrics to quantify the differences between the results of diffusion tensor magnetic resonance imaging (DT-MRI) fiber tracking algorithms: the area between corresponding fibers of each bundle, the Earth Mover's Distance (EMD) between two fiber bundle volumes, and the current distance between two fiber bundle volumes. Based on these metrics, we discuss an interactive fiber track comparison visualization toolkit we have developed to visualize these uncertainties more efficiently. Physical phantoms, with high repeatability and reproducibility, are also designed with the hope of validating the dMRI techniques. In summary, this dissertation provides a better understanding about uncertainties in diffusion magnetic resonance

  15. Uncertainty analysis for absorption and first-derivative EPR spectra

    Science.gov (United States)

    Tseitlin, Mark; Eaton, Sandra S.; Eaton, Gareth R.

    2015-01-01

    Electron paramagnetic resonance (EPR) experimental techniques produce absorption or first-derivative spectra. Uncertainty analysis provides the basis for comparison of spectra obtained by different methods. In this study it was used to derive analytical equations to relate uncertainties for integrated intensity and line widths obtained from absorption or first-derivative spectra to the signal-to-noise ratio (SNR), with the assumption of white noise. Predicted uncertainties for integrated intensities and line widths are in good agreement with Monte Carlo calculations for Lorentzian and Gaussian lineshapes. Conservative low-pass filtering changes the noise spectrum, which can be modeled in the Monte Carlo simulations. When noise is close to white, the analytical equations provide useful estimates of uncertainties. For example, for a Lorentzian line with white noise, the uncertainty in the number of spins obtained from the first-derivative spectrum is 2.6 times greater than from the absorption spectrum at the same SNR. Uncertainties in line widths obtained from absorption and first-derivative spectra are similar. The impact of integration or differentiation on SNR and on uncertainties in fitting parameters was analyzed. Although integration of the first-derivative spectrum improves the apparent smoothness of the spectrum, it also changes the frequency distribution of the noise. If the lineshape of the signal is known, the integrated intensity can be determined more accurately by fitting the first-derivative spectrum than by first integrating and then fitting the absorption spectrum. Uncertainties in integrated intensities and line widths are less when the parameters are determined from the original data than from spectra that have been either integrated or differentiated. PMID:25774102

  16. Uncertainty analysis for absorption and first-derivative EPR spectra.

    Science.gov (United States)

    Tseitlin, Mark; Eaton, Sandra S; Eaton, Gareth R

    2012-11-01

    Electron paramagnetic resonance (EPR) experimental techniques produce absorption or first-derivative spectra. Uncertainty analysis provides the basis for comparison of spectra obtained by different methods. In this study it was used to derive analytical equations to relate uncertainties for integrated intensity and line widths obtained from absorption or first-derivative spectra to the signal-to-noise ratio (SNR), with the assumption of white noise. Predicted uncertainties for integrated intensities and line widths are in good agreement with Monte Carlo calculations for Lorentzian and Gaussian lineshapes. Conservative low-pass filtering changes the noise spectrum, which can be modeled in the Monte Carlo simulations. When noise is close to white, the analytical equations provide useful estimates of uncertainties. For example, for a Lorentzian line with white noise, the uncertainty in the number of spins obtained from the first-derivative spectrum is 2.6 times greater than from the absorption spectrum at the same SNR. Uncertainties in line widths obtained from absorption and first-derivative spectra are similar. The impact of integration or differentiation on SNR and on uncertainties in fitting parameters was analyzed. Although integration of the first-derivative spectrum improves the apparent smoothness of the spectrum, it also changes the frequency distribution of the noise. If the lineshape of the signal is known, the integrated intensity can be determined more accurately by fitting the first-derivative spectrum than by first integrating and then fitting the absorption spectrum. Uncertainties in integrated intensities and line widths are less when the parameters are determined from the original data than from spectra that have been either integrated or differentiated.

  17. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.

  18. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990; these codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  19. Geological-structural models used in SR 97. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saksa, P.; Nummela, J. [FINTACT Oy (Finland)

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones at the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major sources. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the lack of fracture and zone information at scales from 10 m to 300-1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and the experience available at that time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for the variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20-30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg support the conclusion that the Aespoe sub-volume would be an anomalously fractured, tectonised unit of its own. This means that

  20. Systematic Evaluation of Uncertainty in Material Flow Analysis

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain...... in MFA. Based on this, recommendations for consideration of uncertainty in MFA are provided. A five-step framework for uncertainty handling is outlined, reflecting aspects such as data quality and goal/scope of the MFA. We distinguish between descriptive (quantification of material turnover in a region...... for exploratory MFAs. Irrespective of the level of sophistication, lack of information about MFA data poses a major challenge for meaningful uncertainty analysis. The step-wise framework suggested here provides a systematic way to consider available information and produce results as precise as the data warrant....

  1. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pohl, Andrew Phillip [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jordan, Dirk [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice of one of these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to the residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
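
    The propagation scheme described above (quantify each sub-model's uncertainty by the empirical distribution of its residuals, then resample those residuals through the model chain) can be sketched as follows; the model chain and residual values are placeholders, not Sandia's actual PV performance models.

```python
import numpy as np

rng = np.random.default_rng(3)

# Empirical residuals of each sub-model (placeholders; in practice taken from validation data)
resid_poa = rng.normal(0.0, 20.0, 500)    # plane-of-array irradiance residuals [W/m^2]
resid_eff = rng.normal(0.0, 10.0, 500)    # effective irradiance residuals [W/m^2]
resid_temp = rng.normal(0.0, 1.5, 500)    # cell temperature residuals [deg C]

def dc_power(irr, t_cell, p_ref=250.0, g_ref=1000.0, gamma=-0.004):
    """Very simple DC power model: linear in irradiance with a temperature coefficient."""
    return p_ref * (irr / g_ref) * (1.0 + gamma * (t_cell - 25.0))

poa_nominal, t_cell_nominal = 800.0, 45.0   # nominal chain outputs for one time step

# Propagate uncertainty by resampling residuals through the sequence of models
n = 10_000
poa = poa_nominal + rng.choice(resid_poa, n)
eff = poa + rng.choice(resid_eff, n)
t_cell = t_cell_nominal + rng.choice(resid_temp, n)
power = dc_power(eff, t_cell)

print(f"mean DC power {power.mean():.1f} W, 95% interval "
      f"[{np.percentile(power, 2.5):.1f}, {np.percentile(power, 97.5):.1f}] W")
```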

  2. On the Optimality of Trust Network Analysis with Subjective Logic

    Directory of Open Access Journals (Sweden)

    PARK, Y.

    2014-08-01

    Full Text Available Building and measuring trust is one of the crucial aspects of e-commerce, social networking and computer security. Trust networks are widely used to formalize trust relationships and to conduct formal reasoning about trust values. Diverse trust network analysis methods have been developed so far, and one of the most widely used schemes is TNA-SL (Trust Network Analysis with Subjective Logic). Recent papers claimed that TNA-SL always finds the optimal solution by producing the least uncertainty. In this paper, we present some counter-examples, which imply that TNA-SL is not an optimal algorithm. Furthermore, we present a probabilistic edge-splitting algorithm to minimize uncertainty.
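
    For readers unfamiliar with TNA-SL's building blocks, the sketch below implements the two subjective-logic operators that trust network analysis chains together, trust discounting along a path and cumulative fusion of parallel paths, for opinions (belief, disbelief, uncertainty, base rate). The formulas follow Jøsang's standard definitions as commonly stated, with the base-rate handling in fusion simplified to an average; treat this as an illustrative approximation rather than the paper's algorithm.

```python
from dataclasses import dataclass

@dataclass
class Opinion:
    b: float  # belief
    d: float  # disbelief
    u: float  # uncertainty (b + d + u = 1)
    a: float  # base rate

def discount(trust: Opinion, op: Opinion) -> Opinion:
    """Discount an opinion received from an advisor by our trust in that advisor."""
    return Opinion(b=trust.b * op.b,
                   d=trust.b * op.d,
                   u=trust.d + trust.u + trust.b * op.u,
                   a=op.a)

def fuse(o1: Opinion, o2: Opinion) -> Opinion:
    """Cumulative fusion of two independent opinions (non-dogmatic case, u1 + u2 > 0).

    The base rate is simplified here to the average of the two base rates (assumption)."""
    k = o1.u + o2.u - o1.u * o2.u
    return Opinion(b=(o1.b * o2.u + o2.b * o1.u) / k,
                   d=(o1.d * o2.u + o2.d * o1.u) / k,
                   u=(o1.u * o2.u) / k,
                   a=0.5 * (o1.a + o2.a))

# Example: A trusts B and C; B and C each hold an opinion about target X.
trust_AB, trust_AC = Opinion(0.9, 0.0, 0.1, 0.5), Opinion(0.7, 0.1, 0.2, 0.5)
op_BX, op_CX = Opinion(0.8, 0.1, 0.1, 0.5), Opinion(0.6, 0.2, 0.2, 0.5)
combined = fuse(discount(trust_AB, op_BX), discount(trust_AC, op_CX))
print(combined)
```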

  3. Sensitivity and Uncertainty analysis of saltwater intrusion in coastal aquifers

    Science.gov (United States)

    Zhao, Z.; Jin, G.; Zhao, J.; Li, L.; Chen, X.; Tao, X.

    2012-12-01

    Aquifer heterogeneity has been a focus in uncertainty analysis of saltwater intrusion in coastal aquifers, especially the spatial variance of hydraulic conductivities. In this study, we investigated how inland and seaward boundary conditions may also contribute to the uncertainty in predicting saltwater intrusion, in addition to the aquifer properties. Based on numerical simulations, the analysis focused on the salt-freshwater mixing zone, characterized by its location (the contour line of 50% seawater concentration) and its width (the area between the contour lines of 10% and 90% seawater concentration). Sensitivity analysis was conducted first to identify the factors most influential on the location and width of the mixing zone among tidal amplitude, freshwater influx rate, aquifer permeability, fluid viscosity and longitudinal dispersivity. Based on the results of the sensitivity analysis, an efficient sampling strategy was formed to determine the parameter space for uncertainty analysis. The results showed that (1) both freshwater influx across the inland boundary and tidal oscillations on the seaward boundary imposed a retardation effect on the mixing zone; and (2) seasonal variations of the freshwater influx rate combined with tidal fluctuations of sea level led to great uncertainty in the simulated mixing zone.

  4. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  5. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  6. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  7. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…

  8. Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Schaarup-Jensen, Kjeld

    2007-01-01

    In the present paper a comparison between three different surface runoff models, in the numerical urban drainage tool MOUSE, is conducted. Analysing parameter uncertainty, it is shown that the models are very sensitive with regards to the choice of hydrological parameters, when combined overflow...... analysis, further research in improved parameter assessment for surface runoff models is needed....

  9. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
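
    The ensemble idea behind the toolbox (run a conceptual rainfall-runoff model many times with parameters drawn from ranges, then inspect the spread of simulated runoff) can be illustrated with a much-simplified bucket model; the model structure, parameter ranges and forcing below are classroom-style assumptions, not the HBV equations used in the actual toolbox.

```python
import numpy as np

rng = np.random.default_rng(4)

def bucket_model(precip, evap, fc, k):
    """One-bucket conceptual model: soil storage with capacity fc and linear outflow rate k."""
    storage, runoff = 0.5 * fc, []
    for p, e in zip(precip, evap):
        storage = max(storage + p - e, 0.0)
        overflow = max(storage - fc, 0.0)            # saturation excess
        storage -= overflow
        q = overflow + k * storage                   # fast + slow response
        storage -= k * storage
        runoff.append(q)
    return np.array(runoff)

# Synthetic daily forcing for one season
days = 120
precip = rng.gamma(0.6, 5.0, days)
evap = np.full(days, 1.5)

# Ensemble: sample parameters from plausible ranges and collect simulated runoff
n_members = 500
fc_samples = rng.uniform(50.0, 200.0, n_members)    # field capacity [mm]
k_samples = rng.uniform(0.05, 0.3, n_members)       # recession coefficient [1/day]
ensemble = np.array([bucket_model(precip, evap, fc, k)
                     for fc, k in zip(fc_samples, k_samples)])

# The ensemble spread expresses parameter uncertainty in the simulated hydrograph
lower, upper = np.percentile(ensemble, [5, 95], axis=0)
print(f"day 60 runoff 5-95% band: {lower[60]:.2f} to {upper[60]:.2f} mm/day")
```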

  10. Quantitative analysis of uncertainty from pebble flow in HTR

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Hao, E-mail: haochen.heu@163.com [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China); Institute of Nuclear and New Energy Technology (INET), Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Tsinghua University, Beijing (China); Fu, Li; Jiong, Guo; Ximing, Sun; Lidong, Wang [Institute of Nuclear and New Energy Technology (INET), Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Tsinghua University, Beijing (China)

    2015-12-15

    Highlights: • An uncertainty and sensitivity analysis model for pebble flow has been built. • Experiments and random walk theory are used to identify the uncertainty of pebble flow. • Effects of pebble flow on the core parameters are identified by sensitivity analysis. • Uncertainty of core parameters due to pebble flow is quantified for the first time. - Abstract: In a pebble bed HTR, along the deterministic average flow lines, randomness exists in the flow of pebbles, which is not possible to simulate with the current reactor design codes for HTR, such as VSOP, due to the limitation of current computer capability. In order to study how the randomness of pebble flow affects the key parameters in the HTR, a new pebble flow model was set up, which has been successfully transplanted into the VSOP code. In the new pebble flow model, mixing coefficients were introduced into the fixed flow lines to simulate the randomness of pebble flow. Numerical simulation and pebble flow experiments were used to determine the mixing coefficients. Sensitivity analysis led to the conclusion that the key parameters of the pebble bed HTR are not sensitive to the randomness in pebble flow. The uncertainty of maximum power density and power distribution caused by the randomness in pebble flow is very small, especially for the “multi-pass” scheme of fuel circulation adopted in the pebble bed HTR.

  11. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  12. Uncertainty analysis for seismic hazard in Northern and Central Italy

    Science.gov (United States)

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three input-variable branch points representing alternative values for the b-value, maximum magnitude (Mmax) and attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli Region (≈0.10 g) for PGA and in the Friuli and Central Apennines regions (≈0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (≈0.15 g) and PGA (≈0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for the 10%-in-50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennine regions, around 20-30%, than in the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
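
    The Monte Carlo logic-tree sampling described above can be reduced to a small sketch: each realization picks one branch for the b-value, the maximum magnitude and the attenuation relationship, computes a hazard value, and the ensemble of realizations yields confidence bands and a COV. The branch values and the toy hazard function below are illustrative assumptions, not the hazard model used for Italy.

```python
import numpy as np

rng = np.random.default_rng(5)

# Logic-tree branches (values and weights are purely illustrative)
b_values = ([0.9, 1.0, 1.1], [0.25, 0.5, 0.25])
m_max = ([6.5, 7.0, 7.5], [0.3, 0.4, 0.3])
attenuation = ([0.8, 1.0, 1.2], [0.3, 0.4, 0.3])   # scale factor standing in for a GMPE choice

def toy_hazard_pga(b, mmax, atten):
    """Placeholder 10%-in-50-years PGA: decreases with b, increases with Mmax and attenuation."""
    return atten * 0.05 * (mmax - 5.0) / b

n = 20_000
samples = np.empty(n)
for i in range(n):
    b = rng.choice(b_values[0], p=b_values[1])
    m = rng.choice(m_max[0], p=m_max[1])
    a = rng.choice(attenuation[0], p=attenuation[1])
    samples[i] = toy_hazard_pga(b, m, a)

lo, hi = np.percentile(samples, [2.5, 97.5])
cov = samples.std() / samples.mean()
print(f"PGA 95% confidence band: [{lo:.3f}, {hi:.3f}] g, COV = {cov:.2f}")
```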

  13. Treatment of uncertainties in the IPCC: a philosophical analysis

    Science.gov (United States)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics, confidence and likelihood, be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors have the choice to present either an assigned level of confidence or a quantified measure of likelihood. But assessments of uncertainties of these two different kinds rest on distinct and conflicting methodologies, so the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories, which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory, are relevant for our purpose because they are commonly used to assess the rationality of collective judgment formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify

  14. The Subject Analysis of Payment Systems Characteristics

    OpenAIRE

    Korobeynikova Olga Mikhaylovna

    2015-01-01

    The article deals with the analysis of payment systems, aimed at identifying the categorical and terminological apparatus, substantiating their specific features and revealing the impact of payment systems on the state of money turnover. On the basis of the subject analysis, the author formulates the definition of a payment system (characterized by the increasing speed of effecting payments, the reduction of costs, a high degree of payment convenience for the subjects of transactions, the security of paymen...

  15. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Based on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and for power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized and call for improvement: (1) subjective judgment in the PIRT process; (2) high cost due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence and the use of the same numerical grids for both scaled experiments and real plant applications; and (5) user effects. Although a large amount of effort has been devoted to improving the CSAU methodology, the above issues still exist. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the derivative of the physical solution with respect to any constant parameter). When parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) quantifying numerical errors: new codes that are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA; (2) quantitative PIRT (Q-PIRT) to reduce subjective judgment and improve efficiency: treat numerical errors as special sensitivities alongside other physical uncertainties, and consider only parameters whose uncertainties have large effects on design criteria; (3) greatly reducing computational costs for uncertainty quantification by (a) choosing optimized time steps and spatial sizes; (b) using gradient information
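
    Forward sensitivity analysis as described above augments the governing equations with equations for the sensitivities themselves. A minimal sketch for a single ODE, dy/dt = -k*y, is given below: the sensitivity s = dy/dk obeys ds/dt = -y - k*s and is integrated alongside the state, so the code illustrates the idea rather than any reactor safety code.

```python
import numpy as np

def rhs(t, state, k):
    """Augmented system: y' = -k*y and, for s = dy/dk, s' = -y - k*s."""
    y, s = state
    return np.array([-k * y, -y - k * s])

def rk4(f, y0, t0, t1, dt, k):
    """Classical fourth-order Runge-Kutta integration of the augmented system."""
    t, y = t0, np.array(y0, dtype=float)
    while t < t1 - 1e-12:
        h = min(dt, t1 - t)
        k1 = f(t, y, k)
        k2 = f(t + h / 2, y + h / 2 * k1, k)
        k3 = f(t + h / 2, y + h / 2 * k2, k)
        k4 = f(t + h, y + h * k3, k)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

k, y0 = 0.7, 1.0
y_end, sens_end = rk4(rhs, [y0, 0.0], 0.0, 2.0, 1e-3, k)

# Analytical check: y = y0*exp(-k*t) and dy/dk = -t*y0*exp(-k*t)
print(y_end, y0 * np.exp(-k * 2.0))
print(sens_end, -2.0 * y0 * np.exp(-k * 2.0))
```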

  16. Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method

    Directory of Open Access Journals (Sweden)

    Yi-Ming Hu

    2013-01-01

    Full Text Available Hydrological frequency analysis (HFA) is the foundation for hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series with respect to the population distribution is extremely important for the reliability of the estimated hydrological design value or quantile. However, for most hydrological extreme data obtained in practical applications, the sample size is usually small, for example about 40-50 years in China. Generally, samples of small size cannot completely display the statistical properties of the population distribution, thus leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on the bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. Using the bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of the design value is constructed; based on this sampling distribution, the uncertainty of the quantile estimation can be quantified. Compared with the conventional approach, this method provides not only the point estimate of a design value but also a quantitative evaluation of the uncertainties of the estimation.
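
    The bootstrap procedure sketched in the abstract (resample the observed flood series, re-estimate the design quantile for each bootstrap sample, and read the uncertainty off the resulting sampling distribution) can be written compactly as below; the Gumbel distribution and the synthetic 45-year record are assumptions made for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic annual maximum flood series (~45 years, as is typical in practice)
floods = rng.gumbel(loc=1000.0, scale=300.0, size=45)

def design_value(sample, return_period=100.0):
    """Fit a Gumbel distribution and return the T-year quantile."""
    loc, scale = stats.gumbel_r.fit(sample)
    return stats.gumbel_r.ppf(1.0 - 1.0 / return_period, loc=loc, scale=scale)

# Bootstrap: resample with replacement and re-estimate the design quantile each time
n_boot = 2000
estimates = np.array([design_value(rng.choice(floods, size=floods.size, replace=True))
                      for _ in range(n_boot)])

point = design_value(floods)
lo, hi = np.percentile(estimates, [2.5, 97.5])
print(f"100-year flood: point estimate {point:.0f}, 95% bootstrap interval [{lo:.0f}, {hi:.0f}]")
```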

  17. Stochastic analysis in production process and ecology under uncertainty

    CERN Document Server

    Bieda, Bogusław

    2014-01-01

    The monograph addresses the problem of stochastic analysis based on uncertainty assessment by simulation, and the application of this method in ecology and in the steel industry under uncertainty. The first chapter defines the Monte Carlo (MC) method and random variables in stochastic models. Chapter two deals with contamination transport in porous media; a stochastic approach to modeling the transit time of Municipal Solid Waste contaminants using MC simulation has been worked out. The third chapter describes the risk analysis of the waste-to-energy facility proposal for the city of Konin, including the financial aspects. An environmental impact assessment of the ArcelorMittal Steel Power Plant in Kraków is given in chapter four; four scenarios of the energy mix production processes were studied. Chapter five contains examples of using ecological Life Cycle Assessment (LCA) - a relatively new method of environmental impact assessment - which helps in preparing a pro-ecological strategy, and which can lead to reducing t...

  18. Bayesian Characterization of Uncertainty in Intra-Subject Non-Rigid Registration

    Science.gov (United States)

    Risholm, Petter; Janoos, Firdaus; Norton, Isaiah; Golby, Alex J.; Wells, William M.

    2013-01-01

    In settings where high-level inferences are made based on registered image data, the registration uncertainty can contain important information. In this article, we propose a Bayesian non-rigid registration framework where conventional dissimilarity and regularization energies can be included in the likelihood and the prior distribution on deformations respectively through the use of Boltzmann’s distribution. The posterior distribution is characterized using Markov Chain Monte Carlo (MCMC) methods with the effect of the Boltzmann temperature hyper-parameters marginalized under broad uninformative hyper-prior distributions. The MCMC chain permits estimation of the most likely deformation as well as the associated uncertainty. On synthetic examples, we demonstrate the ability of the method to identify the maximum a posteriori estimate and the associated posterior uncertainty, and demonstrate that the posterior distribution can be non-Gaussian. Additionally, results from registering clinical data acquired during neurosurgery for resection of brain tumor are provided; we compare the method to single transformation results from a deterministic optimizer and introduce methods that summarize the high-dimensional uncertainty. At the site of resection, the registration uncertainty increases and the marginal distribution on deformations is shown to be multi-modal. PMID:23602919

  19. Epistemic Uncertainty Analysis: An Approach Using Expert Judgment and Evidential Credibility

    Directory of Open Access Journals (Sweden)

    Patrick Hester

    2012-01-01

    Full Text Available When dealing with complex systems, all decision making occurs under some level of uncertainty. This is due to the physical attributes of the system being analyzed, the environment in which the system operates, and the individuals who operate the system. Techniques for decision making that rely on traditional probability theory have been extensively pursued to incorporate these inherent aleatory uncertainties. However, complex problems also typically include epistemic uncertainties that result from lack of knowledge. These problems are fundamentally different and cannot be addressed in the same fashion. In these instances, decision makers typically use subject matter expert judgment to assist in the analysis of uncertainty. The difficulty with expert analysis, however, is in assessing the accuracy of the expert's input. The credibility of different information can vary widely depending on the expert's familiarity with the subject matter and their intentional biases (i.e., a preference for one alternative over another) and unintentional biases (heuristics, anchoring, etc.). This paper proposes the metric of evidential credibility to deal with this issue. The proposed approach is ultimately demonstrated on an example problem concerned with the estimation of aircraft maintenance times for the Turkish Air Force.

  20. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design.

    Science.gov (United States)

    Sin, Gürkan; Gernaey, Krist V; Neumann, Marc B; van Loosdrecht, Mark C M; Gujer, Willi

    2009-06-01

    This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte Carlo procedure is used for uncertainty estimation, for which the input uncertainty is quantified through expert elicitation and the sampling is performed using the Latin hypercube method. Three scenarios from engineering practice are selected to examine the issue of framing: (1) uncertainty due to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that depending on the way the uncertainty analysis is framed, the estimated uncertainty of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing; and (ii) the framing must be crafted according to the particular purpose of uncertainty analysis/model application. Finally, it needs to be emphasised that uncertainty analysis is no doubt a powerful tool for model-based design, among other applications; however, clear guidelines for good uncertainty analysis in wastewater engineering practice are needed.
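
    Since the study relies on Latin hypercube sampling of expert-elicited input uncertainties, a small self-contained LHS routine is sketched below (stratify each input's range, draw one value per stratum, then shuffle the strata independently per dimension); the three WWTP-style parameter ranges and the design criterion are placeholders.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """Latin hypercube sample: one draw per equal-probability stratum, per dimension."""
    dim = len(bounds)
    u = (rng.random((n_samples, dim)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(dim):                     # shuffle strata independently per dimension
        u[:, j] = rng.permutation(u[:, j])
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

rng = np.random.default_rng(7)
# Placeholder uncertain inputs: max growth rate, decay rate, influent COD fraction
bounds = [(3.0, 8.0), (0.1, 0.4), (0.1, 0.3)]
samples = latin_hypercube(500, bounds, rng)

def design_criterion(x):
    """Stand-in for the plant design model (e.g., an effluent quality indicator)."""
    mu_max, b_decay, f_cod = x
    return 50.0 / mu_max + 100.0 * b_decay + 40.0 * f_cod

outputs = np.apply_along_axis(design_criterion, 1, samples)
print(f"design criterion: mean {outputs.mean():.2f}, "
      f"90% interval [{np.percentile(outputs, 5):.2f}, {np.percentile(outputs, 95):.2f}]")
```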

  1. Final Report. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Coles, T. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Spantini, A. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Tosatto, L. [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2013-09-30

    The project was a collaborative effort among MIT, Sandia National Laboratories (local PI Dr. Habib Najm), the University of Southern California (local PI Prof. Roger Ghanem), and The Johns Hopkins University (local PI Prof. Omar Knio, now at Duke University). Our focus was the analysis and reduction of large-scale dynamical systems emerging from networks of interacting components. Such networks underlie myriad natural and engineered systems. Examples important to DOE include chemical models of energy conversion processes, and elements of national infrastructure—e.g., electric power grids. Time scales in chemical systems span orders of magnitude, while infrastructure networks feature both local and long-distance connectivity, with associated clusters of time scales. These systems also blend continuous and discrete behavior; examples include saturation phenomena in surface chemistry and catalysis, and switching in electrical networks. Reducing size and stiffness is essential to tractable and predictive simulation of these systems. Computational singular perturbation (CSP) has been effectively used to identify and decouple dynamics at disparate time scales in chemical systems, allowing reduction of model complexity and stiffness. In realistic settings, however, model reduction must contend with uncertainties, which are often greatest in large-scale systems most in need of reduction. Uncertainty is not limited to parameters; one must also address structural uncertainties—e.g., whether a link is present in a network—and the impact of random perturbations, e.g., fluctuating loads or sources. Research under this project developed new methods for the analysis and reduction of complex multiscale networks under uncertainty, by combining computational singular perturbation (CSP) with probabilistic uncertainty quantification. CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing

  2. Orbit uncertainty propagation and sensitivity analysis with separated representations

    Science.gov (United States)

    Balducci, Marc; Jones, Brandon; Doostan, Alireza

    2017-09-01

    Most approximations for stochastic differential equations with high-dimensional, non-Gaussian inputs suffer from a rapid (e.g., exponential) increase of computational cost, an issue known as the curse of dimensionality. In astrodynamics, this results in reduced accuracy when propagating an orbit-state probability density function. This paper considers the application of separated representations for orbit uncertainty propagation, where future states are expanded into a sum of products of univariate functions of initial states and other uncertain parameters. Accurate generation of a separated representation requires a number of state samples that is linear in the dimension of input uncertainties. The computational cost of a separated representation scales linearly with respect to the sample count, thereby improving tractability when compared to methods that suffer from the curse of dimensionality. In addition to detailed discussions on their construction and use in sensitivity analysis, this paper presents results for three test cases of an Earth-orbiting satellite. The first two cases demonstrate that approximation via separated representations produces a tractable solution for propagating the Cartesian orbit-state uncertainty with up to 20 uncertain inputs. The third case, which instead uses Equinoctial elements, reexamines a scenario presented in the literature and employs the proposed method for sensitivity analysis to more thoroughly characterize the relative effects of uncertain inputs on the propagated state.

  3. Analysis of uncertainty propagation through model parameters and structure.

    Science.gov (United States)

    Patil, Abhijit; Deng, Zhi-Qiang

    2010-01-01

    Estimation of uncertainty propagation in watershed models is challenging but useful for total maximum daily load (TMDL) calculations. This paper presents an effective approach, involving the combined application of the Rosenblueth method and sensitivity analysis, to the determination of uncertainty propagation through the parameters and structure of the HSPF (Hydrologic Simulation Program-FORTRAN) model. The sensitivity analysis indicates that temperature is a major forcing function in the DO-BOD balance and controls the overall dissolved oxygen concentration. The mean and standard deviation from the descriptive statistics of dissolved oxygen data obtained using the HSPF model are compared to those estimated using Rosenblueth's method. The difference is defined as the error propagated from water temperature through dissolved oxygen. The error propagation, while considering the second-order sensitivity coefficient in Rosenblueth's method, is observed to have a mean of 0.281 mg/l and a standard deviation of 0.099 mg/l. A relatively low error propagation value is attributed to the low skewness of the dependent and independent variables. The results provide new insights into the uncertainty propagation in the HSPF model commonly used for TMDL development.
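
    Rosenblueth's point-estimate method, used above to approximate how uncertainty in water temperature propagates to dissolved oxygen, evaluates the model only at a few well-chosen points. A minimal two-point sketch for a single symmetric input (evaluate at mean plus/minus one standard deviation, weight each point by 1/2) is shown below with a toy DO model; the saturation formula and parameter values are illustrative assumptions, not the HSPF equations.

```python
import numpy as np

def dissolved_oxygen(temp_c):
    """Toy model: DO saturation decreasing with temperature, minus a fixed BOD demand."""
    do_sat = 14.62 - 0.3898 * temp_c + 0.006969 * temp_c**2   # truncated saturation curve
    return do_sat - 2.0                                        # assumed constant BOD sink

# Uncertain input: water temperature
temp_mean, temp_std = 22.0, 3.0

# Rosenblueth two-point estimate: evaluate at mean +/- one std, each with weight 1/2
y_plus = dissolved_oxygen(temp_mean + temp_std)
y_minus = dissolved_oxygen(temp_mean - temp_std)
do_mean = 0.5 * (y_plus + y_minus)
do_var = 0.5 * (y_plus**2 + y_minus**2) - do_mean**2

print(f"propagated DO: mean {do_mean:.3f} mg/L, std {np.sqrt(do_var):.3f} mg/L")
```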

  4. ProbCD: enrichment analysis accounting for categorization uncertainty

    Directory of Open Access Journals (Sweden)

    Shmulevich Ilya

    2007-10-01

    Full Text Available Abstract Background As in many other areas of science, systems biology makes extensive use of statistical association and significance estimates in contingency tables, a type of categorical data analysis known in this field as enrichment (also over-representation or enhancement) analysis. In spite of efforts to create probabilistic annotations, especially in the Gene Ontology context, or to deal with uncertainty in high throughput-based datasets, current enrichment methods largely ignore this probabilistic information since they are mainly based on variants of the Fisher Exact Test. Results We developed open-source R-based software, ProbCD, to deal with probabilistic categorical data analysis; it does not require a static contingency table. The contingency table for the enrichment problem is built using the expectation of a Bernoulli Scheme stochastic process given the categorization probabilities. An on-line interface was created to allow usage by non-programmers and is available at: http://xerad.systemsbiology.net/ProbCD/. Conclusion We present an analysis framework and software tools to address the issue of uncertainty in categorical data analysis. In particular, concerning the enrichment analysis, ProbCD can accommodate: (i) the stochastic nature of the high-throughput experimental techniques and (ii) probabilistic gene annotation.

  5. Parametric uncertainty analysis of pulse wave propagation in a model of a human arterial network

    Science.gov (United States)

    Xiu, Dongbin; Sherwin, Spencer J.

    2007-10-01

    Reduced models of human arterial networks are an efficient approach to analyze quantitative macroscopic features of human arterial flows. The justification for such models typically arises due to the significantly long wavelength associated with the system in comparison to the lengths of arteries in the networks. Although these types of models have been employed extensively and many issues associated with their implementations have been widely researched, the issue of data uncertainty has received comparatively little attention. As in many biological systems, a large amount of uncertainty exists in the values of the parameters associated with the models. Clearly, reliable assessment of the system behaviour cannot be made unless the effect of such data uncertainty is quantified. In this paper we present a study of parametric data uncertainty in reduced modelling of human arterial networks, which is governed by a hyperbolic system. The uncertain parameters are modelled as random variables and the governing equations for the arterial network therefore become stochastic. This type of stochastic hyperbolic system has not previously been systematically studied, due to the difficulties introduced by the uncertainty, such as a potential change in the mathematical character of the system and the imposition of boundary conditions. We demonstrate how the application of a high-order stochastic collocation method based on the generalized polynomial chaos expansion, combined with a discontinuous Galerkin spectral/hp element discretization in physical space, can successfully simulate this type of hyperbolic system subject to uncertain inputs with bounds. Building upon a numerical study of propagation of uncertainty and sensitivity in a simplified model with a single bifurcation, a systematic parameter sensitivity analysis is conducted on the wave dynamics in a multiple bifurcating human arterial network. Using the physical understanding of the dynamics of pulse waves in these types of
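
    The stochastic collocation idea underlying the study above (evaluate the deterministic solver at a small set of quadrature nodes in the random input and reconstruct output statistics from weighted sums) can be shown for a single uniform random input using Gauss-Legendre nodes, which are the collocation points associated with the Legendre polynomial chaos. The "model" below is a scalar placeholder rather than a pulse-wave solver, and the input bounds are assumptions.

```python
import numpy as np

def model(elasticity):
    """Placeholder deterministic solver output (e.g., a pulse wave speed surrogate)."""
    return np.sqrt(elasticity) * (1.0 + 0.1 * np.sin(elasticity))

# Uncertain input: arterial wall elasticity, uniform on [a, b] (assumed bounds)
a, b = 0.5, 1.5

# Gauss-Legendre quadrature nodes/weights on [-1, 1], mapped to [a, b]
nodes, weights = np.polynomial.legendre.leggauss(7)
x = 0.5 * (b - a) * nodes + 0.5 * (a + b)
w = 0.5 * weights                       # weights normalized so they sum to 1 (uniform density)

# Collocation: run the deterministic model at each node, then form weighted statistics
y = model(x)
mean = np.sum(w * y)
var = np.sum(w * (y - mean) ** 2)
print(f"collocation: mean {mean:.4f}, std {np.sqrt(var):.4f}")

# Cross-check with plain Monte Carlo
rng = np.random.default_rng(8)
y_mc = model(rng.uniform(a, b, 200_000))
print(f"Monte Carlo: mean {y_mc.mean():.4f}, std {y_mc.std():.4f}")
```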

  6. The Impact of Uncertainty on Investment. A Meta-Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koetse, M.J. [Department of Spatial Economics, Vrije Universiteit Amsterdam (Netherlands)]; De Groot, Henri L.F. [Tinbergen Institute, Amsterdam (Netherlands)]; Florax, R.J.G.M. [Department of Agricultural Economics, Purdue University, West Lafayette (United States)]

    2006-07-01

    In this paper we perform a meta-analysis on empirical estimates of the impact of uncertainty on investment. Since the outcomes of primary studies are largely incomparable with respect to the magnitude of the effect, our analysis focuses on the direction and statistical significance of the relationship. The standard approach in this situation is to estimate an ordered probit model on a categorical estimate, defined in terms of the direction of the effect. The estimates are transformed into marginal effects, in order to represent the changes in the probability of finding a negative significant, insignificant, or positive significant estimate. Although a meta-analysis generally does not allow for inferences on the correctness of model specifications in primary studies, our results give clear directions for model building in empirical investment research. For example, not including factor prices in investment models may seriously affect the model outcomes. Furthermore, we find that Q models produce more negative significant estimates than other models do, ceteris paribus. The outcome of a study is also affected by the type of data used in a primary study. Although it is clear that meta-analysis cannot always give decisive insights into the explanations for the variation in empirical outcomes, our meta-analysis shows that we can explain to a large extent why empirical estimates of the investment-uncertainty relationship differ.

  7. Uncertainty analysis of wind-wave predictions in Lake Michigan

    Science.gov (United States)

    Nekouee, Navid; Ataie-Ashtiani, Behzad; Hamidi, Sajad Ahmad

    2016-10-01

    Despite all the improvements in wave and hydrodynamic numerical models, the question arises of how the accuracy of the forcing functions and their inputs affects the results. In this paper, a commonly used third-generation numerical wave model, SWAN, is applied to predict waves in Lake Michigan. Wind data are analyzed to determine the frequency of wind variation over Lake Michigan. Wave prediction uncertainty due to local wind effects is assessed during a period when the wind had a fairly constant speed and direction over the northern and southern basins. The study shows that, despite model calibration for the Lake Michigan area, the model deficiency arises from ignoring wind effects at small scales. The wave predictions also show that small-scale turbulence in the meteorological forcing can increase prediction errors by 38%. Wave frequency and coherence analyses show that both models can predict the wave variation time scale with the same accuracy. An insufficient number of meteorological stations can result in local wind effects being neglected and in discrepancies in current predictions. The uncertainty of numerical wave models due to input uncertainties and model principles should be taken into account in design risk factors.

  8. Eye tracker uncertainty analysis and modelling in real time

    Science.gov (United States)

    Fornaser, A.; De Cecco, M.; Leuci, M.; Conci, N.; Daldoss, M.; Armanini, A.; Maule, L.; De Natale, F.; Da Lio, M.

    2017-01-01

    Techniques for tracking the eyes have been developed over several decades for applications ranging from the military to education, entertainment and clinics. Existing systems generally fall into two categories: precise but intrusive, or comfortable but less accurate. The idea of this work is to calibrate an eye tracker of the second category. In particular, we have estimated the uncertainty both under nominal and under variable operating conditions. We took into consideration different influencing factors such as head movement and rotation, the eyes detected, target position on the screen, illumination, and objects in front of the eyes. Results proved that the 2D uncertainty can be modelled as a circular confidence interval, since there are no stable principal directions in either the systematic or the repeatability effects. This confidence region was also modelled as a function of the current working conditions. In this way we can obtain a value of the uncertainty that is a function of the operating conditions estimated in real time, opening the field to new applications that reconfigure the human-machine interface as a function of the operating conditions. Examples range from reshaping option buttons, dynamically adjusting local zoom, and optimizing speed to regulate interface responsiveness, to the possibility of taking into account the uncertainty associated with a particular interaction. Furthermore, in the analysis of visual scanning patterns, the resulting Point of Regard maps would be associated with proper confidence levels, thus allowing accurate conclusions to be drawn. We conducted an experimental campaign to estimate and validate the overall modelling procedure, obtaining valid results in 86% of the cases.

  9. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

    Full Text Available The present article describes the construction of a stabilographic platform which records a standing patient’s deflection from their point of balance. The constructed device is composed of a toughened glass slab supported by 4 force sensors. The force transducers are connected to a measurement system based on a 24-bit ADC, which acquires the slight body movements of a patient. The data are then transferred to the computer in real time and data analysis is conducted. The article explains the principle of operation as well as the algorithm for the measurement uncertainty of the COP (Centre of Pressure) surface (x, y).
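
    For readers unfamiliar with the quantity being assessed, the following hedged Python sketch shows one common way (not necessarily the authors' algorithm) to compute the centre of pressure from four corner force sensors and to propagate an assumed sensor uncertainty onto the COP coordinates by Monte Carlo; geometry and noise values are illustrative.

```python
import numpy as np

# Minimal sketch: centre of pressure (COP) from four corner force sensors on a
# rectangular plate, with Monte Carlo propagation of the sensor uncertainties.
# Sensor geometry, force readings and noise levels are assumed example values.
half_x, half_y = 0.20, 0.20                      # sensor positions (+/-half_x, +/-half_y) [m]
xs = np.array([ half_x, -half_x, -half_x,  half_x])
ys = np.array([ half_y,  half_y, -half_y, -half_y])

F = np.array([180.0, 190.0, 210.0, 200.0])       # measured corner forces [N] (example)
sigma_F = 0.5                                    # assumed standard uncertainty per sensor [N]

def cop(forces):
    total = forces.sum()
    return np.array([np.dot(forces, xs), np.dot(forces, ys)]) / total

rng = np.random.default_rng(0)
samples = F + rng.normal(0.0, sigma_F, size=(20000, 4))
cops = np.array([cop(f) for f in samples])

print("COP estimate [m]:", cop(F))
print("COP standard uncertainty [m]:", cops.std(axis=0))
```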

  10. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task of stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold, failure is possible). The objective of the stability analysis is then to estimate the failure probability P for SF to be below the specified threshold. When dealing with uncertainties, two facets should be considered as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by e.g., increasing the number of tests (lab tests or in situ surveys), improving the measurement methods or evaluating the calculation procedure with model tests, confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations with a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e

  11. Reducing spatial uncertainty in climatic maps through geostatistical analysis

    Science.gov (United States)

    Pesquer, Lluís; Ninyerola, Miquel; Pons, Xavier

    2014-05-01

    Climatic maps from meteorological stations and geographical co-variables can be obtained through correlative models (Ninyerola et al., 2000)*. Nevertheless, the spatial uncertainty of the resulting maps could be reduced. The present work builds on those approaches, aiming to study how to obtain better results while characterizing spatial uncertainty. The study area is Catalonia (32000 km2), a region with highly variable relief (0 to 3143 m). We have used 217 stations (321 to 1244 mm) to model the annual precipitation in two steps: 1/ multiple regression using geographical variables (elevation, distance to the coast, latitude, etc) and 2/ refinement of the results by adding the spatial interpolation of the regression residuals with inverse distance weighting (IDW), regularized splines with tension (SPT) or ordinary kriging (OK). Spatial uncertainty analysis is based on an independent subsample (test set), randomly selected in previous works. The main contribution of this work is the analysis of this test set as well as the search for an optimal process of division (split) of the stations into two sets, one used to perform the multiple regression and residuals interpolation (fit set), and another used to compute the quality (test set); optimal division should reduce spatial uncertainty and improve the overall quality. Two methods have been evaluated against classical methods (random selection RS and leave-one-out cross-validation LOOCV): selection by Euclidean 2D-distance, and selection by anisotropic 2D-distance combined with a 3D-contribution (suitably weighted) from the most representative independent variable. Both methods define a minimum threshold distance, obtained by variogram analysis, between samples. Main preliminary results for LOOCV, RS (average from 10 executions), Euclidean criterion (EU), and for anisotropic criterion (with 1.1 value, UTMY coordinate has a bit more weight than UTMX) combined with 3D criteria (A3D) (1000 factor for elevation
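
    The two-step mapping procedure (multiple regression on geographic covariates followed by interpolation of the regression residuals) can be sketched in a few lines; the snippet below uses synthetic station data and plain inverse distance weighting, and only illustrates the workflow, not the study's implementation.

```python
import numpy as np

# Sketch of the two-step idea: (1) multiple regression of annual precipitation on
# geographic covariates, (2) inverse distance weighting (IDW) of the regression
# residuals back onto a target location. All data below are synthetic.
rng = np.random.default_rng(1)
n = 217
xy = rng.uniform(0, 100, size=(n, 2))                 # station coordinates [km]
elev = rng.uniform(0, 3000, size=n)                   # elevation [m]
dist_coast = rng.uniform(0, 150, size=n)              # distance to coast [km]
precip = 400 + 0.2 * elev - 1.0 * dist_coast + rng.normal(0, 40, n)

X = np.column_stack([np.ones(n), elev, dist_coast])   # design matrix
beta, *_ = np.linalg.lstsq(X, precip, rcond=None)     # step 1: multiple regression
residuals = precip - X @ beta

def idw(target_xy, power=2.0):
    d = np.linalg.norm(xy - target_xy, axis=1)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    return np.sum(w * residuals) / np.sum(w)          # step 2: residual correction

target = np.array([50.0, 50.0])
target_cov = np.array([1.0, 1500.0, 60.0])            # [1, elevation, distance to coast]
estimate = target_cov @ beta + idw(target)
print(f"Corrected precipitation estimate: {estimate:.1f} mm")
```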

  12. Autonomous Pointing Control of a Large Satellite Antenna Subject to Parametric Uncertainty.

    Science.gov (United States)

    Wu, Shunan; Liu, Yufei; Radice, Gianmarco; Tan, Shujun

    2017-03-10

    With the development of satellite mobile communications, large antennas are now widely used. The precise pointing of the antenna's optical axis is essential for many space missions. This paper addresses the challenging problem of high-precision autonomous pointing control of a large satellite antenna. The pointing dynamics are first formulated. The proportional-derivative feedback and structural filter to perform pointing maneuvers and suppress antenna vibrations are then presented. An adaptive controller to estimate actual system frequencies in the presence of modal parameter uncertainty is proposed. In order to reduce periodic errors, the modified controllers, which include the proposed adaptive controller and an active disturbance rejection filter, are then developed. The system stability and robustness are analyzed and discussed in the frequency domain. Numerical results are finally provided, and the results have demonstrated that the proposed controllers have good autonomy and robustness.

  13. Concise Neural Nonaffine Control of Air-Breathing Hypersonic Vehicles Subject to Parametric Uncertainties

    Directory of Open Access Journals (Sweden)

    Xiangwei Bu

    2017-01-01

    Full Text Available In this paper, a novel simplified neural control strategy is proposed for the longitudinal dynamics of an air-breathing hypersonic vehicle (AHV), directly using nonaffine models instead of affine ones. For the velocity dynamics, an adaptive neural controller is devised based on a minimal-learning parameter (MLP) technique for the sake of decreasing computational loads. The altitude dynamics is rewritten as a pure feedback nonaffine formulation, for which a novel concise neural control approach is achieved without backstepping. The special contributions are that the control architecture is concise and the computational cost is low. Moreover, the exploited controller possesses good practicability since there is no need for affine models. The semiglobal uniform ultimate boundedness of all the closed-loop system signals is guaranteed via Lyapunov stability theory. Finally, simulation results are presented to validate the effectiveness of the investigated control methodology in the presence of parametric uncertainties.

  14. Autonomous Pointing Control of a Large Satellite Antenna Subject to Parametric Uncertainty

    Directory of Open Access Journals (Sweden)

    Shunan Wu

    2017-03-01

    Full Text Available With the development of satellite mobile communications, large antennas are now widely used. The precise pointing of the antenna’s optical axis is essential for many space missions. This paper addresses the challenging problem of high-precision autonomous pointing control of a large satellite antenna. The pointing dynamics are first formulated. The proportional–derivative feedback and structural filter to perform pointing maneuvers and suppress antenna vibrations are then presented. An adaptive controller to estimate actual system frequencies in the presence of modal parameter uncertainty is proposed. In order to reduce periodic errors, the modified controllers, which include the proposed adaptive controller and an active disturbance rejection filter, are then developed. The system stability and robustness are analyzed and discussed in the frequency domain. Numerical results are finally provided, and the results have demonstrated that the proposed controllers have good autonomy and robustness.

  15. Dynamic Simulation, Sensitivity and Uncertainty Analysis of a Demonstration Scale Lignocellulosic Enzymatic Hydrolysis Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Sin, Gürkan

    2014-01-01

    This study presents the uncertainty and sensitivity analysis of a lignocellulosic enzymatic hydrolysis model considering both model and feed parameters as sources of uncertainty. The dynamic model is parametrized for accommodating various types of biomass, and different enzymatic complexes...

  16. Mathematical aspects of the Heisenberg uncertainty principle within local fractional Fourier analysis

    OpenAIRE

    Yang, Xiao-jun; Baleanu, Dumitru; Machado, J.A. Tenreiro

    2013-01-01

    In this paper, we discuss the mathematical aspects of the Heisenberg uncertainty principle within local fractional Fourier analysis. The Schrödinger equation and Heisenberg uncertainty principles are structured within local fractional operators.

  17. Uncertainties analysis made easy using basic electricity analogy

    CERN Document Server

    Cardoso, George C

    2016-01-01

    This paper proposes a battery-resistor circuit to help introductory laboratory students visualize the concepts of experimental measurement uncertainty, sums of uncertainties and uncertainty of the mean value. In the model presented, the uncertainty or noise can be thought of as noise in a loudspeaker, making the analogy simple to understand. The mathematics used is simple, requires no knowledge of statistics and provides correct expressions.
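
    The two rules the analogy is meant to convey can also be checked numerically: independent uncertainties add in quadrature, and the uncertainty of the mean of N readings scales as sigma/sqrt(N). The short Python sketch below verifies both with simulated noise; it is an editorial illustration, not part of the paper's circuit model.

```python
import numpy as np

# Numerical check of the two textbook rules behind the analogy:
# (1) independent uncertainties add in quadrature,
# (2) the uncertainty of the mean of N readings is sigma / sqrt(N).
rng = np.random.default_rng(42)
sigma_a, sigma_b, N = 0.3, 0.4, 25

a = rng.normal(0.0, sigma_a, 200_000)
b = rng.normal(0.0, sigma_b, 200_000)
print("std of a+b (simulated):", (a + b).std())
print("quadrature sum        :", np.hypot(sigma_a, sigma_b))

readings = rng.normal(10.0, sigma_a, (200_000, N))
print("std of the mean (simulated):", readings.mean(axis=1).std())
print("sigma / sqrt(N)            :", sigma_a / np.sqrt(N))
```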

  18. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  19. Geoengineering to Avoid Overshoot: An Analysis of Uncertainty

    Science.gov (United States)

    Tanaka, Katsumasa; Cho, Cheolhung; Krey, Volker; Patt, Anthony; Rafaj, Peter; Rao-Skirbekk, Shilpa; Wagner, Fabian

    2010-05-01

    ., 2009) is employed to calculate climate responses including associated uncertainty and to estimate geoengineering profiles to cap the warming at 2°C relative to preindustrial. The inversion setup for the model ACC2 is used to estimate the uncertain parameters (e.g. climate sensitivity) against associated historical observations (e.g. global-mean surface air temperature). Our preliminary results show that under climate and scenario uncertainties, a geoengineering intervention to avoid an overshoot would be of medium intensity in the latter half of this century (≈ one Mt. Pinatubo eruption every 4 years in terms of stratospheric sulfur injections). The start year of geoengineering intervention does not significantly influence the long-term geoengineering profile. However, a geoengineering intervention of medium intensity could bring about substantial environmental side effects such as the destruction of stratospheric ozone. Our results point to the necessity to pursue persistently mainstream mitigation efforts. 2) Pollution Abatement and Geoengineering The second study examines the potential of geoengineering combined with clean air policy. A drastic air pollution abatement might result in an abrupt warming because it would suddenly remove the tropospheric aerosols which partly offset the background global warming (e.g. Andreae et al, 2005, Raddatz and Tanaka, 2010). This study investigates the magnitude of unrealized warming under a range of policy assumptions and associated uncertainties. Then the profile of geoengineering is estimated to suppress the warming that would accompany clean air policy. This study is the first attempt to explore uncertainty in the warming caused by clean air policy - Kloster et al. (2009), which assesses regional changes in climate and the hydrological cycle, did not, however, include the associated uncertainties in the analysis. A variety of policy assumptions will be devised to represent various degrees of air pollution abatement. These

  20. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    Science.gov (United States)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then, Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.
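
    Variance-based global sensitivity analysis of the kind used in the second step can be illustrated with a simple pick-freeze Monte Carlo estimator of first-order Sobol' indices on a toy model; the sketch below is not the NASA-LUQC generic transport model, and all inputs are assumed.

```python
import numpy as np

# Minimal pick-freeze sketch of first-order Sobol' (variance-based) sensitivity
# indices for a toy model. It only illustrates how "important" epistemic variables
# could be ranked by importance before refinement.
rng = np.random.default_rng(0)
n, d = 100_000, 4

def model(x):                                    # toy response with unequal sensitivities
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + 0.5 * x[:, 2] + 0.1 * x[:, 3] + x[:, 0] * x[:, 1]

A, B = rng.uniform(0, 1, (n, d)), rng.uniform(0, 1, (n, d))
yA, yB = model(A), model(B)
var_y = np.concatenate([yA, yB]).var()

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                          # "freeze" input i at the second sample
    yABi = model(ABi)
    S_i = 1.0 - 0.5 * np.mean((yB - yABi) ** 2) / var_y   # Jansen first-order estimator
    print(f"S_{i+1} ~ {S_i:.2f}")
```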

  1. Reducing the uncertainty in robotic machining by modal analysis

    Science.gov (United States)

    Alberdi, Iñigo; Pelegay, Jose Angel; Arrazola, Pedro Jose; Ørskov, Klaus Bonde

    2017-10-01

    The use of industrial robots for machining could lead to high cost and energy savings for the manufacturing industry. Machining robots offer several advantages with respect to CNC machines, such as flexibility, wide working space, adaptability and relatively low cost. However, there are some drawbacks that are preventing a widespread adoption of robotic solutions, namely lower stiffness, vibration/chatter problems and lower accuracy and repeatability. Normally, due to these issues, conservative cutting parameters are chosen, resulting in a low material removal rate (MRR). In this article, an example of a modal analysis of a robot is presented. For that purpose the Tap-testing technology is introduced, which aims at maximizing productivity, reducing the uncertainty in the selection of cutting parameters and offering a stable process free from chatter vibrations.

  2. Reduction of uncertainties in probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choun, Young Sun; Choi, In Kil [Korea Atomic Energy Research Institute, Taejon (Korea)]

    1999-02-01

    An integrated research program for the reduction of conservatism and uncertainties in PSHA in Korea was performed. The research consisted of five technical task areas as follows: Task 1: Earthquake Catalog Development for PSHA. Task 2: Evaluation of Seismicity and Tectonics of the Korea Region. Task 3: Development of Ground Motion Relationships. Task 4: Improvement of PSHA Modelling Methodology. Task 5: Development of Seismic Source Interpretations for the region of Korea for Inputs to PSHA. A series of tests on an ancient wooden house and an analysis of medium-size earthquakes in Korea were performed intensively. Significant improvements, especially in the estimation of historical earthquakes, ground motion attenuation, and seismic source interpretations, were made through this study. 314 refs., 180 figs., 54 tabs. (Author)

  3. Property Uncertainty Analysis and Methods for Optimal Working Fluids of Thermodynamic Cycles

    DEFF Research Database (Denmark)

    Frutiger, Jerome

    heat from spray-drying air in dairy industries. The novel reverse engineering approach provides a valid alternative to computationally demanding optimization approaches and allows property uncertainties to be taken into account. The outcome of this thesis asserts that property uncertainties should...... procedure for the propagation of property uncertainties through the cycle process onto the model output uncertainty, and 4) novel strategies for the selection of working fluids under property uncertainties, in particular a new reverse engineering approach based on sampling and uncertainty concepts...... property uncertainties is a vital tool for model analysis and fluid selection. In the second fluid selection study, the novel reverse engineering approach based on sampling techniques and uncertainty analysis is applied to identify suitable working fluids for an industrial heat pump system, used to recover

  4. Quality in environmental science for policy: assessing uncertainty as a component of policy analysis

    NARCIS (Netherlands)

    Maxim, L.; van der Sluijs, J.P.

    2011-01-01

    The sheer number of attempts to define and classify uncertainty reveals an awareness of its importance in environmental science for policy, though the nature of uncertainty is often misunderstood. The interdisciplinary field of uncertainty analysis is unstable; there are currently several incomplete

  5. Practical issues in handling data input and uncertainty in a budget impact analysis

    NARCIS (Netherlands)

    M.J.C. Nuijten (Mark); T. Mittendorf (Thomas); U. Persson (Ulf)

    2011-01-01

    The objective of this paper was to address the importance of dealing systematically and comprehensively with uncertainty in a budget impact analysis (BIA) in more detail. The handling of uncertainty in health economics was used as a point of reference for addressing the uncertainty in a

  6. Uncertainty with friction parameters and impact on risk analysis

    Directory of Open Access Journals (Sweden)

    Willis T.D.M.

    2016-01-01

    This uncertainty is also analysed in a wider Monte Carlo framework, comparing other sources of uncertainty in flood modelling, including hydrological input uncertainty, DTM uncertainty and the uncertainty associated with the computational model used. Three test cases with different hydraulic properties are used to provide generic conclusions: two urban test cases with transcritical flow conditions, and a river overtopping event in a rural/urban domain. The model results are analysed with typical model evaluation techniques, such as binary flood extent comparison and depth comparison measures, as well as measures of exposure, here defined as the cost of damage associated with modelled water depths. The results demonstrate that modelling uncertainty is reduced by increasing the number of frictional surfaces in the model, indicating that through marginal pre-processing effort a better representation of microscale hydraulics can be achieved, particularly in urban areas. Model results are also far more sensitive to uniform friction values, which also show an increased level of uncertainty, even in large-scale modelling. The uncertainty associated with friction values, though, is shown to be relatively small compared to the uncertainty of the numerical scheme, and also displays significant parameter interaction.

  7. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas.

    Science.gov (United States)

    Bedford, Tim; Daneshkhah, Alireza; Wilson, Kevin J

    2016-04-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing work of Cooke, Bedford, Kurowica, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that have such good levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets. © 2015 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
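
    As a much simpler stand-in for the copula machinery discussed above (a single bivariate Gaussian copula rather than a vine, and not the minimum-information construction of the paper), the following Python sketch shows the basic mechanics of imposing a dependence structure on two arbitrary marginals; all distributions and the correlation value are assumed.

```python
import numpy as np
from scipy import stats

# Hedged sketch: joint uncertainty modelling with one bivariate Gaussian copula.
# A vine couples many such bivariate (possibly conditional) copulas into a
# higher-dimensional model; this stand-in only shows the basic copula mechanics
# on two financial-style risk factors with assumed marginals and correlation.
rng = np.random.default_rng(7)
rho = 0.6                                           # assumed copula correlation
cov = np.array([[1.0, rho], [rho, 1.0]])

z = rng.multivariate_normal(np.zeros(2), cov, size=50_000)
u = stats.norm.cdf(z)                               # uniform margins = the copula sample

# Impose arbitrary (assumed) marginal distributions on each risk factor.
loss_1 = stats.lognorm(s=0.5, scale=100.0).ppf(u[:, 0])
loss_2 = stats.gamma(a=2.0, scale=40.0).ppf(u[:, 1])

tau, _ = stats.kendalltau(loss_1, loss_2)           # rank dependence is preserved
total_q99 = np.quantile(loss_1 + loss_2, 0.99)
print(f"Kendall's tau ~ {tau:.2f}, 99% quantile of total loss ~ {total_q99:.1f}")
```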

  8. [Parameter uncertainty analysis for urban rainfall runoff modelling].

    Science.gov (United States)

    Huang, Jin-Liang; Lin, Jie; Du, Peng-Fei

    2012-07-01

    An urban watershed in Xiamen was selected for a parameter uncertainty analysis of urban stormwater runoff modeling, in terms of parameter identification and sensitivity analysis, based on the storm water management model (SWMM) using Monte Carlo sampling and the regionalized sensitivity analysis (RSA) algorithm. Results show that Dstore-Imperv, Dstore-Perv and Curve Number (CN) are the identifiable parameters with the larger K-S values in the hydrological and hydraulic module, and the ranking of K-S values in this module is Dstore-Imperv > CN > Dstore-Perv > N-Perv > conductivity > Con-Mann > N-Imperv. With regard to the water quality module, the parameters of the exponential washoff model, including Coefficient and Exponent, and the Max. Buildup parameter of the saturation buildup model in the three land cover types are the identifiable parameters with the larger K-S values. In comparison, the K-S value of the rate constant in the three land use/cover types is smaller than those of Max. Buildup, Coefficient and Exponent.
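
    The Monte Carlo/RSA workflow with Kolmogorov-Smirnov statistics can be sketched as follows; the snippet replaces SWMM with a toy surrogate and uses assumed parameter ranges, so it only mirrors the structure of the analysis, not its results.

```python
import numpy as np
from scipy.stats import ks_2samp

# Sketch of regionalized sensitivity analysis (RSA) with Kolmogorov-Smirnov (K-S)
# statistics on a toy runoff model; the real study runs SWMM, whereas here
# 'simulate' is a stand-in function and the parameter ranges are assumed.
rng = np.random.default_rng(3)
n = 5000
params = {
    "Dstore_Imperv": rng.uniform(0.2, 5.0, n),   # depression storage, impervious [mm]
    "Dstore_Perv":   rng.uniform(2.0, 10.0, n),  # depression storage, pervious   [mm]
    "CN":            rng.uniform(60.0, 95.0, n), # curve number [-]
    "N_Imperv":      rng.uniform(0.01, 0.03, n), # Manning's n, impervious [-]
}

def simulate(i):                                 # toy surrogate for peak runoff [mm/h]
    return 0.5 * params["CN"][i] - 2.0 * params["Dstore_Imperv"][i] + rng.normal(0, 2)

observed = 35.0
error = np.abs(np.array([simulate(i) for i in range(n)]) - observed)
behavioural = error < np.quantile(error, 0.2)    # best 20% of runs

for name, values in params.items():
    ks = ks_2samp(values[behavioural], values[~behavioural]).statistic
    print(f"{name:>14s}: K-S = {ks:.3f}")
```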

  9. Uncertainty and Sensitivity Analysis Results Obtained in the 1996 Performance Assessment for the Waste Isolation Pilot Plant

    Energy Technology Data Exchange (ETDEWEB)

    Bean, J.E.; Berglund, J.W.; Davis, F.J.; Economy, K.; Garner, J.W.; Helton, J.C.; Johnson, J.D.; MacKinnon, R.J.; Miller, J.; O'Brien, D.G.; Ramsey, J.L.; Schreiber, J.D.; Shinta, A.; Smith, L.N.; Stockman, C.; Stoelzel, D.M.; Vaughn, P.

    1998-09-01

    The Waste Isolation Pilot Plant (WIPP) is located in southeastern New Mexico and is being developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. A detailed performance assessment (PA) for the WIPP was carried out in 1996 and supports an application by the DOE to the U.S. Environmental Protection Agency (EPA) for the certification of the WIPP for the disposal of TRU waste. The 1996 WIPP PA uses a computational structure that maintains a separation between stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, with stochastic uncertainty arising from the many possible disruptions that could occur over the 10,000 yr regulatory period that applies to the WIPP and subjective uncertainty arising from the imprecision with which many of the quantities required in the PA are known. Important parts of this structure are (1) the use of Latin hypercube sampling to incorporate the effects of subjective uncertainty, (2) the use of Monte Carlo (i.e., random) sampling to incorporate the effects of stochastic uncertainty, and (3) the efficient use of the necessarily limited number of mechanistic calculations that can be performed to support the analysis. The use of Latin hypercube sampling generates a mapping from imprecisely known analysis inputs to analysis outcomes of interest that provides both a display of the uncertainty in analysis outcomes (i.e., uncertainty analysis) and a basis for investigating the effects of individual inputs on these outcomes (i.e., sensitivity analysis). The sensitivity analysis procedures used in the PA include examination of scatterplots, stepwise regression analysis, and partial correlation analysis. Uncertainty and sensitivity analysis results obtained as part of the 1996 WIPP PA are presented and discussed. Specific topics considered include two-phase flow in the vicinity of the repository, radionuclide release from the repository, fluid flow and radionuclide
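
    Latin hypercube sampling, used here to propagate the subjective (epistemic) uncertainty, can be implemented compactly; the sketch below is a generic LHS routine with hypothetical parameter ranges and is not drawn from the actual WIPP PA input set.

```python
import numpy as np

# Minimal Latin hypercube sampling (LHS) sketch: one stratified sample per
# equal-probability bin for each imprecisely known (epistemic) input.
# The parameter names and ranges below are illustrative placeholders.
def latin_hypercube(n_samples, bounds, rng):
    d = len(bounds)
    u = (rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
         + rng.uniform(size=(n_samples, d))) / n_samples      # stratified uniforms in [0,1)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

rng = np.random.default_rng(0)
bounds = [(1e-14, 1e-11),   # e.g. a permeability-like parameter (hypothetical range)
          (0.05, 0.30),     # e.g. a porosity-like parameter (hypothetical range)
          (0.1, 10.0)]      # e.g. a solubility-like parameter (hypothetical range)
sample = latin_hypercube(100, bounds, rng)
print(sample.shape)          # (100, 3): each column covers every 1/100 probability stratum
```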

  10. Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh

    Science.gov (United States)

    Mortuza, M. R.; Demissie, Y.; Li, H. Y.

    2014-12-01

    The increased frequency of extreme precipitation events, especially those with multiday durations, is responsible for recent urban floods and the associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimation of the frequency of occurrence of such extreme precipitation events is thus important for designing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to 1-day, 2-day and 5-day annual maximum precipitation series due to its advantages over at-site estimation. The regional frequency approach pools the information from climatologically similar sites to make reliable estimates of quantiles, given that the pooling group is homogeneous and of reasonable size. We have used the region of influence (ROI) approach along with a homogeneity measure based on L-moments to identify homogeneous pooling groups for each site. Five 3-parameter distributions (i.e., Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type Three, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and the historical data are quantified using the Bayesian Model Averaging and Balanced Bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as to explore spatio-temporal variations of extreme precipitation and the associated risk.

  11. The Subject Analysis of Payment Systems Characteristics

    Directory of Open Access Journals (Sweden)

    Korobeynikova Olga Mikhaylovna

    2015-09-01

    Full Text Available The article deals with the analysis of payment systems aimed at identifying the categorical terminological apparatus, substantiating their specific features and revealing the impact of payment systems on the state of money turnover. On the basis of the subject analysis, the author formulates the definitions of a payment system (characterized by increasing speed of effecting payments, by the reduction of costs, by a high degree of payment convenience for the subjects of transactions, by security of payments, by an acceptable level of risk and by social efficiency), a national payment system, and a local payment system (characterized by the growth of economic and social efficiency of the systems' participants and by the optimization of money turnover on the basis of saving transaction costs and increasing the speed of money flows within the local payment systems). According to economic level, payment systems are divided into macrosystems (national payment systems), mezosystems (payment systems localized on an operational and territorial basis) and microsystems (payments by individual economic subjects). The establishment of the qualitative features of payment systems, which is the basis of the author’s terminological interpretation, made it possible to reveal the cause-effect relations of the payment systems' influence on the state of money turnover of the involved subjects, and on the economy as a whole. The result of the present research consists in revealing the payment systems' influence on the state of money turnover, which is significant: at the state and regional level – in the optimization of budget and inter-budgetary relations, in acceleration of the money turnover, in deceleration of the money supply and inflation rate, and in a reduced need for money emission; at the level of economic entities – in accelerating the money turnover and accounts receivable, in the reduction of debit and credit loans, and in the growth of profit (turnover); at the household level – in

  12. Stochastic analysis of the recharge uncertainty of a regional aquifer in extreme arid conditions

    OpenAIRE

    Rojas, Rodrigo; Dassargues, Alain

    2006-01-01

    The Pampa del Tamarugal Aquifer (PTA) is an important source of groundwater in northern Chile. Since the study area is situated in the Atacama Desert, the estimation of groundwater recharge based on conventional hydrological methods is subject to large uncertainties. To account for variations in the groundwater balance, caused by uncertainties in the average recharge rates, randomly generated recharge values with different levels of uncertainty are simulated using a groundwater flow model. Re...

  13. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  14. Uncertainty Analysis in MRI-based Polymer Gel Dosimetry.

    Science.gov (United States)

    Keshtkar, M; Zahmatkesh, M H; Montazerabadi, A R

    2017-09-01

    Polymer gel dosimeters combined with magnetic resonance imaging (MRI) can be used for dose verification of advanced radiation therapy techniques. However, the uncertainty of the dose map measured by a gel dosimeter should be known. The purpose of this study is to investigate the uncertainty related to the calibration curve and the MRI protocol for MAGIC (Methacrylic and Ascorbic acid in Gelatin Initiated by Copper) gel, and finally ways of optimizing the MRI protocol are introduced. MAGIC gel was prepared according to the instructions of Fong et al. The gels were poured into calibration vials and irradiated by 18 MV photons. A 1.5 Tesla MRI scanner was used for readout. Finally, the uncertainty of the measured dose was calculated. Results show that for the MAGIC polymer gel dosimeter, at low doses the estimated uncertainty is high (≈ 18.96% for 1 Gy) but it reduces to approximately 4.17% for 10 Gy. Also, with increasing dose, the uncertainty of the measured dose decreases non-linearly. At low doses the most significant uncertainty is σR0 (uncertainty of the intercept), and at high doses σa (uncertainty of the slope). MRI protocol parameters influence the signal-to-noise ratio (SNR). The most important source of uncertainty is the uncertainty of R2. Hence, the MRI protocol and its parameters should be optimized. At low doses the estimated uncertainty is high and it reduces with increasing dose. It is suggested that in relative dosimetry, gels are irradiated with high doses in the linear range of the given gel dosimeter and the result is then scaled down to the desired dose range.
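
    A hedged sketch of the calibration-curve propagation step: assuming a linear dose response R2 = R2_0 + a*D, so that D = (R2 - R2_0)/a, the dose and its standard uncertainty follow from first-order propagation of the uncertainties of the intercept, slope and measured R2. The numerical values below are illustrative, not the study's calibration data.

```python
import numpy as np

# First-order propagation for a linear gel-dosimeter calibration R2 = R2_0 + a * D.
# All slope/intercept values and uncertainties are assumed, illustrative numbers.
R2_0, sigma_R2_0 = 1.10, 0.03     # intercept [1/s] and its uncertainty (assumed)
a,    sigma_a    = 0.25, 0.01     # slope [1/(s*Gy)] and its uncertainty (assumed)
sigma_R2         = 0.04           # uncertainty of a measured R2 value (assumed)

def dose_and_uncertainty(R2):
    D = (R2 - R2_0) / a
    # Three uncorrelated contributions: measured R2, intercept, and slope.
    var_D = (sigma_R2**2 + sigma_R2_0**2) / a**2 + (D / a)**2 * sigma_a**2
    return D, np.sqrt(var_D)

for R2 in (1.35, 2.35, 3.60):     # low, medium and high dose readings
    D, sD = dose_and_uncertainty(R2)
    print(f"R2 = {R2:.2f} 1/s -> D = {D:.2f} Gy +/- {sD:.2f} Gy ({100 * sD / D:.1f} %)")
```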

  15. The uncertainty in physical measurements an introduction to data analysis in the physics laboratory

    CERN Document Server

    Fornasini, Paolo

    2008-01-01

    All measurements of physical quantities are affected by uncertainty. Understanding the origin of uncertainty, evaluating its extent and suitably taking it into account in data analysis is essential for assessing the degree of accuracy of phenomenological relationships and physical laws in both scientific research and technological applications. The Uncertainty in Physical Measurements: An Introduction to Data Analysis in the Physics Laboratory presents an introduction to uncertainty and to some of the most common procedures of data analysis. This book will serve the reader well by filling the gap between tutorial textbooks and highly specialized monographs. The book is divided into three parts. The first part is a phenomenological introduction to measurement and uncertainty: properties of instruments, different causes and corresponding expressions of uncertainty, histograms and distributions, and unified expression of uncertainty. The second part contains an introduction to probability theory, random variable...

  16. Incorporating parametric uncertainty into population viability analysis models

    Science.gov (United States)

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
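
    The two-loop structure described above can be sketched directly: the outer loop draws a growth rate from its sampling distribution (parametric uncertainty), while the inner loop adds year-to-year environmental variation. All values below are hypothetical and are not the piping plover estimates.

```python
import numpy as np

# Minimal sketch of the two-loop idea: parametric uncertainty is drawn once per
# replicate (outer loop), temporal/environmental variation every time step (inner
# loop). Growth rates, variances and thresholds are hypothetical placeholders.
rng = np.random.default_rng(11)
n_reps, n_years, N0, quasi_ext = 2000, 50, 200, 20

mean_r, se_r = 0.00, 0.03        # estimated mean growth rate and its standard error (assumed)
sigma_temporal = 0.15            # year-to-year (environmental) standard deviation (assumed)

extinct = 0
for _ in range(n_reps):
    r_rep = rng.normal(mean_r, se_r)             # outer loop: parametric uncertainty
    N = float(N0)
    for _ in range(n_years):
        r_t = rng.normal(r_rep, sigma_temporal)  # inner loop: temporal variance
        N *= np.exp(r_t)
        if N < quasi_ext:
            extinct += 1
            break

print(f"Quasi-extinction probability ~ {extinct / n_reps:.2f}")
```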

  17. Theoretical Analysis of Positional Uncertainty in Direct Georeferencing

    Science.gov (United States)

    Coskun Kiraci, Ali; Toz, Gonul

    2016-10-01

    A GNSS/INS system, composed of a Global Navigation Satellite System and an Inertial Navigation System, can provide orientation parameters directly from the observations collected during the flight. Thus, orientation parameters can be obtained by the GNSS/INS integration process without any need for aerotriangulation after the flight. In general, positional uncertainty can be estimated with known coordinates of Ground Control Points (GCP), which require field work such as marker construction and GNSS measurement, adding cost to the project. Here the question arises of what the theoretical uncertainty of point coordinates should be, depending on the uncertainties of the orientation parameters. In this study the contribution of each orientation parameter to the positional uncertainty is examined, and the theoretical positional uncertainty is computed without GCP measurement for direct georeferencing, using a graphical user interface developed in MATLAB.

  18. THEORETICAL ANALYSIS OF POSITIONAL UNCERTAINTY IN DIRECT GEOREFERENCING

    Directory of Open Access Journals (Sweden)

    A. C. Kiraci

    2016-10-01

    Full Text Available A GNSS/INS system, composed of a Global Navigation Satellite System and an Inertial Navigation System, can provide orientation parameters directly from the observations collected during the flight. Thus, orientation parameters can be obtained by the GNSS/INS integration process without any need for aerotriangulation after the flight. In general, positional uncertainty can be estimated with known coordinates of Ground Control Points (GCP), which require field work such as marker construction and GNSS measurement, adding cost to the project. Here the question arises of what the theoretical uncertainty of point coordinates should be, depending on the uncertainties of the orientation parameters. In this study the contribution of each orientation parameter to the positional uncertainty is examined, and the theoretical positional uncertainty is computed without GCP measurement for direct georeferencing, using a graphical user interface developed in MATLAB.

  19. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  20. Uncertainty analysis for estimates of the first indirect aerosol effect

    Directory of Open Access Journals (Sweden)

    Y. Chen

    2005-01-01

    Full Text Available The IPCC has stressed the importance of producing unbiased estimates of the uncertainty in indirect aerosol forcing, in order to give policy makers as well as research managers an understanding of the most important aspects of climate change that require refinement. In this study, we use 3-D meteorological fields together with a radiative transfer model to examine the spatially-resolved uncertainty in estimates of the first indirect aerosol forcing. The global mean forcing calculated in the reference case is -1.30 Wm-2. Uncertainties in the indirect forcing associated with aerosol and aerosol precursor emissions, aerosol mass concentrations from different chemical transport models, aerosol size distributions, the cloud droplet parameterization, the representation of the in-cloud updraft velocity, the relationship between effective radius and volume mean radius, cloud liquid water content, cloud fraction, and the change in the cloud drop single scattering albedo due to the presence of black carbon are calculated. The aerosol burden calculated by chemical transport models and the cloud fraction are found to be the most important sources of uncertainty. Variations in these parameters cause an underestimation or overestimation of the indirect forcing compared to the base case by more than 0.6 Wm-2. Uncertainties associated with aerosol and aerosol precursor emissions, uncertainties in the representation of the aerosol size distribution (including the representation of the pre-industrial size distribution), and uncertainties in the representation of the cloud droplet spectral dispersion effect cause uncertainties in the global mean forcing of 0.2~0.6 Wm-2. There are significant regional differences in the uncertainty associated with the first indirect forcing, with the largest uncertainties in industrial regions (North America, Europe, East Asia), followed by those in the major biomass burning regions.

  1. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  2. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Directory of Open Access Journals (Sweden)

    Villemereuil Pierre de

    2012-06-01

    Full Text Available Abstract Background Uncertainty in comparative analyses can come from at least two sources: (a) phylogenetic uncertainty in the tree topology or branch lengths, and (b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible

  3. Uncertainty analysis for the k-ε model of turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Pelletier, D.; Etienne, S.; Turgeon, E. [Ecole Polytechnique de Montreal, Dept. de Genie Mechanique, Montreal, Quebec (Canada)]; Borggaard, J. [Virginia Tech, Dept. of Mathematics, Interdisciplinary Center for Applied Mathematics, Blacksburg, Virginia (United States)]

    2002-07-01

    A general continuous sensitivity equation method is used to perform uncertainty analyses of the standard k-ε model of turbulence with wall functions. Various parameters of the flow model are studied (boundary conditions, closure coefficients etc.). The methodology is applied to flow over a backward facing step. The methodology is also applied to flow over a NACA0012 airfoil. Uncertainty in the angle of attack is cascaded through the CFD simulation to produce uncertainty intervals on the predictions of the lift, drag and moment coefficients. (author)

  4. Uncertainty Analysis of Multi-Model Flood Forecasts

    Directory of Open Access Journals (Sweden)

    Erich J. Plate

    2015-12-01

    Full Text Available This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf), calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdf) by means of Bayes' formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any set of two forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: forecast with no model, with a persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determination of dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach based on transforming observed probability distributions of discharges and forecasts into normal distributions is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved if Weibull-distributed basic data are converted into normally distributed variables.
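
    For the normal-prior case noted at the end of the abstract, combining two structurally different forecasts reduces to a multiple linear regression of the observed discharge on the model outputs; the sketch below demonstrates this on synthetic data rather than Mekong records at Stung Treng.

```python
import numpy as np

# Sketch: combining two structurally different discharge forecasts by multiple
# linear regression of the observed discharge on both model outputs.
# The discharge series and model errors below are synthetic placeholders.
rng = np.random.default_rng(5)
n = 400
q_true = 10_000 + 3_000 * np.sin(np.linspace(0, 8 * np.pi, n)) + rng.normal(0, 500, n)
f_persistence = np.roll(q_true, 1) + rng.normal(0, 900, n)       # model 1: persistence-like
f_rainfall_runoff = q_true + rng.normal(0, 700, n)               # model 2: rainfall-runoff-like

X = np.column_stack([np.ones(n), f_persistence, f_rainfall_runoff])
beta, *_ = np.linalg.lstsq(X, q_true, rcond=None)
combined = X @ beta

for name, f in [("persistence", f_persistence),
                ("rainfall-runoff", f_rainfall_runoff),
                ("combined", combined)]:
    rmse = np.sqrt(np.mean((f - q_true) ** 2))
    print(f"{name:>16s}: RMSE = {rmse:7.1f} m3/s")
```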

  5. Uncertainty of measurement and clinical value of semen analysis: has standardisation through professional guidelines helped or hindered progress?

    Science.gov (United States)

    Tomlinson, M J

    2016-09-01

    This article suggests that diagnostic semen analysis has no more clinical value today than it had 25-30 years ago, and that both the confusion surrounding its evidence base (in terms of relationship with conception) and the low level of confidence in the clinical setting are attributable to an associated high level of 'uncertainty'. Consideration of the concept of measurement uncertainty is mandatory for medical laboratories applying for the ISO15189 standard. It is evident that the entire semen analysis process is prone to error at every step, from specimen collection to the reporting of results, which serves to compound the uncertainty associated with diagnosis or prognosis. Perceived adherence to published guidelines for the assessment of sperm concentration, motility and morphology does not guarantee a reliable and reproducible test result. Moreover, the high level of uncertainty associated with manual sperm motility and morphology assessment can be attributed to subjectivity and the lack of a traceable standard. This article describes where and why uncertainty exists and suggests that semen analysis will continue to be of limited value until it is more adequately considered and addressed. Although professional guidelines for good practice have provided the foundations for testing procedures for many years, the risk in following rather prescriptive guidance to the letter is that unless it is based on an overwhelmingly firm evidence base, the quality of semen analysis will remain poor and the progress towards the development of more innovative methods for investigating male infertility will be slow. © 2016 American Society of Andrology and European Academy of Andrology.

  6. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    OpenAIRE

    J. Florian Wellmann

    2013-01-01

    The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are ...

  7. ANALYSIS OF MEASUREMENT UNCERTAINTIES IN THE NULLING TEST FOR AIR LEAKAGE FROM RESIDENTIAL DUCTS.

    Energy Technology Data Exchange (ETDEWEB)

    ANDREWS,J.W.

    2001-04-01

    An analysis of measurement uncertainties in a recently proposed method of measuring air leakage in residential duct systems has been carried out. The uncertainties in supply and return leakage rates are expressed in terms of the value of the envelope leakage flow coefficient and the uncertainties in measured pressures and air flow rates. Results of the analysis are compared with data published by two research groups.

  8. Uncertainty analysis of complex hydro-biogeochemical models

    OpenAIRE

    Houska, Tobias

    2017-01-01

    This thesis is about complex hydro-biogeochemical models and their practical applications. Several modelling practices and their associated uncertainty are investigated in this joint project of the working groups of Prof. Dr. Lutz Breuer, Justus Liebig University Giessen, and Prof. Dr. Klaus Butterbach-Bahl at Karlsruhe Institute of Technology. The aim of the project is to develop strategies for reducing the climate footprint of agricultural production and to quantify uncertainties of model-...

  9. Analysis of Uncertainty in Dynamic Processes Development of Banks Functioning

    Directory of Open Access Journals (Sweden)

    Aleksei V. Korovyakovskii

    2013-01-01

    Full Text Available The paper offers an approach to estimating a measure of uncertainty in the dynamic processes of bank functioning, using statistical data on indicators of different banking operations. To calculate the measure of uncertainty, phase images of the relevant sets of statistical data are considered. It is shown that the form of the phase image of the studied data sets can serve as a basis for estimating the measure of uncertainty in the dynamics of bank functioning. A set of analytical characteristics is proposed to formalize the description of the phase image form. These characteristics account for the unevenness of changes in the values of the studied data sets, which is one of the ways in which uncertainty manifests itself in the development of dynamic processes. Invariant estimates of the measure of uncertainty in the dynamic processes of bank functioning, robust to significant changes in the absolute values of the same indicators across different banks, were obtained. Example calculations of the measure of uncertainty for the dynamic processes of specific banks are presented.

  10. UNCERTAINTY PROPAGATION ANALYSIS FOR YONGGWANG NUCLEAR UNIT 4 BY MCCARD/MASTER CORE ANALYSIS SYSTEM

    Directory of Open Access Journals (Sweden)

    HO JIN PARK

    2014-06-01

    Full Text Available This paper concerns estimating uncertainties of the core neutronics design parameters of power reactors by direct sampling method (DSM) calculations based on the two-step McCARD/MASTER design system in which McCARD is used to generate the fuel assembly (FA) homogenized few group constants (FGCs), while MASTER is used to conduct the core neutronics design computation. It presents an extended application of the uncertainty propagation analysis method originally designed for uncertainty quantification of the FA FGCs as a way to produce the covariances between the FGCs of any pair of FAs comprising the core, or the covariance matrix of the FA FGCs required for random sampling of the FA FGCs input sets into direct sampling core calculations by MASTER. For illustrative purposes, the uncertainties of core design parameters such as the effective multiplication factor (keff), normalized FA power densities, power peaking factors, etc. for the beginning of life (BOL) core of Yonggwang nuclear unit 4 (YGN4) at the hot zero power and all rods out are estimated by the McCARD/MASTER-based DSM computations. The results are compared with those from the uncertainty propagation analysis method based on the McCARD-predicted sensitivity coefficients of nuclear design parameters and the cross section covariance data.
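
    The essence of the direct sampling method described above is to draw correlated input sets from a covariance matrix and run the downstream core calculation on each set. The following sketch shows that pattern generically, not the McCARD/MASTER workflow itself; the group constants, covariance values and the toy core_solver response are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-group constants for one FA: [D1, D2, Sigma_a1, Sigma_a2, nuSigma_f2]
fgc_mean = np.array([1.4, 0.38, 0.010, 0.085, 0.135])
rel_sd = np.array([0.01, 0.01, 0.02, 0.02, 0.015])       # assumed relative 1-sigma values
cov = np.diag((fgc_mean * rel_sd) ** 2)                  # stand-in covariance matrix

def core_solver(fgc):
    """Placeholder for the downstream core calculation (e.g., a nodal solver)."""
    # Purely illustrative response: keff rises with nuSigma_f2, falls with Sigma_a2.
    return (1.0 + 0.5 * (fgc[4] - fgc_mean[4]) / fgc_mean[4]
                - 0.3 * (fgc[3] - fgc_mean[3]) / fgc_mean[3])

samples = rng.multivariate_normal(fgc_mean, cov, size=500)   # direct sampling of inputs
keff = np.array([core_solver(s) for s in samples])
print(f"keff mean = {keff.mean():.5f}, 1-sigma = {keff.std(ddof=1):.5f}")
```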

  11. Application of Uncertainty and Sensitivity Analysis to a Kinetic Model for Enzymatic Biodiesel Production

    DEFF Research Database (Denmark)

    Price, Jason Anthony; Nordblad, Mathias; Woodley, John

    2014-01-01

    This paper demonstrates the added benefits of using uncertainty and sensitivity analysis in the kinetics of enzymatic biodiesel production. For this study, a kinetic model by Fedosov and co-workers is used. For the uncertainty analysis the Monte Carlo procedure was used to statistically quantify...

  12. Uncertainty Analysis of RELAP5-3D

    Energy Technology Data Exchange (ETDEWEB)

    Alexandra E Gertman; Dr. George L Mesina

    2012-07-01

    As world-wide energy consumption continues to increase, so does the demand for the use of alternative energy sources, such as Nuclear Energy. Nuclear Power Plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for Nuclear Power Plant operation and licensing has been safety. The safety of the operation of Nuclear Power Plants is no simple matter: it involves the training of operators, design of the reactor, as well as equipment and design upgrades throughout the lifetime of the reactor, etc. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon the use of best-estimate simulation codes, which allow for an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the Nuclear Power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high, medium, and low ranked phenomena from an INL PIRT on a small break Loss-Of-Coolant Accident as well as an analysis of a large break Loss-Of-Coolant Accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL’s massively parallel cluster system. Data from the studies was collected and analyzed with SAS. A summary of the results of our studies is presented.
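
    The deck-generation step described above (one input deck per combination of key parameters, filled into a template) can be sketched in a few lines of Python. The placeholder card numbers, parameter names and values below are hypothetical, not actual RELAP5-3D input syntax or the INL study values.

```python
import itertools
from pathlib import Path
from string import Template

# Hypothetical template deck with placeholders for two key input parameters.
template = Template(
    "* illustrative deck fragment, placeholders only\n"
    "20500101  break_area  $break_area\n"
    "20500201  chf_mult    $chf_mult\n"
)

break_areas = [0.8, 1.0, 1.2]    # assumed multipliers on the nominal break area
chf_mults = [0.9, 1.0, 1.1]      # assumed multipliers on the CHF correlation

outdir = Path("decks")
outdir.mkdir(exist_ok=True)
for i, (a, c) in enumerate(itertools.product(break_areas, chf_mults)):
    deck = template.substitute(break_area=a, chf_mult=c)
    (outdir / f"case_{i:03d}.i").write_text(deck)
    # Each generated deck would then be submitted to the code run, e.g. via a
    # cluster scheduler, and the outputs collected for statistical analysis.
```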

  13. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the ori...... to analyze the uncertainty of model predictions. This allows judging the fitness of the model to the purpose under uncertainty. Hence we recommend uncertainty analysis as a proactive solution when faced with model uncertainty, which is the case for biofuel process development research....

  14. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others]

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  15. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    Full Text Available The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
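
    The entropy-based measures named in this record can be estimated directly from an ensemble of model realisations. The following minimal numpy sketch does this for two map cells with categorical outcomes; the correlation structure and category counts are invented for demonstration and are not taken from the geological example in the paper.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical categorical outcomes (e.g., geological unit index) at two map
# cells, across many simulated model realisations.
rng = np.random.default_rng(2)
cell_a = rng.integers(0, 3, size=1000)
cell_b = (cell_a + (rng.random(1000) < 0.2).astype(int)) % 3   # correlated with cell_a

joint, _, _ = np.histogram2d(cell_a, cell_b, bins=[3, 3])
p_ab = joint / joint.sum()
p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)

h_a, h_b, h_ab = entropy(p_a), entropy(p_b), entropy(p_ab.ravel())
mutual_info = h_a + h_b - h_ab       # I(A;B): uncertainty in A reducible by observing B
h_a_given_b = h_ab - h_b             # H(A|B): remaining uncertainty in A after observing B
print(f"H(A)={h_a:.3f} bits, I(A;B)={mutual_info:.3f}, H(A|B)={h_a_given_b:.3f}")
```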

  16. Alignment measurements uncertainties for large assemblies using probabilistic analysis techniques

    CERN Document Server

    AUTHOR|(CDS)2090816; Almond, Heather

    Big science and ambitious industrial projects continually push forward with technical requirements beyond the grasp of conventional engineering techniques. Examples of those are ultra-high precision requirements in the fields of celestial telescopes, particle accelerators and the aerospace industry. Such extreme requirements are limited largely by the capability of the metrology used, namely, its uncertainty in relation to the alignment tolerance required. The current work was initiated as part of a Marie Curie European research project held at CERN, Geneva, aiming to answer those challenges as related to future accelerators requiring alignment of 2 m large assemblies to tolerances in the 10 µm range. The thesis has found several gaps in current knowledge limiting such capability. Among those was the lack of application of state of the art uncertainty propagation methods in alignment measurements metrology. Another major limiting factor found was the lack of uncertainty statements in the thermal errors compensatio...

  17. NASTRAN variance analysis and plotting of HBDY elements. [analysis of uncertainties of the computer results as a function of uncertainties in the input data

    Science.gov (United States)

    Harder, R. L.

    1974-01-01

    The NASTRAN Thermal Analyzer has been extended to perform variance analysis and to plot the thermal boundary (HBDY) elements. The objective of the variance analysis addition is to assess the sensitivity of temperature variances resulting from uncertainties inherent in input parameters for heat conduction analysis. The plotting capability provides the ability to check the geometry (location, size and orientation) of the boundary elements of a model in relation to the conduction elements. Variance analysis is the study of uncertainties of the computed results as a function of uncertainties of the input data. To study this problem using NASTRAN, a solution is made for the expected values of all inputs, plus another solution for each uncertain variable. A variance analysis module subtracts the results to form derivatives, and then can determine the expected deviations of output quantities.
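
    The scheme described above (one nominal run plus one perturbed run per uncertain input, differenced to form derivatives, then combined into an output variance) is ordinary first-order variance propagation. A minimal sketch of that arithmetic follows; the stand-in thermal model and the input means and standard deviations are invented for illustration, not NASTRAN quantities.

```python
import numpy as np

def model(x):
    """Stand-in for a thermal solution: peak temperature from two uncertain inputs."""
    conductivity, heat_load = x
    return 300.0 + heat_load / conductivity      # illustrative response only

x_mean = np.array([10.0, 2000.0])                # expected values of the inputs
x_sd = np.array([0.5, 100.0])                    # assumed input standard deviations

# One nominal run plus one perturbed run per uncertain variable.
y0 = model(x_mean)
derivs = np.empty_like(x_mean)
for i, dx in enumerate(x_sd):
    xp = x_mean.copy()
    xp[i] += dx
    derivs[i] = (model(xp) - y0) / dx            # finite-difference derivative

var_y = np.sum((derivs * x_sd) ** 2)             # first-order variance propagation
print(f"T = {y0:.1f} K, expected deviation = {np.sqrt(var_y):.1f} K")
```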

  18. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data

    OpenAIRE

    Etkind, Simon; Bristowe, Katherine; Bailey, Katherine; Selman, Lucy E; Murtagh, Fliss

    2017-01-01

    Background: Uncertainty is common in advanced illness, but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. Aim: We aimed to understand patient experiences of uncertainty in advanced illness, and develop a typology of patients’ responses and preferences to inform practice. Design: Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling...

  19. Sensitivity Analysis and Uncertainty Quantification in Pulmonary Drug Delivery of Orally Inhaled Pharmaceuticals.

    Science.gov (United States)

    Lu, Jun; Xi, Jinxiang; Langenderfer, Joseph E

    2017-11-01

    In spite of widespread use of modeling tools in inhalation dosimetry, it remains difficult to quantify the output uncertainties when subjected to various sources of input variability. This study aimed to develop a computational model that can quantify the input sensitivity and output uncertainty in pulmonary drug delivery by coupling probabilistic analysis package NESSUS with ANSYS Fluent. An image-based mouth-lung model was used to simulate the transport and deposition of drug particles and variability in particle size, density, and inhalation speed were considered. Results show that input variables have different importance levels on the delivered doses to lungs. For a given level of variability, the delivered dose is more sensitive to the variance of particle diameter than that of the inhalation speed and particle density. The range of input scatters has a profound impact on the outcome probability of delivered efficiencies, while the input distribution type (normal vs. log-normal) appears to have an insignificant effect. Despite normal distributions for all input variables, the output exhibits a non-normal distribution. The proposed model in this study allows easy specification of input distributions to conduct multivariable probabilistic analysis of inhalation drug deliveries, which can facilitate more reliable treatment planning and outcome assessment. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  20. Using copulas for modeling stochastic dependence in power system uncertainty analysis

    NARCIS (Netherlands)

    Papaefthymiou, G.; Kurowicka, D.

    2009-01-01

    The increasing penetration of renewable generation in power systems necessitates the modeling of this stochastic system infeed in operation and planning studies. The system analysis leads to multivariate uncertainty analysis problems, involving non-Normal correlated random variables. In this
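
    The record above (truncated) concerns modeling dependence between non-normal random infeeds with copulas. A common construction is the Gaussian copula: draw correlated standard normals, map them to uniforms, then transform to the desired marginals. The sketch below, with assumed Weibull and lognormal marginals and an assumed correlation, illustrates that construction; it is not the method or data of the cited paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
corr = 0.7                                        # assumed dependence between two variables

# Gaussian copula: correlated standard normals -> uniforms -> target marginals,
# e.g. Weibull-distributed wind infeed and lognormal load.
cov = np.array([[1.0, corr], [corr, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
u = stats.norm.cdf(z)                             # uniform marginals, dependence preserved
wind = stats.weibull_min(c=2.0, scale=8.0).ppf(u[:, 0])
load = stats.lognorm(s=0.25, scale=100.0).ppf(u[:, 1])

print(f"Spearman rho of the generated sample = {stats.spearmanr(wind, load)[0]:.2f}")
```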

  1. 3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA

    Energy Technology Data Exchange (ETDEWEB)

    Flach, Greg [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Wohlwend, Jen [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-02

    This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.

  2. Parameter uncertainty analysis for simulating streamflow in a river catchment of Vietnam

    Directory of Open Access Journals (Sweden)

    Dao Nguyen Khoi

    2015-07-01

    Full Text Available Hydrological models play vital roles in management of water resources. However, the calibration of the hydrological models is a large challenge because of the uncertainty involved in the large number of parameters. In this study, four uncertainty analysis methods, including Generalized Likelihood Uncertainty Estimation (GLUE), Parameter Solution (ParaSol), Particle Swarm Optimization (PSO), and Sequential Uncertainty Fitting (SUFI-2), were employed to perform parameter uncertainty analysis of streamflow simulation in the Srepok River Catchment by using the Soil and Water Assessment Tool (SWAT) model. The four methods were compared in terms of the model prediction uncertainty, the model performance, and the computational efficiency. The results showed that the SUFI-2 method has the advantages in the model calibration and uncertainty analysis. This technique could be run with the smallest number of simulation runs to achieve good prediction uncertainty bands and model performance.

  3. Uncertainty in river discharge observations: a quantitative analysis

    Directory of Open Access Journals (Sweden)

    G. Di Baldassarre

    2009-06-01

    Full Text Available This study proposes a framework for analysing and quantifying the uncertainty of river flow data. Such uncertainty is often considered to be negligible with respect to other approximations affecting hydrological studies. Actually, given that river discharge data are usually obtained by means of the so-called rating curve method, a number of different sources of error affect the derived observations. These include: errors in measurements of river stage and discharge utilised to parameterise the rating curve, interpolation and extrapolation error of the rating curve, presence of unsteady flow conditions, and seasonal variations of the state of the vegetation (i.e. roughness). This study aims at analysing these sources of uncertainty using an original methodology. The novelty of the proposed framework lies in the estimation of rating curve uncertainty, which is based on hydraulic simulations. The latter are carried out on a reach of the Po River (Italy) by means of a one-dimensional (1-D) hydraulic model code (HEC-RAS). The results of the study show that errors in river flow data are indeed far from negligible.

  4. Isokinetic TWC Evaporator Probe: Calculations and Systemic Uncertainty Analysis

    Science.gov (United States)

    Davison, Craig R.; Strapp, John W.; Lilie, Lyle E.; Ratvasky, Thomas P.; Dumont, Christopher

    2016-01-01

    A new Isokinetic Total Water Content Evaporator (IKP2) was downsized from a prototype instrument, specifically to make airborne measurements of hydrometeor total water content (TWC) in deep tropical convective clouds to assess the new ice crystal Appendix D icing envelope. The probe underwent numerous laboratory and wind tunnel investigations to ensure reliable operation under the difficult high altitude/speed/TWC conditions under which other TWC instruments have been known to either fail, or have unknown performance characteristics and the results are presented in a companion paper (Ref. 1). This paper presents the equations used to determine the total water content (TWC) of the sampled atmosphere from the values measured by the IKP2 or necessary ancillary data from other instruments. The uncertainty in the final TWC is determined by propagating the uncertainty in the measured values through the calculations to the final result. Two techniques were used and the results compared. The first is a typical analytical method of propagating uncertainty and the second performs a Monte Carlo simulation. The results are very similar with differences that are insignificant for practical purposes. The uncertainty is between 2 and 3 percent at most practical operating conditions. The capture efficiency of the IKP2 was also examined based on a computational fluid dynamic simulation of the original IKP and scaled down to the IKP2. Particles above 24 micrometers were found to have a capture efficiency greater than 99 percent at all operating conditions.
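
    The record above propagates measurement uncertainties to the final TWC both analytically and by Monte Carlo simulation and finds the two in close agreement. The sketch below repeats that comparison for a simplified product/quotient relation with invented nominal values and uncertainties; it is not the actual IKP2 equation set or uncertainty budget.

```python
import numpy as np

# Illustrative isokinetic-style relation: TWC = m_w / (v * A * t), with collected
# water mass m_w, airspeed v, inlet area A and sampling time t (all SI, assumed).
vals = {"m_w": 0.50, "v": 150.0, "A": 5.0e-4, "t": 30.0}       # assumed nominal values
rel_u = {"m_w": 0.02, "v": 0.01, "A": 0.005, "t": 0.001}       # assumed relative std. uncertainties

def twc(m_w, v, A, t):
    return m_w / (v * A * t)

# Analytical (first-order) propagation: relative variances add for a product/quotient.
u_analytical = np.sqrt(sum(r ** 2 for r in rel_u.values()))

# Monte Carlo propagation with independent normally distributed inputs.
rng = np.random.default_rng(4)
n = 100_000
samples = {k: rng.normal(vals[k], rel_u[k] * vals[k], n) for k in vals}
twc_mc = twc(**samples)
u_mc = twc_mc.std(ddof=1) / twc_mc.mean()

print(f"relative uncertainty: analytical {u_analytical:.4f}, Monte Carlo {u_mc:.4f}")
```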

  5. Natural hazard modeling and uncertainty analysis [Chapter 2

    Science.gov (United States)

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  6. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type; Analisis de incertidumbre para resultados de codigos termohidraulicos de mejor estimacion

    Energy Technology Data Exchange (ETDEWEB)

    Alva N, J.

    2010-07-01

    In this thesis, some fundamental knowledge is presented about uncertainty analysis and about diverse methodologies applied in the study of nuclear power plant transient event analysis, particularly related to thermal hydraulics phenomena. These concepts and methodologies mentioned in this work come from wide bibliographical research in the nuclear power subject. Methodologies for uncertainty analysis have been developed by quite diverse institutions, and they have been widely used worldwide for application to results from best-estimate-type computer codes in nuclear reactor thermal hydraulics and safety analysis. Also, the main uncertainty sources, types of uncertainties, and aspects related to best estimate modeling and methods are introduced. Once the main bases of uncertainty analysis have been set, and some of the known methodologies have been introduced, the CSAU methodology, which will be applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the Response Surface Technique with those obtained by applying Wilks' formula, through a loss-of-coolant experiment and a power-rise event in a BWR. Both techniques are options in the uncertainty and sensitivity analysis part of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants, and it is the basis of most of the methodologies used in licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)
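
    Wilks' formula, mentioned in this record, gives the number of code runs needed so that order statistics of the output bound a population quantile at a stated confidence. For the first-order, one-sided case the condition is 1 - gamma^n >= beta (coverage gamma, confidence beta), which yields the familiar 59 runs for a 95 %/95 % statement. A small sketch of that arithmetic:

```python
import math

def wilks_first_order_one_sided(coverage=0.95, confidence=0.95):
    """Smallest n such that the sample maximum bounds the `coverage` quantile
    with the requested confidence: 1 - coverage**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

# Classic 95 %/95 % result used to size best-estimate-plus-uncertainty code runs.
print(wilks_first_order_one_sided())              # -> 59
print(wilks_first_order_one_sided(0.95, 0.99))    # higher confidence requires more runs (90)
```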

  7. Linguistic uncertainty in qualitative risk analysis and how to minimize it.

    Science.gov (United States)

    Carey, Janet M; Burgman, Mark A

    2008-04-01

    Most risk assessments assume uncertainty may be decomposed into variability and incertitude. Language is often overlooked as a source of uncertainty, but linguistic uncertainty may be pervasive in workshops, committees, and other face-to-face language-based settings where it can result in misunderstanding and arbitrary disagreement. Here we present examples of linguistic uncertainty drawn from qualitative risk analysis undertaken in stakeholder workshops and describe how the uncertainties were treated. We used a process of iterative re-assessment of likelihoods and consequences, interspersed with facilitated discussion, to assist in the reduction of language-based uncertainty. The effects of this process were evident as changes in the level of agreement among groups of assessors in the ranking of hazards.

  8. A Practical ANOVA Approach for Uncertainty Analysis in Population-Based Disease Microsimulation Models.

    Science.gov (United States)

    Sharif, Behnam; Wong, Hubert; Anis, Aslam H; Kopec, Jacek A

    2017-04-01

    To provide a practical approach for calculating uncertainty intervals and variance components associated with initial-condition and dynamic-equation parameters in computationally expensive population-based disease microsimulation models. In the proposed uncertainty analysis approach, we calculated the required computational time and the number of runs given a user-defined error bound on the variance of the grand mean. The equations for optimal sample sizes were derived by minimizing the variance of the grand mean using initial estimates for variance components. Finally, analysis of variance estimators were used to calculate unbiased variance estimates. To illustrate the proposed approach, we performed uncertainty analysis to estimate the uncertainty associated with total direct cost of osteoarthritis in Canada from 2010 to 2031 according to a previously published population health microsimulation model of osteoarthritis. We first calculated crude estimates for initial-population sampling and dynamic-equation parameters uncertainty by performing a small number of runs. We then calculated the optimal sample sizes and finally derived 95% uncertainty intervals of the total cost and unbiased estimates for variance components. According to our results, the contribution of dynamic-equation parameter uncertainty to the overall variance was higher than that of initial parameter sampling uncertainty throughout the study period. The proposed analysis of variance approach provides the uncertainty intervals for the mean outcome in addition to unbiased estimates for each source of uncertainty. The contributions of each source of uncertainty can then be compared with each other for validation purposes so as to improve the model accuracy. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  9. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity

    Science.gov (United States)

    Harbin Li; Steven G. McNulty

    2007-01-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL...

  10. Uncertainty analysis in rainfall-runoff modelling : Application of machine learning techniques

    NARCIS (Netherlands)

    Shrestha, D.l.

    2009-01-01

    This thesis presents powerful machine learning (ML) techniques to build predictive models of uncertainty with application to hydrological models. Two different methods are developed and tested. First one focuses on parameter uncertainty analysis by emulating the results of Monte Carlo simulations of

  11. Uncertainty Analysis in Rainfall-Runoff Modelling: Application of Machine Learning Techniques

    NARCIS (Netherlands)

    Shrestha, D.L.

    2009-01-01

    This thesis presents powerful machine learning (ML) techniques to build predictive models of uncertainty with application to hydrological models. Two different methods are developed and tested. First one focuses on parameter uncertainty analysis by emulating the results of Monte Carlo simulations of

  12. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses, an open-source tool that is non-intrusive, easy-to-use, computationally efficient, and scalable to highly-parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
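
    The FOSM idea underlying the analyses named in this record is to propagate a parameter covariance through model sensitivities: prior forecast variance J_f C J_f^T, reduced by notionally assimilating observations. The generic numpy sketch below illustrates that algebra with invented Jacobians and covariances; it does not use or reproduce the pyEMU API.

```python
import numpy as np

# Hypothetical linearized model: Jacobians of 3 observations and of one forecast
# with respect to 3 parameters, plus assumed prior and noise covariances.
J_obs = np.array([[1.0, 0.5, 0.0],
                  [0.2, 1.0, 0.3],
                  [0.0, 0.4, 1.0]])
J_fore = np.array([[0.8, 0.1, 0.5]])          # sensitivity of the forecast
C_prior = np.diag([1.0, 2.0, 0.5])            # prior parameter covariance
C_noise = 0.1 * np.eye(3)                     # observation noise covariance

# Prior (pre-calibration) forecast variance.
var_prior = (J_fore @ C_prior @ J_fore.T).item()

# Posterior parameter covariance after notionally assimilating the observations
# (Schur complement / Bayes-linear update), then the conditional forecast variance.
C_post = C_prior - C_prior @ J_obs.T @ np.linalg.inv(
    J_obs @ C_prior @ J_obs.T + C_noise) @ J_obs @ C_prior
var_post = (J_fore @ C_post @ J_fore.T).item()

print(f"forecast variance: prior {var_prior:.3f}, posterior {var_post:.3f}")
```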

  13. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
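
    Two of the steps surveyed in this record, generation of samples from the inputs and a simple sensitivity measure on the outputs, can be illustrated with Latin hypercube sampling followed by rank (Spearman) correlations. The sketch below uses a hand-rolled LHS and a stand-in analysis model with assumed input distributions; it is illustrative only.

```python
import numpy as np
from scipy import stats

def latin_hypercube(n, d, rng):
    """One random point per stratum in each dimension, strata shuffled independently."""
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (strata + rng.random((n, d))) / n

rng = np.random.default_rng(5)
n = 200
u = latin_hypercube(n, 3, rng)

# Map the uniform design to assumed input distributions.
x1 = stats.norm(10.0, 1.0).ppf(u[:, 0])
x2 = stats.uniform(0.0, 5.0).ppf(u[:, 1])
x3 = stats.lognorm(s=0.5, scale=1.0).ppf(u[:, 2])

y = x1 ** 2 + 3.0 * x2 + 0.1 * x3             # stand-in analysis model

# Sensitivity via rank (Spearman) correlation between each input and the output.
for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    print(f"{name}: Spearman rho = {stats.spearmanr(x, y)[0]:+.2f}")
```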

  14. Monte-Carlo based Uncertainty Analysis For CO2 Laser Microchanneling Model

    Science.gov (United States)

    Prakash, Shashi; Kumar, Nitish; Kumar, Subrata

    2016-09-01

    CO2 laser microchanneling has emerged as a potential technique for the fabrication of microfluidic devices on PMMA (poly-methyl-methacrylate). PMMA directly vaporizes when subjected to a high-intensity focused CO2 laser beam. This process results in a clean cut and an acceptable surface finish on microchannel walls. Overall, the CO2 laser microchanneling process is cost-effective and easy to implement. While fabricating microchannels on PMMA using a CO2 laser, the maximum depth of the fabricated microchannel is the key feature. There are few analytical models available to predict the maximum depth of the microchannels and the cut channel profile on a PMMA substrate using a CO2 laser. These models depend upon the values of thermophysical properties of PMMA and laser beam parameters. There are a number of variants of transparent PMMA available in the market with different values of thermophysical properties. Therefore, for applying such analytical models, the values of these thermophysical properties are required to be known exactly. Although the values of laser beam parameters are readily available, extensive experiments are required to be conducted to determine the values of the thermophysical properties of PMMA. The unavailability of exact values of these property parameters restricts proper control over the microchannel dimension for given power and scanning speed of the laser beam. In order to have dimensional control over the maximum depth of fabricated microchannels, it is necessary to have an idea of the uncertainty associated with the predicted microchannel depth. In this research work, the uncertainty associated with the maximum depth dimension has been determined using the Monte Carlo method (MCM). The propagation of uncertainty with different power and scanning speed has been predicted. The relative impact of each thermophysical property has been determined using sensitivity analysis.

  15. Meta-Analysis: Caregiver and Youth Uncertainty in Pediatric Chronic Illness.

    Science.gov (United States)

    Szulczewski, Lauren; Mullins, Larry L; Bidwell, Sarah L; Eddington, Angelica R; Pai, Ahna L H

    2017-05-01

    To conduct a systematic review on the construct of illness uncertainty in caregivers and youth as related to the following: demographic and illness variables, psychological functioning, illness-related distress, and reaction/coping style. A meta-analysis was conducted with articles assessing the associations between illness uncertainty and variables of interest that were published between November 1983 and June 2016 (n = 58). Psychological functioning and illness-related distress had primarily medium effect sizes. Demographic and illness variables had small effect sizes. More positive and fewer negative reaction/coping styles were associated with less illness uncertainty, with primarily small effects. Illness uncertainty may be an important factor that influences psychological functioning and distress and coping in the context of pediatric chronic illness. However, additional research is needed to determine more precise mean effect sizes, as well as the potential efficacy of intervention to address uncertainty. Keywords: adolescents, children, chronic illness, coping skills and adjustment, meta-analysis, parents, psychosocial functioning.

  16. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Emery, Keith [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.

  17. A GLUE uncertainty analysis of a drying model of pharmaceutical granules.

    Science.gov (United States)

    Mortier, Séverine Thérèse F C; Van Hoey, Stijn; Cierkens, Katrijn; Gernaey, Krist V; Seuntjens, Piet; De Baets, Bernard; De Beer, Thomas; Nopens, Ingmar

    2013-11-01

    A shift from batch processing towards continuous processing is of interest in the pharmaceutical industry. However, this transition requires detailed knowledge and process understanding of all consecutive unit operations in a continuous manufacturing line to design adequate control strategies. This can be facilitated by developing mechanistic models of the multi-phase systems in the process. Since modelling efforts only started recently in this field, uncertainties about the model predictions are generally neglected. However, model predictions have an inherent uncertainty (i.e. prediction uncertainty) originating from uncertainty in input data, model parameters, model structure, boundary conditions and software. In this paper, the model prediction uncertainty is evaluated for a model describing the continuous drying of single pharmaceutical wet granules in a six-segmented fluidized bed drying unit, which is part of the full continuous from-powder-to-tablet manufacturing line (Consigma™, GEA Pharma Systems). A validated model describing the drying behaviour of a single pharmaceutical granule in two consecutive phases is used. First of all, the effect of the assumptions at the particle level on the prediction uncertainty is assessed. Secondly, the paper focuses on the influence of the most sensitive parameters in the model. Finally, a combined analysis (particle level plus most sensitive parameters) is performed and discussed. To propagate the uncertainty originating from the parameter uncertainty to the model output, the Generalized Likelihood Uncertainty Estimation (GLUE) method is used. This method enables a modeller to incorporate the information obtained from the experimental data in the assessment of the uncertain model predictions and to find a balance between model performance and data precision. A detailed evaluation of the obtained uncertainty analysis results is made with respect to the model structure, interactions between parameters and uncertainty
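
    A stripped-down version of the GLUE procedure referred to in this record (sample parameter sets from wide priors, score each against the data with a likelihood measure, retain "behavioural" sets above a threshold, and derive prediction bounds from their simulations) is sketched below. The single-granule drying curve, priors, noise level and behavioural threshold are all invented; the published model and data are not reproduced here. In practice the bounds are usually likelihood-weighted quantiles rather than the plain percentiles shown.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic "observations" from an exponential single-granule drying curve.
t = np.linspace(0.0, 600.0, 50)                       # time, s
k_true, w0_true = 0.008, 0.12                         # assumed rate constant and initial moisture
obs = w0_true * np.exp(-k_true * t) + rng.normal(0.0, 0.002, t.size)

def model(k, w0):
    return w0 * np.exp(-k * t)

# GLUE: Monte Carlo sampling from wide priors, Nash-Sutcliffe efficiency as the
# informal likelihood, behavioural threshold, then prediction bounds.
n = 10_000
k = rng.uniform(0.001, 0.02, n)
w0 = rng.uniform(0.08, 0.16, n)
sims = model(k[:, None], w0[:, None])                 # shape (n, len(t))
nse = 1.0 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()

behavioural = sims[nse > 0.9]                         # assumed behavioural threshold
lower, upper = np.percentile(behavioural, [5, 95], axis=0)
print(f"{behavioural.shape[0]} behavioural sets; band width at t=0: {upper[0] - lower[0]:.4f}")
```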

  18. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for a universal applicability and being able to deal with case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes is also accommodated for. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in

  19. Flutter analysis of an airfoil with multiple nonlinearities and uncertainties

    Directory of Open Access Journals (Sweden)

    Haitao Liao

    2013-09-01

    Full Text Available An original method for calculating the limit cycle oscillations of nonlinear aero-elastic system is presented. The problem of determining the maximum vibration amplitude of limit cycle is transformed into a nonlinear optimization problem. The harmonic balance method and the Floquet theory are selected to construct the general nonlinear equality and inequality constraints. The resulting constrained maximization problem is then solved by using the MultiStart algorithm. Finally, the proposed approach is validated and used to analyse the limit cycle oscillations of an airfoil with multiple nonlinearities and uncertainties. Numerical examples show that the coexistence of multiple nonlinearities may lead to low amplitude limit cycle oscillation.

  20. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  1. Uncertainty Analysis of Resistance Tests in Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University

    Directory of Open Access Journals (Sweden)

    Cihad DELEN

    2015-12-01

    Full Text Available In this study, some systematic resistance tests, which were performed in the Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University (ITU), have been included in order to determine the uncertainties. Experiments, measurements and calculations conducted within the framework of mathematical and physical rules for the solution of engineering problems all involve uncertainty. To question the reliability of the obtained values, the existing uncertainties should be expressed as quantities. If the uncertainty of a measurement system is not known, its results do not carry a universal value. On the other hand, resistance is one of the most important parameters that should be considered in the process of ship design. Ship resistance during the design phase of a ship cannot be determined precisely and reliably due to the uncertainty sources in determining the resistance value that are taken into account. This may adversely affect meeting the required specifications in the later design steps. The uncertainty arising from the resistance test has been estimated and compared for a displacement type ship and high speed marine vehicles according to the ITTC 2002 and ITTC 2014 regulations which are related to the uncertainty analysis methods. Also, the advantages and disadvantages of both ITTC uncertainty analysis methods have been discussed.

  2. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  3. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
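
    The first phase described in this record, computing criteria weights with AHP, is the standard principal-eigenvector calculation on a pairwise comparison matrix, after which the weights can be perturbed (e.g., by Monte Carlo) for sensitivity analysis. The sketch below shows the weight and consistency-ratio arithmetic for a hypothetical three-criterion comparison matrix; it is not the criteria set or judgments used in the cited study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three landslide criteria
# (e.g., slope, lithology, land cover) on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

# AHP weights = normalized principal right eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, i].real)
w /= w.sum()

# Consistency ratio CR = CI / RI, with CI = (lambda_max - n) / (n - 1).
n = A.shape[0]
lambda_max = eigvals.real[i]
ci = (lambda_max - n) / (n - 1)
ri = 0.58                                   # Saaty's random index for n = 3
cr = ci / ri
print(f"weights = {np.round(w, 3)}, CR = {cr:.3f} (commonly accepted if < 0.1)")
```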

  4. Uncertainty analysis and design optimization of hybrid rocket motor powered vehicle for suborbital flight

    Directory of Open Access Journals (Sweden)

    Zhu Hao

    2015-06-01

    Full Text Available In this paper, we propose an uncertainty analysis and design optimization method and its application to a hybrid rocket motor (HRM) powered vehicle. The multidisciplinary design model of the rocket system is established and the design uncertainties are quantified. The sensitivity analysis of the uncertainties shows that the uncertainty generated from the error of the fuel regression rate model has the most significant effect on the system performances. Then the differences between deterministic design optimization (DDO) and uncertainty-based design optimization (UDO) are discussed. Two newly formed uncertainty analysis methods, including the Kriging-based Monte Carlo simulation (KMCS) and Kriging-based Taylor series approximation (KTSA), are carried out using a global approximation Kriging modeling method. Based on the system design model and the results of design uncertainty analysis, the design optimization of an HRM powered vehicle for suborbital flight is implemented using three design optimization methods: DDO, KMCS and KTSA. The comparisons indicate that the two UDO methods can enhance the design reliability and robustness. The research and methods proposed in this paper can provide a better way for the general design of HRM powered vehicles.

  5. Analysis and Reduction of Complex Networks Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Roger G [University of Southern California

    2014-07-31

    This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations, 2) methodology and algorithms to characterize probability measures on graph structures with random flows. This is an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance !). 3) methodology and algorithms for treating inequalities in uncertain systems. This is an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.

  6. Uncertainty Analysis of Radar and Gauge Rainfall Estimates in the Russian River Basin

    Science.gov (United States)

    Cifelli, R.; Chen, H.; Willie, D.; Reynolds, D.; Campbell, C.; Sukovich, E.

    2013-12-01

    Radar Quantitative Precipitation Estimation (QPE) has been a very important application of weather radar since it was introduced and made widely available after World War II. Although great progress has been made over the last two decades, it is still a challenging process especially in regions of complex terrain such as the western U.S. It is also extremely difficult to make direct use of radar precipitation data in quantitative hydrologic forecasting models. To improve the understanding of rainfall estimation and distributions in the NOAA Hydrometeorology Testbed in northern California (HMT-West), extensive evaluation of radar and gauge QPE products has been performed using a set of independent rain gauge data. This study focuses on the rainfall evaluation in the Russian River Basin. The statistical properties of the different gridded QPE products will be compared quantitatively. The main emphasis of this study will be on the analysis of uncertainties of the radar and gauge rainfall products that are subject to various sources of error. The spatial variation analysis of the radar estimates is performed by measuring the statistical distribution of the radar base data such as reflectivity and by the comparison with a rain gauge cluster. The application of mean field bias values to the radar rainfall data will also be described. The uncertainty analysis of the gauge rainfall will be focused on the comparison of traditional kriging and conditional bias penalized kriging (Seo 2012) methods. This comparison is performed with the retrospective Multisensor Precipitation Estimator (MPE) system installed at the NOAA Earth System Research Laboratory. The independent gauge set will again be used as the verification tool for the newly generated rainfall products.

  7. An Industrial Analysis for Integrating Business Subjects.

    Science.gov (United States)

    Kapusinski, Albert T.

    1986-01-01

    Describes the industrial analysis seminar at Caldwell College (New Jersey), which was designed to be a capstone course for undergraduate business majors, allowing them to bring business topics into focus by using all their collected business acumen: accounting, marketing, management, economics, law, etc. (CT)

  8. Effect of Uncertainties in Physical Property Estimates on Process Design - Sensitivity Analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Jones, Mark Nicholas; Sin, Gürkan

    can arise from the experiments itself or from the property models employed. It is important to consider the effect of these uncertainties on the process design in order to assess the quality and reliability of the final design. The main objective of this work is to develop a systematic methodology...... analysis was performed to evaluate the effect of these uncertainties on the process design. The developed methodology was applied to evaluate the effect of uncertainties in the property estimates on design of different unit operations such as extractive distillation, short path evaporator, equilibrium......, the operating conditions, and the choice of the property prediction models, the input uncertainties resulted in significant uncertainties in the final design. The developed methodology was able to: (i) assess the quality of final design; (ii) identify pure component and mixture properties of critical importance...

  9. Application of quantile functions for the analysis and comparison of gas pressure balance uncertainties

    Directory of Open Access Journals (Sweden)

    Ramnath Vishal

    2017-01-01

    Full Text Available Traditionally in the field of pressure metrology uncertainty quantification was performed with the use of the Guide to the Uncertainty in Measurement (GUM); however, with the introduction of the GUM Supplement 1 (GS1), the use of Monte Carlo simulations has become an accepted practice for uncertainty analysis in metrology for mathematical models in which the underlying assumptions of the GUM are not valid. Consequently the use of quantile functions was developed as a means to easily summarize and report on uncertainty numerical results that were based on Monte Carlo simulations. In this paper, we considered the case of a piston–cylinder operated pressure balance where the effective area is modelled in terms of a combination of explicit/implicit and linear/non-linear models, and how quantile functions may be applied to analyse results and compare uncertainties from a mixture of GUM and GS1 methodologies.
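
    The quantile-function summary advocated in this record amounts to reading coverage intervals off the empirical quantiles of the Monte Carlo output. The sketch below does this for a deliberately simplified effective-area relation with invented input distributions; it is not the pressure-balance model or data of the cited paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in GS1-style Monte Carlo output: effective area of a pressure balance (m^2)
# from a simplified linear distortion model with assumed uncertain inputs.
a0 = rng.normal(3.357e-5, 1.5e-9, 200_000)        # assumed zero-pressure area
lam = rng.normal(4.0e-12, 5.0e-13, 200_000)       # assumed distortion coefficient, 1/Pa
p = 5.0e6                                         # line pressure, Pa
area = a0 * (1.0 + lam * p)

# The empirical quantile function summarizes the MC output; the central 95 %
# coverage interval is read off at probabilities 0.025 and 0.975.
probs = np.array([0.025, 0.25, 0.5, 0.75, 0.975])
print(dict(zip(probs, np.quantile(area, probs))))
```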

  10. Bias and Uncertainty in Non-Ideal qNMR Analysis.

    Science.gov (United States)

    Le Gresley, Adam; Fardus, Fahmina; Warren, John

    2015-01-01

    We report a comprehensive analysis of the acquisition-related sources of uncertainty for internally and externally standardized qNMR experiments. The impacts of major instrument- and sample-related sources of biases and uncertainties are quantified where possible, and the validity of correction and calibration techniques are also discussed. The application of uncertainty budgets for qNMR is well established for simple, internally standardized systems, but the model is incomplete and does not allow for the additional biases and sources of uncertainty that arise from spectrum complexity and external standardization. This report considers the additional contributions to the uncertainty budget that need to be considered to ensure SI traceability of measurement across a wider range of analytes and NMR methodologies.

  11. Subjectivity

    Directory of Open Access Journals (Sweden)

    Jesús Vega Encabo

    2015-11-01

    Full Text Available In this paper, I claim that subjectivity is a way of being that is constituted through a set of practices in which the self is subject to the dangers of fictionalizing and plotting her life and self-image. I examine some ways of becoming subject through narratives and through theatrical performance before others. Through these practices, a real and active subjectivity is revealed, capable of self-knowledge and self-transformation. 

  12. Uncertainty analysis of impacts of climate change on snow processes: Case study of interactions of GCM uncertainty and an impact model

    Science.gov (United States)

    Kudo, Ryoji; Yoshida, Takeo; Masumoto, Takao

    2017-05-01

    The impact of climate change on snow water equivalent (SWE) and its uncertainty were investigated in snowy areas of subarctic and temperate climate zones in Japan by using a snow process model and climate projections derived from general circulation models (GCMs). In particular, we examined how the uncertainty due to GCMs propagated through the snow model, which contained nonlinear processes defined by thresholds, as an example of the uncertainty caused by interactions among multiple sources of uncertainty. An assessment based on the climate projections in Coupled Model Intercomparison Project Phase 5 indicated that heavy-snowfall areas in the temperate zone (especially in low-elevation areas) were markedly vulnerable to temperature change, showing a large SWE reduction even under slight changes in winter temperature. The uncertainty analysis demonstrated that the uncertainty associated with snow processes (1) can be accounted for mainly by the interactions between GCM uncertainty (in particular, the differences of projected temperature changes between GCMs) and the nonlinear responses of the snow model and (2) depends on the balance between the magnitude of projected temperature changes and present climates dominated largely by climate zones and elevation. Specifically, when the peaks of the distributions of daily mean temperature projected by GCMs cross the key thresholds set in the model, the GCM uncertainty, even if tiny, can be amplified by the nonlinear propagation through the snow process model. This amplification results in large uncertainty in projections of CC impact on snow processes.
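
    The amplification mechanism described above can be illustrated with a toy degree-day snow model; the model structure, the 0 °C threshold, the synthetic temperature series, and the assumed spread of GCM warming below are all invented for illustration and are not the snow process model or projections used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def peak_swe(temp_offset, t_daily, precip=5.0, t_snow=0.0, ddf=3.0):
    """Toy degree-day model: accumulate snow below t_snow, melt above it (mm w.e.)."""
    swe, peak = 0.0, 0.0
    for t in t_daily + temp_offset:
        if t < t_snow:
            swe += precip                              # snowfall on cold days
        else:
            swe = max(0.0, swe - ddf * (t - t_snow))   # degree-day melt
        peak = max(peak, swe)
    return peak

# Synthetic winter temperature series hovering near the 0 degC threshold.
days = np.arange(180)
t_daily = -2.0 + 4.0 * np.sin(2.0 * np.pi * days / 180.0) + rng.normal(0.0, 2.0, days.size)

# Treat the GCM spread as an uncertain warming offset (illustrative mean and spread).
offsets = rng.normal(loc=2.0, scale=0.5, size=2000)
peaks = np.array([peak_swe(d, t_daily) for d in offsets])

print("relative spread of projected warming:", round(offsets.std() / offsets.mean(), 2))
print("relative spread of peak SWE         :", round(peaks.std() / peaks.mean(), 2))
# When the shifted temperatures straddle the threshold, a modest spread in warming
# can translate into a much larger relative spread in peak SWE.
```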

  13. Stability analysis of machine tool spindle under uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Dou

    2016-05-01

    Full Text Available Chatter is a harmful machining vibration that occurs between the workpiece and the cutting tool, usually resulting in irregular flaw streaks on the finished surface and severe tool wear. Stability lobe diagrams could predict chatter by providing graphical representations of the stable combinations of the axial depth of the cut and spindle speed. In this article, the analytical model of a spindle system is constructed, including a Timoshenko beam rotating shaft model and double sets of angular contact ball bearings with 5 degrees of freedom. Then, the stability lobe diagram of the model is developed according to its dynamic properties. The Monte Carlo method is applied to analyse the bearing preload influence on the system stability with uncertainty taken into account.

  14. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-04-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses to evaluate the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of piping systems in BWRs and PWRs. For each parameter, the sensitivity of the conditional failure probability and, hence, its importance to probabilistic leak-before-break evaluations were determined.

  15. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights: • Existing uncertainty ranking methods prove inconsistent for TH applications. • Introduction of a new method for ranking sources of uncertainty in TH codes. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • The importance of parameters is calculated by a limited number of TH code executions. • Methodology is applied successfully to the LOFT-LB1 test facility. - Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not practical for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that accounts for the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying the variance-based uncertainty importance method. Important parameters are first identified qualitatively by the modified PIRT approach; their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications. It is capable of calculating the importance of parameters by a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.
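
    A minimal sketch of a local, derivative-based importance index of the kind described above, using a cheap analytic stand-in in place of a TH system code; the parameter names, nominal values, and uncertainties are hypothetical. Central differences need only two runs per parameter, which is the practical appeal claimed for such indices.

```python
import numpy as np

def code_run(x):
    """Cheap stand-in for an expensive TH code output, e.g. a peak temperature (K)."""
    gap, cond, power, flow = x
    return 600.0 + 80.0 * power / flow + 15.0 * gap - 40.0 * np.log(cond)

# Hypothetical PIRT-selected parameters: nominal values and standard uncertainties.
names = ["gap size", "fuel conductivity", "power", "mass flow"]
x0 = np.array([1.0, 3.5, 1.0, 1.0])
u = np.array([0.2, 0.5, 0.03, 0.05])

# Central finite differences: two runs per parameter around the nominal point.
grad = np.empty_like(x0)
for i in range(x0.size):
    dx = 0.01 * x0[i]
    xp, xm = x0.copy(), x0.copy()
    xp[i] += dx
    xm[i] -= dx
    grad[i] = (code_run(xp) - code_run(xm)) / (2.0 * dx)

# Local importance index: |dY/dx_i| scaled by the input standard uncertainty.
importance = np.abs(grad) * u
for name, s in sorted(zip(names, importance), key=lambda t: -t[1]):
    print(f"{name:18s} importance ~ {s:5.2f} K")
```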

  16. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    Science.gov (United States)

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach with 10% of coding cross-checked to enhance reliability. Qualitative interviews from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 (range, 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time that patients mainly focused their attention on (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness by affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  17. Status of XSUSA for Sampling Based Nuclear Data Uncertainty and Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Pasichnyk I.

    2013-03-01

    Full Text Available In the present contribution, an overview of the sampling-based XSUSA method for sensitivity and uncertainty analysis with respect to nuclear data is given. The focus is on recent developments and applications of XSUSA. These applications include calculations for critical assemblies, fuel assembly depletion calculations, and steady-state as well as transient reactor core calculations. The analyses are partially performed in the framework of international benchmark working groups (UACSA – Uncertainty Analyses for Criticality Safety Assessment, UAM – Uncertainty Analysis in Modelling). It is demonstrated that particularly for full-scale reactor calculations the influence of the nuclear data uncertainties on the results can be substantial. For instance, for the radial fission rate distributions of mixed UO2/MOX light water reactor cores, the 2σ uncertainties in the core centre and periphery can reach values exceeding 10%. For a fast transient, the resulting time behaviour of the reactor power was covered by a wide uncertainty band. Overall, the results confirm the necessity of adding systematic uncertainty analyses to best-estimate reactor calculations.
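
    The sampling idea behind XSUSA-type analyses can be sketched generically: draw correlated perturbation factors for a few lumped cross sections from a covariance matrix, push them through a stand-in core model, and report the 2σ spread of the response. Neither the covariance values nor the response model below reflect real nuclear data; they are placeholders for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented relative covariance (roughly 1-2 % at 1 sigma) for three lumped cross
# sections: [fission, capture, scattering].
sigma = np.array([0.012, 0.020, 0.015])
corr = np.array([[1.0, 0.3, 0.1],
                 [0.3, 1.0, 0.2],
                 [0.1, 0.2, 1.0]])
cov = np.outer(sigma, sigma) * corr

def core_model(f):
    """Stand-in for a transport/depletion calculation: returns k-eff (illustrative)."""
    fis, cap, sca = f
    return 1.0000 * (1.0 + 0.9 * (fis - 1.0) - 0.5 * (cap - 1.0) + 0.1 * (sca - 1.0))

# Sample multiplicative perturbation factors around 1 and propagate them.
factors = rng.multivariate_normal(mean=np.ones(3), cov=cov, size=1000)
keff = np.apply_along_axis(core_model, 1, factors)

print(f"k-eff mean = {keff.mean():.5f}, 2-sigma = {2 * keff.std(ddof=1):.5f}")
```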

  18. The IAEA Coordinated Research Program on HTGR Uncertainty Analysis: Phase I Status and Initial Results

    Energy Technology Data Exchange (ETDEWEB)

    Strydom, Gerhard; Bostelmann, Friederike; Ivanov, Kostadin

    2014-10-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. One way to address the uncertainties in the HTGR analysis tools is to assess the sensitivity of critical parameters (such as the calculated maximum fuel temperature during loss of coolant accidents) to a few important input uncertainties. The input parameters were identified by engineering judgement in the past but are today typically based on a Phenomena Identification Ranking Table (PIRT) process. The input parameters can also be derived from sensitivity studies and are then varied in the analysis to find a spread in the parameter of importance. However, there is often no easy way to compensate for these uncertainties. In engineering system design, a common approach for addressing performance uncertainties is to add compensating margins to the system, but with passive properties credited it is not so clear how to apply this in the case of the modular HTGR heat removal path. Other more sophisticated uncertainty modelling approaches, including Monte Carlo analysis, have also been proposed and applied. Ideally one wishes to apply a more fundamental approach to determine the predictive capability and accuracies of coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is a broader acceptance of the use of uncertainty analysis even in safety studies, and it has been accepted by regulators in some cases to replace the traditional conservative analysis. Therefore some safety analysis calculations may use a mixture of these approaches for different parameters depending upon the particular requirements of the analysis problem involved. Sensitivity analysis can, for example, be used to provide information as part of an uncertainty analysis to determine best estimate plus uncertainty results to the

  19. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist

    2013-01-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty...

  20. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  1. Fast variable stiffness composite cylinder uncertainty analysis by using reanalysis assisted Copula function

    Directory of Open Access Journals (Sweden)

    Yang Zeng

    2016-01-01

    Full Text Available There are many uncertainties in variable-stiffness composite materials, such as material properties, fibre volume fraction, geometries at various scales, and matrix porosity. These uncertainties are not always mutually independent, and there exist correlations among the random input variables. These correlations may affect the output of the composite significantly. To address them, a novel approach for uncertainty analysis based on the Copula function, assisted by a reanalysis method, is suggested. The Copula function is utilized to capture the correlations of the random input variables, and Monte Carlo simulation (MCS) is employed to perform the uncertainty analysis. A large number of samples must therefore be generated, and the computational cost becomes prohibitive when the full finite element (FE) model is used. To reduce the computational cost and make the uncertainty analysis feasible in practice, an efficient fast computation technique, the reanalysis method, is integrated into the framework. The numerical test demonstrates that the proposed approach is an efficient uncertainty analysis tool for practical engineering problems.
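
    A compact illustration of the Copula-plus-Monte-Carlo idea: correlated standard-normal draws are mapped through a Gaussian copula to arbitrary marginals for two composite inputs and then pushed through a stand-in response function. The marginals, correlation value, and response function are invented and are not the variable-stiffness cylinder model of the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 20_000

# Target marginals for two correlated inputs (values are illustrative):
# fibre volume fraction ~ Beta, matrix porosity ~ Lognormal.
vf_marg = stats.beta(a=20, b=15)                 # fibre volume fraction
por_marg = stats.lognorm(s=0.3, scale=0.02)      # matrix porosity

# Gaussian copula with correlation rho between the two inputs.
rho = -0.6
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((n, 2)) @ L.T
u = stats.norm.cdf(z)                            # uniform scores carrying the dependence

vf = vf_marg.ppf(u[:, 0])
por = por_marg.ppf(u[:, 1])

def response(vf, por):
    """Stand-in response: a critical load that decreases with porosity (illustrative)."""
    return 100.0 * vf * (1.0 - 8.0 * por)

y = response(vf, por)
print("sample corr(vf, porosity) :", np.corrcoef(vf, por)[0, 1].round(3))
print("response mean, std        :", y.mean().round(2), y.std(ddof=1).round(2))
```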

  2. Uncertainty Analysis of the Temperature–Resistance Relationship of Temperature Sensing Fabric

    Directory of Open Access Journals (Sweden)

    Muhammad Dawood Husain

    2016-11-01

    Full Text Available This paper reports the uncertainty analysis of the temperature–resistance (TR) data of the newly developed temperature sensing fabric (TSF), which is a double-layer knitted structure fabricated on an electronic flat-bed knitting machine, made of polyester as a basal yarn and embedded with fine metallic wire as the sensing element. The measurement principle of the TSF is identical to that of a resistance temperature detector (RTD); that is, a change in resistance due to a change in temperature. The regression uncertainty (uncertainty within repeats) and repeatability uncertainty (uncertainty among repeats) were estimated by analysing more than 300 TR experimental repeats of 50 TSF samples. The experiments were performed under dynamic heating and cooling environments on a purpose-built test rig within the temperature range of 20–50 °C. The continuous experimental data were recorded through a LabVIEW-based graphical user interface. The results showed that temperature and resistance values were not only repeatable but reproducible, with only minor variations. The regression uncertainty was found to be less than ±0.3 °C; the TSF samples made of Ni and W wires showed regression uncertainties of <±0.13 °C in comparison with Cu-based TSF samples (>±0.18 °C). The cooling TR data showed considerably reduced uncertainty values (±0.07 °C) in comparison with the heating TR data (±0.24 °C). The repeatability uncertainty was found to be less than ±0.5 °C. By increasing the number of samples and repeats, the uncertainties may be reduced further. The TSF could be used for continuous measurement of the temperature profile on the surface of the human body.
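
    The regression uncertainty quoted above can be read as the scatter of measured temperature about a fitted temperature–resistance line. The snippet below sketches that calculation on synthetic TR data; the resistance parameters and noise level are invented and do not represent the TSF measurements.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic TR repeat: RTD-like linear behaviour R = R0 * (1 + alpha * T) plus noise.
T = np.linspace(20.0, 50.0, 120)                 # reference temperature, degC
R0, alpha = 100.0, 0.0039                        # invented wire parameters
R = R0 * (1.0 + alpha * T) + rng.normal(0.0, 0.01, T.size)

# Calibration: fit temperature as a linear function of resistance.
slope, intercept = np.polyfit(R, T, 1)
T_pred = intercept + slope * R

# Regression uncertainty: residual standard deviation in temperature units.
resid = T - T_pred
u_reg = np.sqrt(np.sum(resid**2) / (T.size - 2))
print(f"regression uncertainty ~ +/-{u_reg:.3f} degC")
```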

  3. The uncertainty analysis of a liquid metal reactor for burning minor actinides from light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Hang Bok [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    The neutronics analysis of a liquid metal reactor for burning minor actinides has shown that uncertainties in the nuclear data of several key minor actinide isotopes can introduce large uncertainties in the predicted performance of the core. A comprehensive sensitivity and uncertainty analysis was performed on a 1200 MWth actinide burner designed for a low burnup reactivity swing, negative Doppler coefficient, and low sodium void worth. Sensitivities were generated using depletion perturbation methods for the equilibrium cycle of the reactor, and covariance data were taken from ENDF/B-V and other published sources. The relative uncertainties in the burnup swing, Doppler coefficient, and void worth were conservatively estimated to be 180%, 97%, and 46%, respectively. 5 refs., 1 fig., 3 tabs. (Author)

  4. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    Science.gov (United States)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
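
    The MPN as a maximum likelihood estimate can be written down directly: a tube receiving volume v of sample is positive with probability 1 - exp(-c*v) when the true concentration is c. The sketch below maximizes that likelihood for a hypothetical three-dilution, five-tube design with invented counts; it is not the full probabilistic model developed in the abstract.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical serial-dilution design: sample volumes (mL) and tubes per dilution.
volumes = np.array([10.0, 1.0, 0.1])
n_tubes = np.array([5, 5, 5])
positives = np.array([5, 3, 1])          # observed positive tubes (invented counts)

def neg_log_lik(log_c):
    c = np.exp(log_c)                    # concentration, organisms per mL
    p = 1.0 - np.exp(-c * volumes)       # P(tube positive) at each dilution
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    ll = positives * np.log(p) + (n_tubes - positives) * np.log(1.0 - p)
    return -ll.sum()

res = minimize_scalar(neg_log_lik, bounds=(np.log(1e-4), np.log(1e4)), method="bounded")
mpn = np.exp(res.x)
print(f"MPN (MLE concentration) ~ {mpn:.3f} organisms per mL")
```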

  5. Sensitivity analysis and uncertainty quantification for environmental models

    Directory of Open Access Journals (Sweden)

    Cartailler Thomas

    2014-01-01

    Full Text Available Environmental models often involve complex dynamic and spatial inputs and outputs. This raises specific issues when performing uncertainty and sensitivity analyses (SA). Based on applications in flood risk assessment and agro-ecology, we present current research to adapt the methods of variance-based SA to such models. After recalling the basic principles, we propose a metamodelling approach of dynamic models based on a reduced-basis approximation of PDEs and we show how the error on the subsequent sensitivity indices can be quantified. We then present a mix of pragmatic and methodological solutions to perform the SA of a dynamic agro-climatic model with non-standard input factors. SA is then applied to a flood risk model with spatially distributed inputs and outputs. Block sensitivity indices are defined and a precise relationship between these indices and their support size is established. Finally, we show how the whole support landscape and its key features can be incorporated in the SA of a spatial model.
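
    For readers unfamiliar with variance-based SA, the first-order Sobol' indices mentioned above can be estimated with the standard pick-and-freeze Monte Carlo scheme. The model function and input ranges below are generic stand-ins, not the flood-risk or agro-climatic models of the paper.

```python
import numpy as np

rng = np.random.default_rng(11)

def model(x):
    """Generic stand-in model with three inputs of different importance."""
    return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 2]

k, N = 3, 50_000
A = rng.uniform(0.0, 1.0, (N, k))
B = rng.uniform(0.0, 1.0, (N, k))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]), ddof=1)

# First-order indices via the pick-and-freeze (Saltelli) estimator:
# S_i ~ mean( fB * (f(A with column i from B) - fA) ) / Var(Y)
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    Si = np.mean(fB * (model(ABi) - fA)) / var
    print(f"S_{i + 1} ~ {Si:.3f}")
```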

  6. A Probabilistic Approach to Uncertainty Analysis in NTPR Radiation Dose Assessments

    Science.gov (United States)

    2009-11-01

    analysis. Software Selection (4.1.1) resulted in choosing Mathcad® as the primary tool for Monte Carlo calculations and was supplemented with the...model variability and uncertainty of model parameters generated in Mathcad. The special studies of model parameters, uncertainties, and distributions...Enewetak Island contained in contractor reports to model PFs and test them for a 48-man barracks building and an 8-man tent. Mathcad software was

  7. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    Science.gov (United States)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular in this paper, Wishart random matrix theory is applied on a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. Examples of the effects of different levels of uncertainties are illustrated by means of examples using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
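
    The Wishart construction referred to above can be sketched in a few lines: a nominal system matrix is replaced by random draws whose mean equals the nominal matrix and whose dispersion is controlled by the degrees-of-freedom parameter. The 3×3 stiffness matrix and dispersion level below are invented for illustration and are not taken from the powertrain model.

```python
import numpy as np
from scipy.stats import wishart

# Nominal symmetric positive-definite system matrix (illustrative stiffness, N/m).
K = 1.0e6 * np.array([[ 2.0, -1.0,  0.0],
                      [-1.0,  2.0, -1.0],
                      [ 0.0, -1.0,  2.0]])

nu = 30                               # degrees of freedom: larger nu -> less dispersion
W = wishart(df=nu, scale=K / nu)      # E[W] = nu * (K / nu) = K

samples = W.rvs(size=500, random_state=5)

# The ensemble mean should be close to the nominal matrix.
rel_dev = np.linalg.norm(samples.mean(axis=0) - K) / np.linalg.norm(K)
print(f"relative deviation of sample mean from K: {rel_dev:.3f}")

# Each draw can replace K in the multi-body/FE model so that the randomness is
# propagated through, e.g., a numerical transfer path analysis.
eig_nominal = np.linalg.eigvalsh(K)
eig_std = np.std([np.linalg.eigvalsh(s) for s in samples], axis=0)
print("nominal eigenvalues :", np.round(eig_nominal, 0))
print("eigenvalue std      :", np.round(eig_std, 0))
```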

  8. Uncertainty Analysis of non-point source pollution control facilities design techniques in Korea

    Science.gov (United States)

    Lee, J.; Okjeong, L.; Gyeong, C. B.; Park, M. W.; Kim, S.

    2015-12-01

    The design of non-point source pollution control facilities in Korea is governed largely by the stormwater capture ratio, the stormwater load capture ratio, and the pollutant reduction efficiency of the facility. The stormwater capture ratio is given by a design formula as a function of the water quality treatment capacity: the greater the capacity, the greater the amount of stormwater intercepted by the facility. The stormwater load capture ratio is defined as the ratio of the load entering the facility to the total pollutant load generated in the target catchment, and is given by a design formula expressed as a function of the stormwater capture ratio. Estimating the stormwater capture ratio and the load capture ratio requires extensive quantitative analysis of the hydrologic processes involved in pollutant emission, yet these formulas have been applied without any verification. Since systematic monitoring programs were insufficient, verification of these formulas was fundamentally impossible. Recently, however, the Korean Ministry of Environment has conducted a long-term systematic monitoring project, so verification of the formulas has become possible. In this presentation, the stormwater capture ratio and load capture ratio are re-estimated using actual TP data obtained from a long-term monitoring program at the Noksan industrial complex located in Busan, Korea. Through this re-estimation, the uncertainty embedded in the design process applied until now is quantified. In addition, the uncertainties in the stormwater capture ratio estimate and in the stormwater load capture ratio estimate are expressed separately to quantify their relative impacts on the overall design process for non-point pollutant control facilities. Finally, the SWMM-Matlab interlocking module for model parameter estimation will be introduced. Acknowledgement This subject is supported by Korea Ministry of Environment as "The Eco Innovation Project : Non

  9. Efficient parametric uncertainty analysis within the hybrid Finite Element/Statistical Energy Analysis method

    Science.gov (United States)

    Cicirello, Alice; Langley, Robin S.

    2014-03-01

    This paper is concerned with the development of efficient algorithms for propagating parametric uncertainty within the context of the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) approach to the analysis of complex vibro-acoustic systems. This approach models the system as a combination of SEA subsystems and FE components; it is assumed that the FE components have fully deterministic properties, while the SEA subsystems have a high degree of randomness. The method has been recently generalised by allowing the FE components to possess parametric uncertainty, leading to two ensembles of uncertainty: a non-parametric one (SEA subsystems) and a parametric one (FE components). The SEA subsystems ensemble is dealt with analytically, while the effect of the additional FE components ensemble can be dealt with by Monte Carlo Simulations. However, this approach can be computationally intensive when applied to complex engineering systems having many uncertain parameters. Two different strategies are proposed: (i) the combination of the hybrid FE/SEA method with the First Order Reliability Method which allows the probability of the non-parametric ensemble average of a response variable exceeding a barrier to be calculated and (ii) the combination of the hybrid FE/SEA method with Laplace's method which allows the evaluation of the probability of a response variable exceeding a limit value. The proposed approaches are illustrated using two built-up plate systems with uncertain properties and the results are validated against direct integration, Monte Carlo simulations of the FE and of the hybrid FE/SEA models.

  10. Estimation and Uncertainty Analysis of Flammability Properties of Chemicals using Group-Contribution Property Models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    Process safety studies and assessments rely on accurate property data. Flammability data like the lower and upper flammability limit (LFL and UFL) play an important role in quantifying the risk of fire and explosion. If experimental values are not available for the safety analysis due to cost...... or time constraints, property prediction models like group contribution (GC) models can estimate flammability data. The estimation needs to be accurate, reliable, and as time-efficient as possible. However, GC property prediction methods frequently lack rigorous uncertainty analysis. Hence...... to the parameter estimation an uncertainty analysis of the estimated data and a comparison to other methods is performed. A thorough uncertainty analysis provides information about the prediction error, which is important for the use of the data in process safety studies and assessments. The method considers...

  11. Uncertainty analysis of a one-dimensional constitutive model for shape memory alloy thermomechanical description

    DEFF Research Database (Denmark)

    Oliveira, Sergio A.; Savi, Marcelo A.; Santos, Ilmar F.

    2014-01-01

    The use of shape memory alloys (SMAs) in engineering applications has increased the interest of the accuracy analysis of their thermomechanical description. This work presents an uncertainty analysis related to experimental tensile tests conducted with shape memory alloy wires. Experimental data...... are compared with numerical simulations obtained from a constitutive model with internal constraints employed to describe the thermomechanical behavior of SMAs. The idea is to evaluate if the numerical simulations are within the uncertainty range of the experimental data. Parametric analysis is also developed...

  12. Uncertainties in Cancer Risk Coefficients for Environmental Exposure to Radionuclides. An Uncertainty Analysis for Risk Coefficients Reported in Federal Guidance Report No. 13

    Energy Technology Data Exchange (ETDEWEB)

    Pawel, David [U.S. Environmental Protection Agency; Leggett, Richard Wayne [ORNL; Eckerman, Keith F [ORNL; Nelson, Christopher [U.S. Environmental Protection Agency

    2007-01-01

    Federal Guidance Report No. 13 (FGR 13) provides risk coefficients for estimation of the risk of cancer due to low-level exposure to each of more than 800 radionuclides. Uncertainties in risk coefficients were quantified in FGR 13 for 33 cases (exposure to each of 11 radionuclides by each of three exposure pathways) on the basis of sensitivity analyses in which various combinations of plausible biokinetic, dosimetric, and radiation risk models were used to generate alternative risk coefficients. The present report updates the uncertainty analysis in FGR 13 for the cases of inhalation and ingestion of radionuclides and expands the analysis to all radionuclides addressed in that report. The analysis indicates that most risk coefficients for inhalation or ingestion of radionuclides are determined within a factor of 5 or less by current information. That is, application of alternate plausible biokinetic and dosimetric models and radiation risk models (based on the linear, no-threshold hypothesis with an adjustment for the dose and dose rate effectiveness factor) is unlikely to change these coefficients by more than a factor of 5. In this analysis the assessed uncertainty in the radiation risk model was found to be the main determinant of the uncertainty category for most risk coefficients, but conclusions concerning the relative contributions of risk and dose models to the total uncertainty in a risk coefficient may depend strongly on the method of assessing uncertainties in the risk model.

  13. Uncertainty analysis of standardized measurements of random-incidence absorption and scattering coefficients.

    Science.gov (United States)

    Müller-Trapet, Markus; Vorländer, Michael

    2015-01-01

    This work presents an analysis of the effect of some uncertainties encountered when measuring absorption or scattering coefficients in the reverberation chamber according to International Organization for Standardization/American Society for Testing and Materials standards. This especially relates to the uncertainty due to spatial fluctuations of the sound field. By analyzing the mathematical definition of the respective coefficient, a relationship between the properties of the chamber and the test specimen and the uncertainty in the measured quantity is determined and analyzed. The validation of the established equations is presented through comparisons with measurement data. This study analytically explains the main sources of error and provides a method to obtain the product of the necessary minimum number of measurement positions and the band center frequency to achieve a given maximum uncertainty in the desired quantity. It is shown that this number depends on the ratio of room volume to sample surface area and the reverberation time of the empty chamber.

  14. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    KAUST Repository

    Wang, Shitao

    2016-05-27

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas to oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel height uncertainties are then mainly due to uncertainties in the 95th percentile of the droplet size and in the entrainment parameters.
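
    A minimal non-intrusive polynomial chaos sketch for a single standard-normal input, using probabilists' Hermite polynomials fitted by least squares; the "plume model" here is a cheap analytic stand-in, not the integral oil-gas plume model of the study.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

rng = np.random.default_rng(2)

def plume_model(xi):
    """Stand-in response, e.g. a trap height as a function of one standardized input."""
    return 300.0 + 25.0 * xi + 6.0 * xi**2 + 1.5 * np.sin(xi)

# Ensemble of model runs at sampled values of the standardized input.
xi = rng.standard_normal(400)
y = plume_model(xi)

# Least-squares fit of the PCE coefficients on He_0..He_P (probabilists' Hermite).
P = 5
Psi = hermevander(xi, P)                 # design matrix of Hermite polynomials
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# Moments follow from the coefficients: E[He_k^2] = k! for a standard-normal input.
mean_pce = coef[0]
var_pce = sum(coef[k] ** 2 * factorial(k) for k in range(1, P + 1))
print(f"PCE mean ~ {mean_pce:.2f}, PCE std ~ {np.sqrt(var_pce):.2f}")
print(f"MC  mean ~ {y.mean():.2f}, MC  std ~ {y.std(ddof=1):.2f}")
```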

  15. An Empirical Analysis of Stakeholders' Influence on Policy Development: the Role of Uncertainty Handling

    Directory of Open Access Journals (Sweden)

    Rianne M. Bijlsma

    2011-03-01

    Full Text Available Stakeholder participation is advocated widely, but there is little structured, empirical research into its influence on policy development. We aim to further the insight into the characteristics of participatory policy development by comparing it to expert-based policy development for the same case. We describe the process of problem framing and analysis, as well as the knowledge base used. We apply an uncertainty perspective to reveal differences between the approaches and speculate about possible explanations. We view policy development as a continuous handling of substantive uncertainty and process uncertainty, and investigate how the methods of handling uncertainty of actors influence the policy development. Our findings suggest that the wider frame that was adopted in the participatory approach was the result of a more active handling of process uncertainty. The stakeholders handled institutional uncertainty by broadening the problem frame, and they handled strategic uncertainty by negotiating commitment and by including all important stakeholder criteria in the frame. In the expert-based approach, we observed a more passive handling of uncertainty, apparently to avoid complexity. The experts handled institutional uncertainty by reducing the scope and by anticipating windows of opportunity in other policy arenas. Strategic uncertainty was handled by assuming stakeholders' acceptance of noncontroversial measures that balanced benefits and sacrifices. Three other observations are of interest to the scientific debate on participatory policy processes. Firstly, the participatory policy was less adaptive than the expert-based policy. The observed low tolerance for process uncertainty of participants made them opt for a rigorous "once and for all" settling of the conflict. Secondly, in the participatory approach, actors preferred procedures of traceable knowledge acquisition over controversial topics to handle substantive uncertainty. This

  16. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    Science.gov (United States)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, computational cost, and large number of uncertain variables. In this study, first a sparse collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing N, N(+), O, and O(+) number densities in the flow field.

  17. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    Energy Technology Data Exchange (ETDEWEB)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.; Ross, Kyle W.; Cardoni, Jeffrey N; Kalinich, Donald A.; Osborn, Douglas.; Sallaberry, Cedric Jean-Marie; Ghosh, S. Tina

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions. (auth)

  18. Integrated Risk-Capability Analysis under Deep Uncertainty : An ESDMA Approach

    NARCIS (Netherlands)

    Pruyt, E.; Kwakkel, J.H.

    2012-01-01

    Integrated risk-capability analysis methodologies for dealing with increasing degrees of complexity and deep uncertainty are urgently needed in an ever more complex and uncertain world. Although scenario approaches, risk assessment methods, and capability analysis methods are used, few organizations

  19. Applying Fuzzy and Probabilistic Uncertainty Concepts to the Material Flow Analysis of Palladium in Austria

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2015-01-01

    Material flow analysis (MFA) is a widely applied tool to investigate resource and recycling systems of metals and minerals. Owing to data limitations and restricted system understanding, MFA results are inherently uncertain. To demonstrate the systematic implementation of uncertainty analysis in ...

  20. Comparative uncertainty analysis of copper loads in stormwater systems using GLUE and grey-box modeling

    DEFF Research Database (Denmark)

    Lindblom, Erik Ulfson; Madsen, Henrik; Mikkelsen, Peter Steen

    2007-01-01

    . With the proposed model and input data, the GLUE analysis show that the total sampled copper mass can be predicted within a range of +/- 50% of the median value ( 385 g), whereas the grey-box analysis showed a prediction uncertainty of less than +/- 30%. Future work will clarify the pros and cons of the two methods...
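
    A bare-bones GLUE sketch in the spirit of the comparison above: sample parameters from broad priors, score each set with an informal likelihood against observations, keep the behavioural sets, and report likelihood-weighted prediction limits. The wash-off model and the observations are invented and are not the copper load model or data of the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

def washoff_model(rain, k, c):
    """Toy wash-off load model (g) for a sequence of rain depths (mm)."""
    return c * (1.0 - np.exp(-k * rain))

rain_obs = np.array([2.0, 8.0, 15.0, 5.0, 20.0])
load_obs = np.array([11.0, 36.0, 52.0, 24.0, 60.0])   # invented observations

# 1) Monte Carlo sampling of the parameter space from broad priors.
n = 20_000
k = rng.uniform(0.01, 0.5, n)
c = rng.uniform(10.0, 120.0, n)
sim = np.array([washoff_model(rain_obs, ki, ci) for ki, ci in zip(k, c)])

# 2) Informal likelihood (Nash-Sutcliffe efficiency) and a behavioural threshold.
sse = np.sum((sim - load_obs) ** 2, axis=1)
nse = 1.0 - sse / np.sum((load_obs - load_obs.mean()) ** 2)
behavioural = nse > 0.7
w = nse[behavioural] / nse[behavioural].sum()          # likelihood weights

# 3) Likelihood-weighted prediction limits for the total load over all events.
total = sim[behavioural].sum(axis=1)
order = np.argsort(total)
cdf = np.cumsum(w[order])
lo, med, hi = np.interp([0.05, 0.5, 0.95], cdf, total[order])
print(f"total load: median ~ {med:.0f} g, 90 % GLUE band ~ [{lo:.0f}, {hi:.0f}] g")
```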

  1. Robust sensor fault detection and isolation of gas turbine engines subjected to time-varying parameter uncertainties

    Science.gov (United States)

    Pourbabaee, Bahareh; Meskin, Nader; Khorasani, Khashayar

    2016-08-01

    In this paper, a novel robust sensor fault detection and isolation (FDI) strategy using the multiple model-based (MM) approach is proposed that remains robust with respect to both time-varying parameter uncertainties and process and measurement noise in all the channels. The scheme is composed of robust Kalman filters (RKF) constructed for multiple piecewise linear (PWL) models obtained at various operating points of an uncertain nonlinear system. The parameter uncertainty is modeled by using a time-varying norm bounded admissible structure that affects all the PWL state space matrices. The robust Kalman filter gain matrices are designed by solving two algebraic Riccati equations (AREs) that are expressed as two linear matrix inequality (LMI) feasibility conditions. The proposed multiple RKF-based FDI scheme is simulated for a single spool gas turbine engine to diagnose various sensor faults despite the presence of parameter uncertainties, process and measurement noise. Our comparative studies confirm the superiority of our proposed FDI method when compared to the methods that are available in the literature.

  2. Operational modal analysis modeling, Bayesian inference, uncertainty laws

    CERN Document Server

    Au, Siu-Kui

    2017-01-01

    This book presents operational modal analysis (OMA), employing a coherent and comprehensive Bayesian framework for modal identification and covering stochastic modeling, theoretical formulations, computational algorithms, and practical applications. Mathematical similarities and philosophical differences between Bayesian and classical statistical approaches to system identification are discussed, allowing their mathematical tools to be shared and their results correctly interpreted. Many chapters can be used as lecture notes for the general topic they cover beyond the OMA context. After an introductory chapter (1), Chapters 2–7 present the general theory of stochastic modeling and analysis of ambient vibrations. Readers are first introduced to the spectral analysis of deterministic time series (2) and structural dynamics (3), which do not require the use of probability concepts. The concepts and techniques in these chapters are subsequently extended to a probabilistic context in Chapter 4 (on stochastic pro...

  3. METHODICAL ENSURING ELECTRONIC SUBJECT ANALYSIS OF DOCUMENTS: FEATURES OF EDITING SUBJECT HEADINGS IN ABIS ABSOTHEQUE UNICODE

    Directory of Open Access Journals (Sweden)

    Т. М. Бикова

    2016-03-01

    Full Text Available The purpose of our article is to consider questions of electronic subject analysis of documents and the methodical support of editing subject headings in the electronic catalog. The main objective is to present a technique for editing the dictionary of subject headings and to study and apply this technique in the work of libraries of higher education institutions. The object of research is the thesaurus of subject headings of the electronic catalog of the Scientific Library of Odessa I. I. Mechnikov National University. Improving the efficiency and quality of the search capabilities of the electronic catalog requires constant work on its optimization, that is, technical editing of subject headings and the opening of new subject headings and subheadings. The Scientific Library has developed and put into practice an instruction that regulates the technique of editing subject headings and standardizes this process. The main outcome of the work is an improved level of bibliographic service for users and a more rational workflow for systematizers. The research findings have practical value for library staff.

  4. Uncertainty Analysis of the Potential Hazard of MCCI during Severe Accidents for the CANDU6 Plant

    Directory of Open Access Journals (Sweden)

    Sooyong Park

    2015-01-01

    Full Text Available This paper illustrates the application of a severe accident analysis computer program to the uncertainty analysis of molten corium-concrete interaction (MCCI) phenomena in severe accidents in a CANDU6-type plant. The potential hazard of MCCI is a failure of the reactor building owing to the possibility of a calandria vault floor melt-through even if the containment filtered vent system is operated. Meanwhile, MCCI still carries large uncertainties in several phenomena, such as the melt spreading area and the extent of water ingression into a continuous debris layer. The purpose of this study is to evaluate the MCCI in the calandria vault floor via an uncertainty analysis using the ISAAC program for the CANDU6.

  5. Using finite mixture models in thermal-hydraulics system code uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlos, S., E-mail: scarlos@iqn.upv.es [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Sánchez, A. [Department d’Estadística Aplicada i Qualitat, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Ginestar, D. [Department de Matemàtica Aplicada, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Martorell, S. [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain)

    2013-09-15

    Highlights: • Best estimate codes simulation needs uncertainty quantification. • The output variables can present multimodal probability distributions. • The analysis of multimodal distribution is performed using finite mixture models. • Two methods to reconstruct output variable probability distribution are used. -- Abstract: Nuclear Power Plant safety analysis is mainly based on the use of best estimate (BE) codes that predict the plant behavior under normal or accidental conditions. As the BE codes introduce uncertainties due to uncertainty in input parameters and modeling, it is necessary to perform uncertainty assessment (UA), and eventually sensitivity analysis (SA), of the results obtained. These analyses are part of the appropriate treatment of uncertainties imposed by current regulation based on the adoption of the best estimate plus uncertainty (BEPU) approach. The most popular approach for uncertainty assessment, based on Wilks’ method, obtains a tolerance/confidence interval, but it does not completely characterize the output variable behavior, which is required for an extended UA and SA. However, the development of standard UA and SA impose high computational cost due to the large number of simulations needed. In order to obtain more information about the output variable and, at the same time, to keep computational cost as low as possible, there has been a recent shift toward developing metamodels (model of model), or surrogate models, that approximate or emulate complex computer codes. In this way, there exist different techniques to reconstruct the probability distribution using the information provided by a sample of values as, for example, the finite mixture models. In this paper, the Expectation Maximization and the k-means algorithms are used to obtain a finite mixture model that reconstructs the output variable probability distribution from data obtained with RELAP-5 simulations. Both methodologies have been applied to a separated
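
    A small illustration of the finite-mixture idea using scikit-learn's EM implementation: a bimodal "code output" sample is fitted with Gaussian mixtures of increasing order, and the number of components is chosen by BIC. The bimodal sample is synthetic, not RELAP5 output, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)

# Synthetic bimodal "output variable" (e.g. a peak temperature whose mode depends on
# whether a valve recloses) - purely illustrative numbers, not RELAP5 results.
y = np.concatenate([rng.normal(620.0, 8.0, 70), rng.normal(680.0, 12.0, 30)])
Y = y.reshape(-1, 1)

# Fit mixtures with 1..4 components (EM algorithm) and keep the lowest-BIC model.
fits = [GaussianMixture(n_components=k, random_state=0).fit(Y) for k in range(1, 5)]
best = min(fits, key=lambda g: g.bic(Y))

print("chosen number of components:", best.n_components)
for w, m, v in zip(best.weights_, best.means_.ravel(), best.covariances_.ravel()):
    print(f"  weight {w:.2f}, mean {m:6.1f}, std {np.sqrt(v):5.1f}")
```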

  6. A new algorithm for five-hole probe calibration, data reduction, and uncertainty analysis

    Science.gov (United States)

    Reichert, Bruce A.; Wendt, Bruce J.

    1994-01-01

    A new algorithm for five-hole probe calibration and data reduction using a non-nulling method is developed. The significant features of the algorithm are: (1) two components of the unit vector in the flow direction replace pitch and yaw angles as flow direction variables; and (2) symmetry rules are developed that greatly simplify Taylor's series representations of the calibration data. In data reduction, four pressure coefficients allow total pressure, static pressure, and flow direction to be calculated directly. The new algorithm's simplicity permits an analytical treatment of the propagation of uncertainty in five-hole probe measurement. The objectives of the uncertainty analysis are to quantify uncertainty of five-hole results (e.g., total pressure, static pressure, and flow direction) and determine the dependence of the result uncertainty on the uncertainty of all underlying experimental and calibration measurands. This study outlines a general procedure that other researchers may use to determine five-hole probe result uncertainty and provides guidance to improve measurement technique. The new algorithm is applied to calibrate and reduce data from a rake of five-hole probes. Here, ten individual probes are mounted on a single probe shaft and used simultaneously. Use of this probe is made practical by the simplicity afforded by this algorithm.

  7. Measuring the Flexural Strength of Ceramics at Elevated Temperatures – An Uncertainty Analysis

    Directory of Open Access Journals (Sweden)

    Štubňa I.

    2014-02-01

    Full Text Available The flexural mechanical strength was measured at room and elevated temperatures on green ceramic samples made from a quartz electroporcelain mixture. The apparatus exploited the three-point-bending mechanical arrangement and a magazine for 10 samples, which is convenient for measurements at temperatures from 20 °C to 1000 °C. A description of the apparatus from the point of view of possible sources of uncertainty is also given. The uncertainty analysis, taking into account thermal expansion of the sample and of the span between the supports, is performed for 600 °C. Friction between the sample and the supports, as well as friction between mechanical parts of the apparatus, is also considered. The value of the mechanical strength at the temperature of 600 °C is 13.23 ± 0.50 MPa, where the second term is an expanded standard uncertainty. This uncertainty is mostly caused by inhomogeneities in the measured samples. The biggest part of the uncertainty arises from the repeatability of the loading force, which reflects the scatter of the sample properties. The influence of the temperature on the uncertainty value is very small.
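
    For reference, the three-point-bending strength formula and a first-order (GUM-type) combination of the main input uncertainties can be written out as below; the force, span, and cross-section values are illustrative, not the measured electroporcelain data.

```python
import numpy as np

# Three-point bending flexural strength: sigma = 3 F L / (2 b h^2).
# Illustrative nominal values and standard uncertainties (not the paper's data).
F, uF = 27.5, 0.9          # breaking force, N (scatter dominates)
L, uL = 80.0e-3, 0.2e-3    # span between supports, m
b, ub = 10.0e-3, 0.05e-3   # sample width, m
h, uh = 5.0e-3, 0.05e-3    # sample height, m

sigma = 3.0 * F * L / (2.0 * b * h**2)

# First-order propagation: relative sensitivity coefficients are 1, 1, -1 and -2.
u_rel = np.sqrt((uF / F) ** 2 + (uL / L) ** 2 + (ub / b) ** 2 + (2.0 * uh / h) ** 2)
u_sigma = sigma * u_rel

print(f"flexural strength             = {sigma / 1e6:.2f} MPa")
print(f"combined standard uncertainty = {u_sigma / 1e6:.2f} MPa (k = 1)")
```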

  8. Uncertainties analysis for the plutonium dosimetry model, doses-2005, using Mayak bioassay data.

    Science.gov (United States)

    Bess, John D; Krahenbuhl, Melinda P; Miller, Scott C; Slaughter, David M; Khokhryakov, Viktor V; Khokhryakov, Valentin F; Suslova, Klara G; Vostrotin, Vadim V

    2007-09-01

    The Doses-2005 model is a combination of the International Commission on Radiological Protection (ICRP) models modified using data from the Mayak Production Association cohort. Surrogate doses from inhaled plutonium can be assigned to approximately 29% of the Mayak workers using their urine bioassay measurements and other history records. The purpose of this study was to quantify and qualify the uncertainties in the estimates for radiation doses calculated with the Doses-2005 model by using Monte Carlo methods and perturbation theory. The average uncertainty in the yearly dose estimates for most organs was approximately 100% regardless of the transportability classification. The relative source of the uncertainties comes from three main sources: 45% from the urine bioassay measurements, 29% from the Doses-2005 model parameters, and 26% from the reference masses for the organs. The most significant reduction in the overall dose uncertainties would result from improved methods in bioassay measurement with additional improvements generated through further model refinement. Additional uncertainties were determined for dose estimates resulting from changes in the transportability classification and the smoking toggle. A comparison was performed to determine the effect of using the model with data from either urine bioassay or autopsy data; no direct correlation could be established. Analysis of the model using autopsy data and incorporation of results from other research efforts that have utilized plutonium ICRP models could improve the Doses-2005 model and reduce the overall uncertainty in the dose estimates.

  9. Uncertainty of site amplification derived from ground response analysis

    OpenAIRE

    Afshari, K; Stewart, JP

    2015-01-01

    Site-specific geotechnical ground response analyses (GRAs) are typically performed to evaluate stress and strain demands within soil profiles and/or to improve the estimation of site response relative to generic site terms from empirical prediction equations. Implementation of GRA results in probabilistic seismic hazard analysis (PSHA) requires knowledge of the mean and standard deviation of site amplification from GRA. We provide expressions for evaluating within-event standard deviations of...

  10. Implementation of a Bayesian Engine for Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to shared and secure high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof-of-concept, we have implemented an advanced open source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the “OpenBUGS Scripter”, has been implemented as a client-side, visual, web-based integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.

  11. Coupled Monte Carlo simulation and Copula theory for uncertainty analysis of multiphase flow simulation models.

    Science.gov (United States)

    Jiang, Xue; Na, Jin; Lu, Wenxi; Zhang, Yu

    2017-11-01

    Simulation-optimization techniques are effective in identifying an optimal remediation strategy. Simulation models with uncertainty, primarily in the form of parameter uncertainty with different degrees of correlation, influence the reliability of the optimal remediation strategy. In this study, an approach coupling Monte Carlo simulation and Copula theory is proposed for uncertainty analysis of a simulation model when parameters are correlated. Using the self-adaptive weight particle swarm optimization Kriging method, a surrogate model was constructed to replace the simulation model and reduce the computational burden and time consumption resulting from repeated and multiple Monte Carlo simulations. The Akaike information criterion (AIC) and the Bayesian information criterion (BIC) were employed to identify whether the t Copula function or the Gaussian Copula is the optimal Copula function to match the relevant structure of the parameters. The results show that both the AIC and BIC values of the t Copula function are less than those of the Gaussian Copula function. This indicates that the t Copula function is the optimal function for matching the relevant structure of the parameters. The outputs of the simulation model when parameter correlation was considered and when it was ignored were compared. The results show that the amplitude of the fluctuation interval when parameter correlation was considered is less than the corresponding amplitude when parameter correlation was ignored. Moreover, it was demonstrated that considering the correlation among parameters is essential for uncertainty analysis of a simulation model, and the results of uncertainty analysis should be incorporated into the remediation strategy optimization process.
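
    As a sketch of the sampling step only, the snippet below draws correlated parameter values through a Gaussian copula and pushes them through a stand-in surrogate model. The marginal distributions, the rank correlation, and the surrogate function are illustrative assumptions; the Kriging surrogate and the AIC/BIC comparison of t versus Gaussian copulas from the study are not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical correlation between two model parameters (values illustrative only).
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])

# Gaussian copula: correlated standard normals -> uniforms -> target marginals.
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
u = stats.norm.cdf(z)
conductivity = stats.lognorm(s=0.5, scale=10.0).ppf(u[:, 0])   # hypothetical marginal
porosity = stats.beta(a=4.0, b=12.0).ppf(u[:, 1])              # hypothetical marginal

def surrogate(k, phi):
    # Stand-in for a surrogate of the multiphase flow model (not the real one).
    return 0.8 * np.log(k) + 5.0 * phi

out = surrogate(conductivity, porosity)
print("Spearman rho of sampled inputs:", stats.spearmanr(conductivity, porosity)[0])
print("5th-95th percentile of surrogate output:", np.percentile(out, [5, 95]))
```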

  12. Bayesian uncertainty analysis compared with the application of the GUM and its supplements

    Science.gov (United States)

    Elster, Clemens

    2014-08-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) has proven to be a major step towards the harmonization of uncertainty evaluation in metrology. Its procedures contain elements from both classical and Bayesian statistics. The recent supplements 1 and 2 to the GUM appear to move the guidelines towards the Bayesian point of view, and they produce a probability distribution that shall encode one's state of knowledge about the measurand. In contrast to a Bayesian uncertainty analysis, however, Bayes' theorem is not applied explicitly. Instead, a distribution is assigned for the input quantities which is then ‘propagated’ through a model that relates the input quantities to the measurand. The resulting distribution for the measurand may coincide with a distribution obtained by the application of Bayes' theorem, but this is not true in general. The relation between a Bayesian uncertainty analysis and the application of the GUM and its supplements is investigated. In terms of a simple example, similarities and differences in the approaches are illustrated. Then a general class of models is considered and conditions are specified for which the distribution obtained by supplement 1 to the GUM is equivalent to a posterior distribution resulting from the application of Bayes' theorem. The corresponding prior distribution is identified and assessed. Finally, we briefly compare the GUM approach with a Bayesian uncertainty analysis in the context of regression problems.
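
    The "propagation of distributions" described here can be illustrated with a few lines of Monte Carlo in the spirit of GUM Supplement 1. The measurement model Y = X1/X2 and the input distributions below are hypothetical; note that, as the abstract stresses, no application of Bayes' theorem is involved.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 200_000  # number of Monte Carlo trials

# State-of-knowledge distributions assigned to the input quantities
# (a hypothetical measurement model Y = X1 / X2; values are illustrative).
x1 = rng.normal(loc=10.0, scale=0.05, size=M)      # e.g. a measured voltage
x2 = rng.uniform(low=1.98, high=2.02, size=M)      # e.g. a resistance from a certificate

# Propagation of distributions: push the samples through the model.
y = x1 / x2

estimate = np.mean(y)
std_unc = np.std(y, ddof=1)
lo, hi = np.percentile(y, [2.5, 97.5])             # 95 % coverage interval
print(f"y = {estimate:.4f}, u(y) = {std_unc:.4f}, 95 % interval = [{lo:.4f}, {hi:.4f}]")
# No Bayes' theorem is applied: the distribution for Y follows purely from the
# distributions assigned to X1 and X2 and the model Y = X1/X2.
```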

  13. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bostelmann, F. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties) extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on

  14. Uncertainty modeling in vibration, control and fuzzy analysis of structural systems

    CERN Document Server

    Halder, Achintya; Ayyub, Bilal M

    1997-01-01

    This book gives an overview of the current state of uncertainty modeling in vibration, control, and fuzzy analysis of structural and mechanical systems. It is a coherent compendium written by leading experts and offers the reader a sampling of exciting research areas in several fast-growing branches in this field. Uncertainty modeling and analysis are becoming an integral part of system definition and modeling in many fields. The book consists of ten chapters that report the work of researchers, scientists and engineers on theoretical developments and diversified applications in engineering sy

  15. Uncertainty analysis of signal deconvolution using a measured instrument response function

    Science.gov (United States)

    Hartouni, E. P.; Beeman, B.; Caggiano, J. A.; Cerjan, C.; Eckart, M. J.; Grim, G. P.; Hatarik, R.; Moore, A. S.; Munro, D. H.; Phillips, T.; Sayre, D. B.

    2016-11-01

    A common analysis procedure maximizes the ln-likelihood (equivalently, minimizes the negative ln-likelihood) that a set of experimental observables matches a parameterized model of the observation. The model includes a description of the underlying physical process as well as the instrument response function (IRF). In the case investigated here, the National Ignition Facility (NIF) neutron time-of-flight (nTOF) spectrometers, the IRF is constructed from measurements and models. IRF measurements have a finite precision that can contribute significantly to the uncertainty estimate of the physical model's parameters. We apply a Bayesian analysis to properly account for IRF uncertainties in calculating the ln-likelihood function used to find the optimum physical parameters.
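
    A toy version of the forward-fit step is sketched below: a parameterized signal is convolved with a measured IRF and the parameters are found by minimizing the negative ln-likelihood under Gaussian noise. The signal shape, IRF, and noise model are assumptions for illustration; the Bayesian treatment of IRF uncertainty described in the abstract is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
t = np.linspace(0.0, 50.0, 400)

def gaussian(t, mu, sigma):
    return np.exp(-0.5 * ((t - mu) / sigma) ** 2)

# Hypothetical measured IRF (a broadened Gaussian) and synthetic "data".
irf = gaussian(t, 5.0, 1.5)
irf /= irf.sum()
true_signal = 100.0 * gaussian(t, 20.0, 3.0)
noise_sigma = 1.0
data = np.convolve(true_signal, irf, mode="same") + rng.normal(0.0, noise_sigma, t.size)

def neg_log_likelihood(params):
    amp, mu, sigma = params
    model = np.convolve(amp * gaussian(t, mu, sigma), irf, mode="same")
    # Gaussian-noise negative ln-likelihood, up to an additive constant.
    return 0.5 * np.sum(((data - model) / noise_sigma) ** 2)

fit = minimize(neg_log_likelihood, x0=[80.0, 18.0, 2.0], method="Nelder-Mead")
print("fitted (amplitude, centre, width):", fit.x)
```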

  16. Uncertainty analysis for absorbed dose from a brain receptor imaging agent

    Energy Technology Data Exchange (ETDEWEB)

    Aydogan, B.; Miller, L.F. [Univ. of Tennessee, Knoxville, TN (United States). Nuclear Engineering Dept.; Sparks, R.B. [Oak Ridge Inst. for Science and Education, TN (United States); Stubbs, J.B. [Radiation Dosimetry Systems of Oak Ridge, Inc., Knoxville, TN (United States)

    1999-01-01

    Absorbed dose estimates are known to contain uncertainties. A recent literature search indicates that prior to this study no rigorous investigation of uncertainty associated with absorbed dose has been undertaken. A method of uncertainty analysis for absorbed dose calculations has been developed and implemented for the brain receptor imaging agent {sup 123}I-IPT. The two major sources of uncertainty considered were the uncertainty associated with the determination of residence time and that associated with the determination of the S values. There are many sources of uncertainty in the determination of the S values, but only the inter-patient organ mass variation was considered in this work. The absorbed dose uncertainties were determined for lung, liver, heart and brain. Ninety-five percent confidence intervals of the organ absorbed dose distributions for each patient and for a seven-patient population group were determined by the "Latin Hypercube Sampling" method. For an individual patient, the upper bound of the 95% confidence interval of the absorbed dose was found to be about 2.5 times larger than the estimated mean absorbed dose. For the seven-patient population the upper bound of the 95% confidence interval of the absorbed dose distribution was around 45% more than the estimated population mean. For example, the 95% confidence interval of the population liver dose distribution was found to be between 1.49E+07 Gy/MBq and 4.65E+07 Gy/MBq with a mean of 2.52E+07 Gy/MBq. This study concluded that patients in a population receiving {sup 123}I-IPT could receive absorbed doses as much as twice as large as the standard estimated absorbed dose due to these uncertainties.
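
    Latin Hypercube Sampling of the kind used here can be reproduced with scipy's quasi-Monte Carlo module. The sketch below propagates assumed lognormal uncertainties in residence time and organ mass through a simplified MIRD-style dose relation; all distributions and reference values are hypothetical, not the 123I-IPT data of the study.

```python
import numpy as np
from scipy.stats import qmc, lognorm

sampler = qmc.LatinHypercube(d=2, seed=7)
u = sampler.random(n=5000)                       # LHS points on the unit square

# Hypothetical lognormal uncertainties (geometric spreads are illustrative only).
residence_time = lognorm(s=0.4, scale=2.0).ppf(u[:, 0])   # h
organ_mass = lognorm(s=0.3, scale=1.8).ppf(u[:, 1])       # kg

# Simplified MIRD-style dose: D = A_tilde * S, with S taken inversely
# proportional to organ mass (reference S value is hypothetical).
S_ref, mass_ref = 1.0e-4, 1.8                    # Gy/(MBq h), kg
dose = residence_time * S_ref * (mass_ref / organ_mass)

mean = dose.mean()
lo, hi = np.percentile(dose, [2.5, 97.5])
print(f"mean dose {mean:.2e} Gy/MBq, 95 % interval [{lo:.2e}, {hi:.2e}]")
print(f"upper bound is {hi / mean:.2f} x the mean estimate")
```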

  17. Abatement cost uncertainty and policy instrument selection. A dynamic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bosetti, V.; Markandya, A. [Fondazione Eni Enrico Mattei (Italy)]; Golub, A. [Environmental Defense (Italy)]; Massetti, E.; Tavoni, M. [Fondazione Eni Enrico Mattei (Italy); Universita Cattolica del Sacro Cuore (Italy)]

    2007-07-01

    This paper aims at investigating the relative economic and environmental outcomes of price versus quantity mechanisms in controlling greenhouse gas (GHG) emissions when abatement costs are uncertain. In particular, the authors evaluate the impacts on policy costs, CO{sub 2} emissions and energy R&D for a mitigation target of 550 ppmv CO{sub 2} equivalent (i.e. 450 for CO{sub 2} only) concentrations. The analysis is performed in an optimal growth framework using the integrated assessment model WITCH (World Induced Technical Change Hybrid).

  18. Content analysis of subjective experiences in partial epileptic seizures.

    Science.gov (United States)

    Johanson, Mirja; Valli, Katja; Revonsuo, Antti; Wedlund, Jan-Eric

    2008-01-01

    A new content analysis method for systematically describing the phenomenology of subjective experiences in connection with partial epileptic seizures is described. Forty patients provided 262 descriptions of subjective experience relative to their partial epileptic seizures. The results revealed that subjective experiences during seizures consist mostly of sensory and bodily sensations, hallucinatory experiences, and thinking. The majority of subjective experiences during seizures are bizarre and distorted; nevertheless, the patients are able to engage in adequate behavior. To the best of our knowledge, this is the first study for which detailed subjective seizure descriptions were collected immediately after each seizure and the first study in which the content of verbal reports of subjective experiences during seizures, including both the ictal and postictal experiences, has been analyzed in detail.

  19. Measurement uncertainty from validation and duplicate analysis results in HPLC analysis of multivitamin preparations and nutrients with different galenic forms.

    Science.gov (United States)

    De Beer, J O; Baten, P; Nsengyumva, C; Smeyers-Verbeke, J

    2003-08-08

    An approach to calculate the measurement uncertainty in the HPLC analysis of several hydro- and liposoluble vitamins in multivitamin preparations with different galenic composition and properties is described. In the first instance it is examined whether duplicate analysis results, obtained with a fully validated analysis method on different lots of an effervescent tablet preparation spread over several points of time, can contribute to calculating the measurement uncertainty of the HPLC method used, and whether the established uncertainty is acceptable in the assessment of compliance with the legal content limits. Analysis of variance (ANOVA) and precision calculations, based on the ISO 5725-2 norm, are applied to the analysis results obtained to estimate the precision components necessary to derive the measurement uncertainty. In the second instance it is demonstrated to what extent the fully validated method of analysis for effervescent tablets is applicable to other galenic forms such as capsules with oily emulsions, tablets, coated tablets, oral solutions, etc., and which specific modifications in the analysis steps are involved. By means of duplicate analysis results, acquired from a large series of real samples over a considerable period of time and classified according to their similarity in content, galenic forms and matrices, estimates of the measurement uncertainty are presented.
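
    The core precision calculation from duplicates can be sketched briefly. The example below computes a repeatability standard deviation from paired results and a simple intermediate-precision estimate in the spirit of ISO 5725-2; the duplicate assay values and the k = 2 coverage factor are illustrative assumptions only.

```python
import numpy as np

# Hypothetical duplicate assay results (% of declared vitamin content) for
# several lots of an effervescent-tablet preparation, measured over time.
duplicates = np.array([
    [99.2, 100.1],
    [101.4, 100.8],
    [98.7, 99.5],
    [100.9, 101.6],
    [99.8, 99.1],
    [100.3, 101.0],
])

d = duplicates[:, 0] - duplicates[:, 1]          # within-lot differences
n = duplicates.shape[0]

# Repeatability standard deviation from duplicates: s_r^2 = sum(d^2) / (2 n)
s_r = np.sqrt(np.sum(d ** 2) / (2 * n))

# Between-lot variance from the spread of lot means (simple ANOVA estimate),
# then an intermediate-precision standard deviation for a single result.
lot_means = duplicates.mean(axis=1)
s_L_sq = max(lot_means.var(ddof=1) - s_r ** 2 / 2, 0.0)
s_I = np.sqrt(s_L_sq + s_r ** 2)

print(f"s_r = {s_r:.2f} %, s_I = {s_I:.2f} %, expanded U (k=2) = {2 * s_I:.2f} %")
```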

  20. Uncertainty and sensitivity analysis of applications of the PARK program system; Unsicherheits- und Sensitivitaetsanalyse von Anwendungen des Programmsystems PARK

    Energy Technology Data Exchange (ETDEWEB)

    Luczak-Urlik, D.; Bleher, M.; Jacob, P.; Mueller, H. [GSF - Forschungszentrum fuer Umwelt und Gesundheit Neuherberg GmbH, Oberschleissheim (Germany). Inst. fuer Strahlenschutz]; Hofer, E.; Krzykacz, B. [Gesellschaft fuer Reaktorsicherheit mbH (Germany)]

    1997-11-01

    The program package for assessment and abatement of off-site radiological consequences of nuclear accidents (PARK) establishes radiologic analyses and prognoses on the basis of the measured data obtained by the IMIS system for ambient radiation monitoring. A probabilistic uncertainty and sensitivity analysis has been performed of the PARK-based mean specific activities of fodder and food and the potential radiation exposure of the population. Uncertainties in the model input parameters, i.e. in measured values and radioecologic parameters, were expressed via subjective probability distributions within the framework of various emission scenarios (nuclide spectra describing two release modes, dry and wet deposition, and at three times over a one-year period). The resulting output data distributions were determined by Monte Carlo simulations. As a rule, the results of the analyses depend on the selected emission scenario. Measured values proved to be the major source of uncertainties (especially the in-situ soil activity measurements), followed by the activity transfer data from fodder to food. For wet deposition, further uncertainties were added by the parameters used for determination of the interception factors. As the measuring stations cannot correlate their measured gamma dose rates with meteorologic data (precipitation), derived information relating to larger areas such as a "Kreis" (administrative district) is very much subject to uncertainties. The uncertainties of the PARK results increase with assessed increases in rainfall. (orig.) [German abstract:] The program system for the assessment and limitation of radiological consequences (PARK) produces radiological analyses and prognoses on the basis of measured data collected within the framework of the Integrated Measuring and Information System for the Surveillance of Environmental Radioactivity (IMIS). A probabilistic uncertainty and sensitivity analysis was performed for the mean specific activities calculated in PARK

  1. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    Science.gov (United States)

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.

  2. Using Predictive Uncertainty Analysis to Assess Hydrologic Model Performance for a Watershed in Oregon

    Science.gov (United States)

    Brannan, K. M.; Somor, A.

    2016-12-01

    A variety of statistics are used to assess watershed model performance but these statistics do not directly answer the question: what is the uncertainty of my prediction? Understanding predictive uncertainty is important when using a watershed model to develop a Total Maximum Daily Load (TMDL). TMDLs are a key component of the US Clean Water Act and specify the amount of a pollutant that can enter a waterbody when the waterbody meets water quality criteria. TMDL developers use watershed models to estimate pollutant loads from nonpoint sources of pollution. We are developing a TMDL for bacteria impairments in a watershed in the Coastal Range of Oregon. We set up an HSPF model of the watershed and used the calibration software PEST to estimate HSPF hydrologic parameters and then perform predictive uncertainty analysis of stream flow. We used Monte-Carlo simulation to run the model with 1,000 different parameter sets and assess predictive uncertainty. In order to reduce the chance of specious parameter sets, we accounted for the relationships among parameter values by using mathematically-based regularization techniques and an estimate of the parameter covariance when generating random parameter sets. We used a novel approach to select flow data for predictive uncertainty analysis. We set aside flow data that occurred on days that bacteria samples were collected. We did not use these flows in the estimation of the model parameters. We calculated a percent uncertainty for each flow observation based on 1,000 model runs. We also used several methods to visualize results with an emphasis on making the data accessible to both technical and general audiences. We will use the predictive uncertainty estimates in the next phase of our work, simulating bacteria fate and transport in the watershed.
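
    The parameter-sampling idea can be sketched independently of PEST and HSPF: draw parameter sets from a multivariate normal distribution built from the calibrated values and an estimated covariance, run the model for each set, and summarize the spread of each prediction. Everything below (the toy flow model, parameter values, covariance) is a hypothetical stand-in for the watershed model and its calibration.

```python
import numpy as np

rng = np.random.default_rng(11)

# Calibrated parameter values and an assumed parameter covariance (illustrative).
theta_hat = np.array([0.30, 1.50])               # e.g. runoff coefficient, exponent
cov = np.array([[0.0025, -0.0010],
                [-0.0010, 0.0100]])

theta_sets = rng.multivariate_normal(theta_hat, cov, size=1000)

def toy_flow_model(theta, rain):
    # Stand-in for the watershed model: flow = coeff * rain ** exponent.
    coeff, expo = theta
    return coeff * rain ** expo

rain = np.array([5.0, 12.0, 30.0, 55.0])         # hypothetical storm depths (mm)
flows = np.array([toy_flow_model(th, rain) for th in theta_sets])

median = np.median(flows, axis=0)
lo, hi = np.percentile(flows, [2.5, 97.5], axis=0)
percent_unc = 100.0 * (hi - lo) / median
for r, m, p in zip(rain, median, percent_unc):
    print(f"rain {r:5.1f} mm: median flow {m:7.2f}, 95 % band = {p:5.1f} % of median")
```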

  3. Uncertainty analysis of in-flight spectral calibration for hyperspectral imaging spectrometers

    Science.gov (United States)

    Zhao, Huijie; Geng, Ruonan; Jia, Guorui; Wang, Daming

    2017-10-01

    Hyperspectral imaging instrument performance, especially spectral response parameters, may change when the sensors work in-flight due to vibrations, temperature and pressure changes compared with the laboratory status. In order to derive valid information from imaging data, an accurate spectral calibration, accompanied by an uncertainty analysis of the data, must be made. The purpose of this work is to present a process to estimate the uncertainties of in-flight spectral calibration parameters by analyzing the sources of uncertainty and calculating their sensitivity coefficients. In the in-flight spectral calibration method, the band-center and bandwidth determinations are made by correlating the in-flight sensor measured radiance with reference radiance. In this procedure, the uncertainty analysis is conducted separately for three factors: (a) the radiance calculated from imaging data; (b) the reference data; (c) the matching process between the above two items. To obtain the final uncertainty, contributions due to every impact factor must be propagated through this process. Analyses have been made using the above process for the Hyperion data. The results show that the shift of the band-center in the oxygen absorption (about 762nm), compared with the value measured in the lab, is less than 0.9nm, with uncertainties ranging from 0.063nm to 0.183nm related to spatial distribution along the across-track direction of the image; the change of bandwidth is less than 1nm, with uncertainties ranging from 0.066nm to 0.166nm. These results verify the validity of the in-flight spectral calibration process.
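
    When the individual contributions and their sensitivity coefficients are available, they are typically combined in quadrature (the law of propagation of uncertainty for independent inputs). The budget below is purely illustrative; the component values are not the Hyperion numbers.

```python
import math

# Hypothetical uncertainty budget for an in-flight band-centre estimate (nm).
# Each entry: (standard uncertainty of the factor, sensitivity coefficient
# d(band-centre)/d(factor)); values are illustrative only.
budget = {
    "sensor radiance (noise, striping)": (0.50, 0.15),
    "reference/modelled radiance":       (0.30, 0.20),
    "spectral matching procedure":       (0.08, 1.00),
}

# Law of propagation of uncertainty for independent inputs:
#   u_c^2 = sum_i (c_i * u_i)^2
u_c = math.sqrt(sum((c * u) ** 2 for u, c in budget.values()))
for name, (u, c) in budget.items():
    print(f"{name:35s} contributes {abs(c * u):.3f} nm")
print(f"combined standard uncertainty of band centre: {u_c:.3f} nm")
```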

  4. Bayesian data analysis to quantify the uncertainty of intact rock strength

    Directory of Open Access Journals (Sweden)

    Luis Fernando Contreras

    2018-02-01

    Full Text Available One of the main difficulties in the geotechnical design process lies in dealing with uncertainty. Uncertainty is associated with natural variation of properties, and the imprecision and unpredictability caused by insufficient information on parameters or models. Probabilistic methods are normally used to quantify uncertainty. However, the frequentist approach commonly used for this purpose has some drawbacks. First, it lacks a formal framework for incorporating knowledge not represented by data. Second, it has limitations in providing a proper measure of the confidence of parameters inferred from data. The Bayesian approach offers a better framework for treating uncertainty in geotechnical design. The advantages of the Bayesian approach for uncertainty quantification are highlighted in this paper with the Bayesian regression analysis of laboratory test data to infer the intact rock strength parameters σci and mi used in the Hoek–Brown strength criterion. Two case examples are used to illustrate different aspects of the Bayesian methodology and to contrast the approach with a frequentist approach represented by the nonlinear least squares (NLLS method. The paper discusses the use of a Student's t-distribution versus a normal distribution to handle outliers, the consideration of absolute versus relative residuals, and the comparison of quality of fitting results based on standard errors and Bayes factors. Uncertainty quantification with confidence and prediction intervals of the frequentist approach is compared with that based on scatter plots and bands of fitted envelopes of the Bayesian approach. Finally, the Bayesian method is extended to consider two improvements of the fitting analysis. The first is the case in which the Hoek–Brown parameter, a, is treated as a variable to improve the fitting in the triaxial region. The second is the incorporation of the uncertainty in the estimation of the direct tensile strength from Brazilian test
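
    For contrast with the Bayesian treatment, the frequentist NLLS fit mentioned above can be sketched in a few lines with scipy, using the intact-rock form of the Hoek–Brown criterion (s = 1, a = 0.5). The triaxial data below are hypothetical, and the Student's t residual model and Bayes-factor comparisons of the paper are not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

def hoek_brown_intact(sigma3, sigma_ci, m_i):
    # Intact-rock Hoek-Brown criterion (s = 1, a = 0.5):
    #   sigma1 = sigma3 + sigma_ci * sqrt(m_i * sigma3 / sigma_ci + 1)
    return sigma3 + sigma_ci * np.sqrt(m_i * sigma3 / sigma_ci + 1.0)

# Hypothetical triaxial test results (MPa); not the data from the paper.
sigma3 = np.array([0.0, 2.0, 5.0, 10.0, 15.0, 20.0, 30.0])
sigma1 = np.array([92.0, 118.0, 141.0, 178.0, 204.0, 231.0, 272.0])

popt, pcov = curve_fit(hoek_brown_intact, sigma3, sigma1, p0=[80.0, 10.0])
perr = np.sqrt(np.diag(pcov))                    # approximate standard errors
print(f"sigma_ci = {popt[0]:.1f} +/- {perr[0]:.1f} MPa")
print(f"m_i      = {popt[1]:.1f} +/- {perr[1]:.1f}")
```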

  5. Event-scale power law recession analysis: quantifying methodological uncertainty

    Science.gov (United States)

    Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.

    2017-01-01

    The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship
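
    For concreteness, a common event-scale workflow fits the power-law recession model -dQ/dt = aQ^b to a single event by regression in log-log space. The sketch below uses synthetic flows and one particular set of methodological choices (a forward difference for -dQ/dt evaluated at the mean flow, ordinary least squares); these are exactly the kinds of choices whose influence the paper quantifies.

```python
import numpy as np

# Synthetic daily streamflow for a single recession event (mm/day); in practice
# these values would be extracted from an observed hydrograph.
q = np.array([8.0, 6.1, 4.9, 4.0, 3.4, 2.9, 2.5, 2.2, 1.95, 1.75])

# One common discretisation: -dQ/dt ~= (Q_t - Q_{t+1}) / dt, evaluated at the
# mean of the two flows (dt = 1 day).
dq_dt = q[:-1] - q[1:]
q_mid = 0.5 * (q[:-1] + q[1:])

# Fit log(-dQ/dt) = log(a) + b * log(Q) by ordinary least squares.
b, log_a = np.polyfit(np.log(q_mid), np.log(dq_dt), deg=1)
a = np.exp(log_a)
print(f"fitted recession parameters: a = {a:.3f}, b = {b:.2f}")
```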

  6. Risk assessment of the Groningen geothermal potential : From seismic to reservoir uncertainty using a discrete parameter analysis

    NARCIS (Netherlands)

    Daniilidis, Alexandros; Doddema, Leon; Herber, Rien

    2016-01-01

    Geothermal exploitation is subject to several uncertainties, even in settings with high data availability, adding to project risk. Uncertainty can stem from the reservoir's initial state, as well as from the geological and operational parameters. The interplay between these aspects entails

  7. Linking trading ratio with TMDL (total maximum daily load) allocation matrix and uncertainty analysis.

    Science.gov (United States)

    Zhang, H X

    2008-01-01

    An innovative approach for total maximum daily load (TMDL) allocation and implementation is the watershed-based pollutant trading. Given the inherent scientific uncertainty for the tradeoffs between point and nonpoint sources, setting of trading ratios can be a contentious issue and was already listed as an obstacle by several pollutant trading programs. One of the fundamental reasons that a trading ratio is often set higher (e.g. greater than 2) is to allow for uncertainty in the level of control needed to attain water quality standards, and to provide a buffer in case traded reductions are less effective than expected. However, most of the available studies did not provide an approach to explicitly address the determination of trading ratio. Uncertainty analysis has rarely been linked to determination of trading ratio. This paper presents a practical methodology in estimating "equivalent trading ratio (ETR)" and links uncertainty analysis with trading ratio determination from TMDL allocation process. Determination of ETR can provide a preliminary evaluation of "tradeoffs" between various combinations of point and nonpoint source control strategies on ambient water quality improvement. A greater portion of NPS load reduction in overall TMDL load reduction generally correlates with greater uncertainty and thus requires greater trading ratio. The rigorous quantification of trading ratio will enhance the scientific basis and thus public perception for more informed decision in overall watershed-based pollutant trading program. (c) IWA Publishing 2008.

  8. Uncertainty analysis of a model of wind-blown volcanic plumes.

    Science.gov (United States)

    Woodhouse, Mark J; Hogg, Andrew J; Phillips, Jeremy C; Rougier, Jonathan C

    Mathematical models of natural processes can be used as inversion tools to predict unobserved properties from measured quantities. Uncertainty in observations and model formulation impact on the efficacy of inverse modelling. We present a general methodology, history matching, that can be used to investigate the effect of observational and model uncertainty on inverse modelling studies. We demonstrate history matching on an integral model of volcanic plumes that is used to estimate source conditions from observations of the rise height of plumes during the eruptions of Eyjafjallajökull, Iceland, in 2010 and Grímsvötn, Iceland, in 2011. Sources of uncertainty are identified and quantified, and propagated through the integral plume model. A preliminary sensitivity analysis is performed to identify the uncertain model parameters that strongly influence model predictions. Model predictions are assessed against observations through an implausibility measure that rules out model inputs that are considered implausible given the quantified uncertainty. We demonstrate that the source mass flux at the volcano can be estimated from plume height observations, but the magmatic temperature, exit velocity and exsolved gas mass fraction cannot be accurately determined. Uncertainty in plume height observations and entrainment coefficients results in a large range of plausible values of the source mass flux. Our analysis shows that better constraints on entrainment coefficients for volcanic plumes and more precise observations of plume height are required to obtain tightly constrained estimates of the source mass flux.
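
    The implausibility step can be sketched generically: candidate inputs are ruled out when the mismatch between prediction and observation is large relative to the combined observation and model-discrepancy variance. The plume-height relation, variances, and threshold of 3 below are illustrative placeholders rather than the integral plume model or the quantified uncertainties of the study.

```python
import numpy as np

rng = np.random.default_rng(5)

def plume_height_km(mass_flux):
    # Placeholder for the integral plume model: an illustrative power-law
    # relation between source mass flux (kg/s) and rise height (km).
    return 0.5 * mass_flux ** 0.25

obs_height = 6.0          # observed rise height (km), hypothetical
var_obs = 0.5 ** 2        # observation uncertainty variance (km^2), assumed
var_model = 0.8 ** 2      # model discrepancy variance (km^2), assumed

candidates = 10.0 ** rng.uniform(3.0, 7.0, size=20_000)   # candidate mass fluxes (kg/s)
pred = plume_height_km(candidates)

# Implausibility: |observation - prediction| / sqrt(total variance); inputs with
# I(x) > 3 are ruled out as implausible (a conventional threshold).
implausibility = np.abs(obs_height - pred) / np.sqrt(var_obs + var_model)
plausible = candidates[implausibility <= 3.0]
print(f"{plausible.size} of {candidates.size} candidate mass fluxes remain plausible")
print(f"plausible mass flux range: {plausible.min():.2e} - {plausible.max():.2e} kg/s")
```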

  9. Uncertainty analysis of primary water pollutant control in China's pulp and paper industry.

    Science.gov (United States)

    Wen, Zong-guo; Di, Jing-han; Zhang, Xue-ying

    2016-03-15

    The total emission control target of water pollutants (e.g., COD and NH4-N) for a certain industrial sector can be predicted and analysed using the popular technology-based bottom-up modelling. However, this methodology has obvious uncertainty regarding the attainment of mitigation targets. The primary uncertainty comes from macro-production, the pollutant reduction roadmap, and technical parameters. This research takes the pulp and paper industry in China as an example, and builds 5 mitigation scenarios via different combinations of raw material structure, scale structure, procedure mitigation technology, and end-of-pipe treatment technology. Using the methodology of uncertainty analysis via Monte Carlo, random sampling was conducted over a hundred thousand times. Sensitive parameters that impact the total emission control targets, such as industrial output, technique structure, cleaner production technology, and end-of-pipe treatment technology, are discussed in this article. It appears that scenario uncertainty has a larger influence on COD emissions than on NH4-N, hence a looser total emission control target for COD is recommended to increase its feasibility and availability while maintaining the status quo of NH4-N. Consequently, from the uncertainty analysis, this research identifies the sensitive products, techniques, and technologies affecting industrial water pollution. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    Science.gov (United States)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-T distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied from a specified tolerance or bias error and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data is unavailable.
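
    A minimal version of the statistical step is shown below: treat the perturbed-input CFD results as a small sample and attach a Student-t based 95% interval to the mean heat transfer coefficient. The values are hypothetical, and the full verification procedure (grid studies, ranked input variations) is not reproduced.

```python
import numpy as np
from scipy import stats

# Hypothetical heat transfer coefficients (W/m^2/K) from CFD runs in which the
# inputs were perturbed within their stated tolerance / bias bounds.
h = np.array([48.2, 50.1, 47.5, 49.3, 51.0, 48.8, 49.9])

n = h.size
mean = h.mean()
sem = h.std(ddof=1) / np.sqrt(n)

# 95 % uncertainty interval using the Student-t distribution with n-1 degrees
# of freedom, appropriate for the small samples typical of such studies.
t_crit = stats.t.ppf(0.975, df=n - 1)
half_width = t_crit * sem
print(f"h = {mean:.1f} +/- {half_width:.1f} W/m^2/K (95 %, Student-t, n={n})")
```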

  11. Coherent uncertainty analysis of aerosol measurements from multiple satellite sensors

    Directory of Open Access Journals (Sweden)

    M. Petrenko

    2013-07-01

    Full Text Available Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS – altogether, a total of 11 different aerosol products – were comparatively analyzed using data collocated with ground-based aerosol observations from the Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006–2010 to available collocated AERONET measurements globally, regionally, and seasonally, and deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that can bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 7%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.8 for many of the analyzed products, while root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.07 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different land cover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most of the land cover types, multi-angle capabilities make MISR the only sensor to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface closed shrublands more accurately than the other sensors, while POLDER, which is the only one of the sensors capable of measuring polarized aerosols, outperforms other sensors in certain

  12. Satellite Test of the Equivalence Principle Uncertainty Analysis

    Science.gov (United States)

    Worden, Paul; Mester, John

    2009-12-01

    STEP, the Satellite Test of the Equivalence Principle, is intended to test the apparent equivalence of gravitational and inertial mass to 1 part in 10^18 (Worden et al. in Adv. Space Res. 25(6):1205-1208, 2000). This will be an increase of more than five orders of magnitude over ground-based experiments and lunar laser ranging observations (Su et al. in Phys. Rev. D 50:3614-3636, 1994; Williams et al. in Phys. Rev. D 53:6730-6739, 1996; Schlamminger et al. in Phys. Rev. Lett. 100:041101, 2008). It is essential to have a comprehensive and consistent model of the possible error sources in an experiment of this nature to be able to understand and set requirements, and to evaluate design trade-offs. In the following pages we describe existing software for such an error model and the application of this software to the STEP experiment. In particular we address several issues, including charge and patch effect forces, where our understanding has improved since the launch of GP-B owing to the availability of GP-B data and preliminary analysis results (Everitt et al. in Space Sci. Rev., 2009, this issue; Silbergleit et al. in Space Sci. Rev., 2009, this issue; Keiser et al. in Space Sci. Rev., 2009, this issue; Heifetz et al. in Space Sci. Rev., 2009, this issue; Muhlfelder et al. in Space Sci. Rev., 2009, this issue).

  13. Reliability and Robustness Analysis of the Masinga Dam under Uncertainty

    Directory of Open Access Journals (Sweden)

    Hayden Postle-Floyd

    2017-02-01

    Full Text Available Kenya’s water abstraction must meet the projected growth in municipal and irrigation demand by the end of 2030 in order to achieve the country’s industrial and economic development plan. The Masinga dam, on the Tana River, is the key to meeting this goal to satisfy the growing demands whilst also continuing to provide hydroelectric power generation. This study quantitatively assesses the reliability and robustness of the Masinga dam system under uncertain future supply and demand using probabilistic climate and population projections, and examines how long-term planning may improve the longevity of the dam. River flow and demand projections are used alongside each other as inputs to the dam system simulation model linked to an optimisation engine to maximise water availability. Water availability after demand satisfaction is assessed for future years, and the projected reliability of the system is calculated for selected years. The analysis shows that maximising power generation on a short-term year-by-year basis achieves 80%, 50% and 1% reliability by 2020, 2025 and 2030 onwards, respectively. Longer term optimal planning, however, has increased system reliability to up to 95% in 2020, 80% in 2025, and more than 40% in 2030 onwards. In addition, increasing the capacity of the reservoir by around 25% can significantly improve the robustness of the system for all future time periods. This study provides a platform for analysing the implication of different planning and management of Masinga dam and suggests that careful consideration should be given to account for growing municipal needs and irrigation schemes in both the immediate and the associated Tana River basin.

  14. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    Science.gov (United States)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in the hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes. They therefore give increased importance to climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, there is a need to assess these uncertainties. This is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods become available, there is a need to facilitate the climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, which combines a set of statistical downscaling methods including weather typing, weather generator, transfer function and advanced perturbation based approaches. By use of an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has been demonstrated so far for cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties on the future meteorological conditions are represented in two different ways: through an ensemble of time series, and a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods. For Belgium, for instance, use was made of 100-year time series of 10-minutes precipitation observations and daily

  15. Uncertainty analysis of the Measured Performance Rating (MPR) method. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    A report was commissioned by the New York State Energy Research and Development Authority and the Electric Power Research Institute to evaluate the uncertainties in the energy monitoring method known as measured performance rating (MPR). The work is intended to help further development of the MPR system by quantitatively analyzing the uncertainties in estimates of the heat loss coefficients and heating system efficiencies. The analysis indicates that the MPR should be able to detect as little as a 7 percent change in the heat loss coefficient at the 95 percent confidence level. MPR appears sufficiently robust for characterizing common weatherization treatments; e.g., increasing attic insulation from R-7 to R-19 in a typical single-story, 1,100 sq. ft. house resulting in a 19 percent reduction in heat loss coefficient. Furnace efficiency uncertainties ranged up to three times those of the heat loss coefficients. Measurement uncertainties (at the 95 percent confidence level) were estimated to be from 1 to 5 percent for heat loss coefficients and 1.5 percent for a typical furnace efficiency. The analysis also shows a limitation in applying MPR to houses with heating ducts in slabs on grade and to those with very large thermal mass. Most of the uncertainties encountered in the study were due more to the methods of estimating the "true" heat loss coefficients, furnace efficiency, and furnace fuel consumption (by collecting fuel bills and simulating two actual houses) than to the MPR approach. These uncertainties in the true parameter values become evidence for arguments in favor of the need for empirical measures of heat loss coefficient and furnace efficiency, like the MPR method, rather than arguments against.

  16. The uncertainty about the social cost of carbon: A decomposition analysis using fund

    NARCIS (Netherlands)

    Anthoff, D.; Tol, R.S.J.

    2013-01-01

    We report the results of an uncertainty decomposition analysis of the social cost of carbon as estimated by FUND, a model that has a more detailed representation of the economic impact of climate change than any other model. Some of the parameters particularly influence impacts in the short run

  17. Uncertainty analysis for complex watershed water quality models: the parameter identifiability problem

    Science.gov (United States)

    Han, F.; Zheng, Y.

    2012-12-01

    Watershed-scale water quality simulation using distributed models like the Soil and Water Assessment Tool (SWAT) usually involves significant uncertainty. The uncertainty needs to be appropriately quantified if the simulation is used to support management practices. Many uncertainty analysis (UA) approaches have been developed for watershed hydrologic models, but their applicability to watershed water quality models, which are more complex, has not been well investigated. This study applied a Markov chain Monte Carlo (MCMC) approach, the DiffeRential Evolution Adaptive Metropolis algorithm (DREAM), to the SWAT model. The sediment and total nitrogen pollution in the Newport Bay watershed (Southern California) was used as a case study. Different error assumptions were tested. The major findings include: 1) in the water quality simulation, many parameters are non-identifiable due to different causes; 2) the existence of non-identifiability seriously reduces the efficiency of the MCMC algorithm, and distorts the posterior distributions of the non-identifiable parameters, although the uncertainty band produced by the algorithm does not change much if enough samples are obtained. It was concluded that a sensitivity analysis (SA) followed by an identifiability analysis is necessary to reduce the non-identifiability and to enhance the applicability of a Bayesian UA approach to complex watershed water quality models. In addition, the analysis of the different causes of non-identifiability provides insights into model tradeoffs between complexity and performance.

  18. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    Science.gov (United States)

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  19. Uncertainty analysis of an irrigation scheduling model for water management in crop production

    Science.gov (United States)

    Irrigation scheduling tools are critical to allow producers to manage water resources for crop production in an accurate and timely manner. To be useful, these tools need to be accurate, complete, and relatively reliable. The current work presents the uncertainty analysis and its results for the Mis...

  20. Communicating uncertainty in cost-benefit analysis : A cognitive psychological perspective

    NARCIS (Netherlands)

    Mouter, N.; Holleman, M.; Calvert, S.C.; Annema, J.A.

    2013-01-01

    Based on a cognitive psychological theory, this paper aims to improve the communication of uncertainty in Cost-Benefit Analysis. The theory is based on different cognitive-personality and cognitive-social psychological constructs that may help explain individual differences in the processing of

  1. Good Modeling Practice for PAT Applications: Propagation of Input Uncertainty and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Eliasson Lantz, Anna

    2009-01-01

    The uncertainty and sensitivity analyses are evaluated for their usefulness as part of the model-building within Process Analytical Technology applications. A mechanistic model describing a batch cultivation of Streptomyces coelicolor for antibiotic production was used as a case study. The input

  2. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    Science.gov (United States)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

  3. Dissertation Defense Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics

  4. Guidelines for uncertainty analysis developed for the participants in the BIOMOVS II study

    Energy Technology Data Exchange (ETDEWEB)

    Baeverstam, U.; Davis, P.; Garcia-Olivares, A.; Henrich, E.; Koch, J

    1993-07-01

    This report has been produced to provide guidelines for uncertainty analysis for use by participants in the BIOMOVS II study. It is hoped that others with an interest in modelling contamination in the biosphere will also find it useful. The report has been prepared by members of the Uncertainty and Validation Working Group and has been reviewed by other BIOMOVS II participants. The opinions expressed are those of the authors and should not be taken to represent the views of the BIOMOVS II sponsors or other BIOMOVS II participating organisations.

  5. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhat, Kabekode Ghanasham [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  6. Focused Belief Measures for Uncertainty Quantification in High Performance Semantic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Weaver, Jesse R.

    2013-08-13

    In web-scale semantic data analytics there is a great need for methods which aggregate uncertainty claims, on the one hand respecting the information provided as accurately as possible, while on the other still being tractable. Traditional statistical methods are more robust, but only represent distributional, additive uncertainty. Generalized information theory methods, including fuzzy systems and Dempster-Shafer (DS) evidence theory, represent multiple forms of uncertainty, but are computationally and methodologically difficult. We require methods which provide an effective balance between the complete representation of the full complexity of uncertainty claims in their interaction, while satisfying the needs of both computational complexity and human cognition. Here we build on Jøsang's subjective logic to posit methods in focused belief measures (FBMs), where a full DS structure is focused to a single event. The resulting ternary logical structure is posited to be able to capture the minimal amount of generalized complexity needed at a maximum of computational efficiency. We demonstrate the efficacy of this approach in a web ingest experiment over the 2012 Billion Triple dataset from the Semantic Web Challenge.
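
    One common way to "focus" a Dempster-Shafer mass assignment onto a single event is to read off belief (mass committed entirely to the event), disbelief (mass committed to disjoint sets), and uncertainty (the remainder), which yields the ternary structure familiar from Jøsang's subjective logic. The frame and mass values below are hypothetical, and this is only a sketch of the general idea rather than the paper's specific FBM construction.

```python
# Hypothetical frame of discernment and basic mass assignment over its subsets
# (e.g. candidate entity types for a semantic-web triple); values are illustrative.
frame = frozenset({"person", "organization", "place"})
mass = {
    frozenset({"person"}): 0.5,
    frozenset({"organization"}): 0.1,
    frozenset({"person", "organization"}): 0.2,
    frozenset(frame): 0.2,
}

def focus(mass, event):
    """Focus a DS mass assignment on a single event: belief collects mass
    committed entirely to the event, disbelief collects mass committed to
    sets disjoint from it, and the remainder is uncertainty."""
    belief = sum(m for s, m in mass.items() if s <= event)
    disbelief = sum(m for s, m in mass.items() if not (s & event))
    uncertainty = 1.0 - belief - disbelief
    return belief, disbelief, uncertainty

b, d, u = focus(mass, frozenset({"person"}))
print(f"opinion about 'person': belief={b:.2f}, disbelief={d:.2f}, uncertainty={u:.2f}")
```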

  7. Analysis of a Nonlinear Aeroelastic System with Parametric Uncertainties Using Polynomial Chaos Expansion

    Directory of Open Access Journals (Sweden)

    Ajit Desai

    2010-01-01

    Full Text Available Aeroelastic stability remains an important concern for the design of modern structures such as wind turbine rotors, more so with the use of increasingly flexible blades. A nonlinear aeroelastic system has been considered in the present study with parametric uncertainties. Uncertainties can occur due to any inherent randomness in the system or modeling limitations, and so forth. Uncertainties can play a significant role in the aeroelastic stability predictions in a nonlinear system. The analysis has been put in a stochastic framework, and the propagation of system uncertainties has been quantified in the aeroelastic response. A spectral uncertainty quantification tool called Polynomial Chaos Expansion has been used. A projection-based nonintrusive Polynomial Chaos approach is shown to be much faster than its classical Galerkin method based counterpart. Traditional Monte Carlo Simulation is used as a reference solution. Effect of system randomness on the bifurcation behavior and the flutter boundary has been presented. Stochastic bifurcation results and bifurcation of probability density functions are also discussed.
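
    A minimal non-intrusive PCE can be set up by regression: sample the random input, evaluate a (placeholder) model, and least-squares fit the coefficients of a Hermite-chaos basis; the mean and variance then follow directly from the coefficients. The response function below is a stand-in, not the nonlinear aeroelastic system of the paper.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

rng = np.random.default_rng(8)

def model(xi):
    # Placeholder nonlinear response to a standard-normal input (not the
    # aeroelastic equations): a cubic with an offset.
    return 1.0 + 0.8 * xi - 0.3 * xi ** 2 + 0.1 * xi ** 3

# Non-intrusive PCE by regression: sample the input, evaluate the model, and
# least-squares fit the coefficients of the probabilists' Hermite basis.
degree = 4
xi = rng.standard_normal(500)
y = model(xi)
Psi = hermevander(xi, degree)                     # columns He_0 .. He_degree
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# For the probabilists' Hermite basis: mean = c0, variance = sum_{n>=1} c_n^2 * n!.
pce_mean = coeffs[0]
pce_var = sum(coeffs[n] ** 2 * factorial(n) for n in range(1, degree + 1))

mc = model(rng.standard_normal(200_000))          # Monte Carlo reference
print(f"PCE mean = {pce_mean:.4f}, variance = {pce_var:.4f}")
print(f"MC  mean = {mc.mean():.4f}, variance = {mc.var():.4f}")
```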

  8. Analysis of actuator delay and its effect on uncertainty quantification for real-time hybrid simulation

    Science.gov (United States)

    Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai

    2017-10-01

    Uncertainties in structure properties can result in different responses in hybrid simulations. Quantification of the effect of these uncertainties would enable researchers to estimate the variances of structural responses observed from experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs on a basis of orthogonal stochastic polynomials to account for influences of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single degree of freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied for RTHS without delay to determine the order of PCE, the number of sample points as well as the method for coefficient calculation. The PCE is then applied to RTHS with actuator delay. The mean, variance and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially with respect to actuator delay, respectively. Sensitivity analysis through Sobol indices also indicates that the influence of the single random variable decreases while the coupling effect increases with the increase of actuator delay.

  9. Developing Uncertainty Models for Robust Flutter Analysis Using Ground Vibration Test Data

    Science.gov (United States)

    Potter, Starr; Lind, Rick; Kehoe, Michael W. (Technical Monitor)

    2001-01-01

    A ground vibration test can be used to obtain information about structural dynamics that is important for flutter analysis. Traditionally, this information, such as natural frequencies of modes, is used to update analytical models used to predict flutter speeds. The ground vibration test can also be used to obtain uncertainty models, such as natural frequencies and their associated variations, that can update analytical models for the purpose of predicting robust flutter speeds. Analyzing test data using the ∞-norm, rather than the traditional 2-norm, is shown to lead to a minimum-size uncertainty description and, consequently, a least-conservative robust flutter speed. This approach is demonstrated using ground vibration test data for the Aerostructures Test Wing. Different norms are used to formulate uncertainty models and their associated robust flutter speeds to evaluate which norm is least conservative.

  10. Intolerance of Uncertainty in Eating Disorders: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Brown, Melanie; Robinson, Lauren; Campione, Giovanna Cristina; Wuensch, Kelsey; Hildebrandt, Tom; Micali, Nadia

    2017-09-01

    Intolerance of uncertainty is an empirically supported transdiagnostic construct that may have relevance in understanding eating disorders. We conducted a meta-analysis and systematic review of intolerance of uncertainty in eating disorders using Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. We calculated random-effects standardised mean differences (SMD) for studies utilising the Intolerance of Uncertainty Scale (IUS) and summarised additional studies descriptively. Women with eating disorders have significantly higher IUS scores compared with healthy controls (SMD = 1.90; 95% C.I. 1.24 to 2.56), suggesting that intolerance of uncertainty is relevant to eating disorders and a potential treatment target across cognitive, behavioural, interoceptive and affective symptoms. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  11. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Energy Technology Data Exchange (ETDEWEB)

    Vinai, P

    2007-10-15

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire

  12. Uncertainty analysis for dynamic properties of MEMS resonator supported by fuzzy arithmetics

    Directory of Open Access Journals (Sweden)

    A Martowicz

    2016-04-01

    Full Text Available This paper presents an uncertainty analysis performed for a microelectromechanical resonator. The main objective of the analysis is to assess how the considered uncertainties propagate into the variation of chosen dynamic characteristics of a Finite Element model of the microresonator. Many different model parameters, covering geometry and material properties, have been assumed to be uncertain. Apart from total uncertainty propagation, a sensitivity analysis has been carried out to study the separate influences of all uncertain input characteristics. The uncertainty analysis has been performed by means of fuzzy arithmetic, in which an alpha-cut strategy is applied to assemble the output fuzzy number. Monte Carlo simulation and genetic algorithms have been employed to calculate the intervals connected with each alpha-cut of the sought fuzzy number. The elaborated model of the microresonator accounts, in a simplified way, for the presence of surrounding air and a constant electrostatic field.
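
    To make the alpha-cut strategy concrete, the sketch below propagates two triangular fuzzy parameters through a simple monotone response (a lumped spring-mass resonance frequency); at each alpha level the output interval follows from evaluating the corner combinations of the input intervals. The fuzzy numbers and the response formula are illustrative assumptions, not the Finite Element model of the paper.

```python
import numpy as np
from itertools import product

def tri_alpha_cut(low, peak, high, alpha):
    """Interval (alpha-cut) of a triangular fuzzy number at membership level alpha."""
    return (low + alpha * (peak - low), high - alpha * (high - peak))

def frequency(k, m):
    """Toy response: natural frequency of a lumped spring-mass resonator [Hz]."""
    return np.sqrt(k / m) / (2.0 * np.pi)

# Fuzzy stiffness [N/m] and mass [kg], given as triangular (low, most likely, high).
k_fuzzy = (0.9, 1.0, 1.15)
m_fuzzy = (1.8e-9, 2.0e-9, 2.1e-9)

for alpha in np.linspace(0.0, 1.0, 5):
    k_int = tri_alpha_cut(*k_fuzzy, alpha)
    m_int = tri_alpha_cut(*m_fuzzy, alpha)
    # Evaluate at all interval corners; sufficient here because the response is monotone.
    values = [frequency(k, m) for k, m in product(k_int, m_int)]
    print(f"alpha={alpha:.2f}: f in [{min(values):.1f}, {max(values):.1f}] Hz")
```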

  13. Estimates of Uncertainties in Analysis of Positron Lifetime Spectra for Metals

    DEFF Research Database (Denmark)

    Eldrup, Morten Mostgaard; Huang, Y. M.; McKee, B. T. A.

    1978-01-01

    The effects of uncertainties and errors in various constraints used in the analysis of multi-component lifetime spectra of positrons annihilating in metals containing defects have been investigated in detail using computer-simulated decay spectra and subsequent analysis. It is found that the errors in the fitted values of the main components' lifetimes and intensities introduced from incorrect values of the instrumental resolution function and of the source-surface components can easily exceed the statistical uncertainties. The effect of an incorrect resolution function may be reduced by excluding the peak regions of the spectra from the analysis. The influence of using incorrect source-surface components in the analysis may, on the other hand, be reduced by including the peak regions of the spectra. A main conclusion of the work is that extreme caution should be exercised to avoid…

  14. Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident

    Energy Technology Data Exchange (ETDEWEB)

    Pasichnyk, I.; Perin, Y.; Velkov, K. [Gesellschaft für Anlagen- und Reaktorsicherheit - GRS mbH, Boltzmannstrasse 14, 85748 Garching bei Muenchen (Germany)

    2013-07-01

    The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)

  15. Analysis of uncertainties in the estimates of nitrous oxide and methane emissions in the UK's greenhouse gas inventory for agriculture

    Science.gov (United States)

    Milne, Alice E.; Glendining, Margaret J.; Bellamy, Pat; Misselbrook, Tom; Gilhespy, Sarah; Rivas Casado, Monica; Hulin, Adele; van Oijen, Marcel; Whitmore, Andrew P.

    2014-01-01

    The UK's greenhouse gas inventory for agriculture uses a model based on the IPCC Tier 1 and Tier 2 methods to estimate the emissions of methane and nitrous oxide from agriculture. The inventory calculations are disaggregated at country level (England, Wales, Scotland and Northern Ireland). Before now, no detailed assessment of the uncertainties in the estimates of emissions had been done. We used Monte Carlo simulation to do such an analysis. We collated information on the uncertainties of each of the model inputs. The uncertainties propagate through the model and result in uncertainties in the estimated emissions. Using a sensitivity analysis, we found that in England and Scotland the uncertainty in the emission factor for emissions from N inputs (EF1) affected uncertainty the most, but that in Wales and Northern Ireland, the emission factor for N leaching and runoff (EF5) had greater influence. We showed that if the uncertainty in any one of these emission factors is reduced by 50%, the uncertainty in emissions of nitrous oxide reduces by 10%. The uncertainty in the estimate for the emissions of methane emission factors for enteric fermentation in cows and sheep most affected the uncertainty in methane emissions. When inventories are disaggregated (as that for the UK is) correlation between separate instances of each emission factor will affect the uncertainty in emissions. As more countries move towards inventory models with disaggregation, it is important that the IPCC give firm guidance on this topic.
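
    A minimal sketch of this kind of Monte Carlo propagation is shown below for an IPCC-style direct N2O term (emissions = N applied x EF1) with a lognormal emission factor; it also halves the EF1 spread to mimic the sensitivity question raised above. All numerical values are invented placeholders, not the UK inventory inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 100_000

# Illustrative inputs: nitrogen applied [kt N/yr] and direct emission factor EF1.
n_applied = rng.normal(loc=1000.0, scale=50.0, size=n_draws)       # kt N per year
ef1 = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n_draws)    # kg N2O-N per kg N

n2o = n_applied * ef1 * 44.0 / 28.0          # kt N2O per year (convert N2O-N to N2O)

lo, med, hi = np.percentile(n2o, [2.5, 50.0, 97.5])
print(f"N2O emissions: median {med:.1f} kt, 95% interval [{lo:.1f}, {hi:.1f}] kt")

# Crude sensitivity check: halve the EF1 uncertainty and compare relative interval widths.
ef1_half = rng.lognormal(mean=np.log(0.01), sigma=0.25, size=n_draws)
n2o_half = n_applied * ef1_half * 44.0 / 28.0
print("relative 95% width:",
      (hi - lo) / med,
      (np.percentile(n2o_half, 97.5) - np.percentile(n2o_half, 2.5)) / np.median(n2o_half))
```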

  16. Uncertainty in the deployment of Carbon Capture and Storage (CCS): A sensitivity analysis to techno-economic parameter uncertainty

    NARCIS (Netherlands)

    Koelbl, Barbara; van den Broek, Machteld; van Ruijven, Bastiaan Johannes; Faaij, André; van Vuuren, Detlef

    2014-01-01

    Projections of the deployment of Carbon Capture and Storage (CCS) technologies vary considerably. Cumulative emission reductions by CCS until 2100 vary in the majority of projections of the IPCC-TAR scenarios from 220 to 2200 GtCO2. This variation is a result of uncertainty in key determinants of

  17. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  18. Application of best estimate plus uncertainty in review of research reactor safety analysis

    Directory of Open Access Journals (Sweden)

    Adu Simon

    2015-01-01

    Full Text Available To construct and operate a nuclear research reactor, the licensee is required to obtain the authorization from the regulatory body. One of the tasks of the regulatory authority is to verify that the safety analysis fulfils safety requirements. Historically, the compliance with safety requirements was assessed using a deterministic approach and conservative assumptions. This provides sufficient safety margins with respect to the licensing limits on boundary and operational conditions. Conservative assumptions were introduced into safety analysis to account for the uncertainty associated with lack of knowledge. With the introduction of best estimate computational tools, safety analyses are usually carried out using the best estimate approach. Results of such analyses can be accepted by the regulatory authority only if appropriate uncertainty evaluation is carried out. Best estimate computer codes are capable of providing more realistic information on the status of the plant, allowing the prediction of real safety margins. The best estimate plus uncertainty approach has proven to be reliable and capable of supplying realistic results if all conditions are carefully followed. This paper, therefore, presents this concept and its possible application to research reactor safety analysis. The aim of the paper is to investigate the unprotected loss-of-flow transients "core blockage" of a miniature neutron source research reactor by applying best estimate plus uncertainty methodology. The results of our calculations show that the temperatures in the core are within the safety limits and do not pose any significant threat to the reactor, as far as the melting of the cladding is concerned. The work also discusses the methodology of the best estimate plus uncertainty approach when applied to the safety analysis of research reactors for licensing purposes.
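
    Best estimate plus uncertainty studies commonly size the number of code runs with Wilks' non-parametric tolerance-limit formula; the record does not state which variant was applied here, so the sketch below is only a generic illustration of the 95%/95% criterion for first- and second-order statistics.

```python
from math import comb

def wilks_coverage(n: int, order: int, gamma: float) -> float:
    """Confidence that the 'order'-th largest of n runs bounds the gamma quantile.

    For order=1 this reduces to 1 - gamma**n; in general it is the probability
    that at least 'order' of the n samples exceed the gamma quantile.
    """
    return sum(comb(n, k) * (1.0 - gamma) ** k * gamma ** (n - k)
               for k in range(order, n + 1))

def min_runs(order: int, gamma: float = 0.95, beta: float = 0.95) -> int:
    n = order
    while wilks_coverage(n, order, gamma) < beta:
        n += 1
    return n

print(min_runs(order=1))   # 59 runs: the sample maximum bounds the 95th percentile at 95% confidence
print(min_runs(order=2))   # 93 runs: the second-largest value is used as the bound
```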

  19. Cross-section sensitivity and uncertainty analysis of the FNG copper benchmark experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodeli, I., E-mail: ivan.kodeli@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Kondo, K. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany); Japan Atomic Energy Agency, Rokkasho-mura (Japan); Perel, R.L. [Racah Institute of Physics, Hebrew University of Jerusalem, IL-91904 Jerusalem (Israel); Fischer, U. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany)

    2016-11-01

    A neutronics benchmark experiment on copper assembly was performed end 2014–beginning 2015 at the 14-MeV Frascati neutron generator (FNG) of ENEA Frascati with the objective to provide the experimental database required for the validation of the copper nuclear data relevant for ITER design calculations, including the related uncertainties. The paper presents the pre- and post-analysis of the experiment performed using cross-section sensitivity and uncertainty codes, both deterministic (SUSD3D) and Monte Carlo (MCSEN5). Cumulative reaction rates and neutron flux spectra, their sensitivity to the cross sections, as well as the corresponding uncertainties were estimated for different selected detector positions up to ∼58 cm in the copper assembly. This permitted in the pre-analysis phase to optimize the geometry, the detector positions and the choice of activation reactions, and in the post-analysis phase to interpret the results of the measurements and the calculations, to conclude on the quality of the relevant nuclear cross-section data, and to estimate the uncertainties in the calculated nuclear responses and fluxes. Large uncertainties in the calculated reaction rates and neutron spectra of up to 50%, rarely observed at this level in the benchmark analysis using today's nuclear data, were predicted, particularly high for fast reactions. Observed C/E (dis)agreements with values as low as 0.5 partly confirm these predictions. Benchmark results are therefore expected to contribute to the improvement of both cross section as well as covariance data evaluations.
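
    The calculated response uncertainties quoted above come from folding sensitivity profiles with cross-section covariance data via the standard "sandwich rule", relative variance = S^T M S. The three-group sensitivity vector and covariance matrix in the sketch below are invented numbers used only to show the mechanics, not the FNG copper data.

```python
import numpy as np

# Relative sensitivity profile S_g = (dR/R) / (dsigma_g/sigma_g) for three energy groups.
S = np.array([0.8, 0.3, -0.1])

# Relative covariance matrix of the cross section in the same group structure
# (diagonal = relative variances, off-diagonal = correlated contributions).
M = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.16]])

rel_var = S @ M @ S          # sandwich rule: relative variance of the response
print(f"relative uncertainty of the response: {np.sqrt(rel_var):.1%}")
```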

  20. Data uncertainties in material flow analysis: Municipal solid waste management system in Maputo City, Mozambique.

    Science.gov (United States)

    Dos Muchangos, Leticia Sarmento; Tokai, Akihiro; Hanashima, Atsuko

    2017-01-01

    Material flow analysis can effectively trace and quantify the flows and stocks of materials such as solid wastes in urban environments. However, the integrity of material flow analysis results is compromised by data uncertainties, an occurrence that is particularly acute in low-and-middle-income study contexts. This article investigates the uncertainties in the input data and their effects in a material flow analysis study of municipal solid waste management in Maputo City, the capital of Mozambique. The analysis is based on data collected in 2007 and 2014. Initially, the uncertainties and their ranges were identified by the data classification model of Hedbrant and Sörme, followed by the application of sensitivity analysis. The average lower and upper bounds were 29% and 71%, respectively, in 2007, increasing to 41% and 96%, respectively, in 2014. This indicates higher data quality in 2007 than in 2014. Results also show that not only data are partially missing from the established flows such as waste generation to final disposal, but also that they are limited and inconsistent in emerging flows and processes such as waste generation to material recovery (hence the wider variation in the 2014 parameters). The sensitivity analysis further clarified the most influencing parameter and the degree of influence of each parameter on the waste flows and the interrelations among the parameters. The findings highlight the need for an integrated municipal solid waste management approach to avoid transferring or worsening the negative impacts among the parameters and flows.

  1. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    Science.gov (United States)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
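
    A heavily simplified sketch of turning a single Pc into a distribution is given below: the encounter-plane Pc is estimated by Monte Carlo for each draw of an uncertain covariance scale factor, yielding an empirical Pc density. The miss vector, covariance, hard-body radius, and lognormal scale-factor model are invented placeholders, not CARA's operational algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([150.0, 80.0])                 # nominal miss vector in the encounter plane [m]
cov_nominal = np.array([[200.0**2, 0.3 * 200.0 * 120.0],
                        [0.3 * 200.0 * 120.0, 120.0**2]])
hbr = 20.0                                   # combined hard-body radius [m]

def pc_mc(cov, n=200_000):
    """P(|X| < HBR) for X ~ N(mu, cov), estimated by plain Monte Carlo."""
    x = rng.multivariate_normal(mu, cov, size=n)
    return np.mean(np.hypot(x[:, 0], x[:, 1]) < hbr)

# Treat the realism of the covariance as uncertain via a lognormal scale factor.
scales = rng.lognormal(mean=0.0, sigma=0.4, size=200)
pc_samples = np.array([pc_mc(s * cov_nominal, n=50_000) for s in scales])

print("nominal Pc:", pc_mc(cov_nominal))
print("Pc percentiles (5/50/95):", np.percentile(pc_samples, [5, 50, 95]))
```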

  2. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    Science.gov (United States)

    Newman, L.; Hejduk, M.; Johnson, L.

    2016-09-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.

  3. Artificial intelligence metamodel comparison and application to wind turbine airfoil uncertainty analysis

    Directory of Open Access Journals (Sweden)

    Yaping Ju

    2016-05-01

    Full Text Available The Monte Carlo simulation method for turbomachinery uncertainty analysis often requires performing a huge number of simulations, the computational cost of which can be greatly alleviated with the help of metamodeling techniques. An intensive comparative study was performed on the approximation performance of three prospective artificial intelligence metamodels, that is, artificial neural network, radial basis function, and support vector regression. The genetic algorithm was used to optimize the predetermined parameters of each metamodel for the sake of a fair comparison. Through testing on 10 nonlinear functions with different problem scales and sample sizes, the genetic algorithm–support vector regression metamodel was found more accurate and robust than the other two counterparts. Accordingly, the genetic algorithm–support vector regression metamodel was selected and combined with the Monte Carlo simulation method for the uncertainty analysis of a wind turbine airfoil under two types of surface roughness uncertainties. The results show that the genetic algorithm–support vector regression metamodel can capture well the uncertainty propagation from the surface roughness to the airfoil aerodynamic performance. This work is useful to the application of metamodeling techniques in the robust design optimization of turbomachinery.
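
    The metamodel-plus-Monte-Carlo workflow described above can be sketched as below, with the genetic-algorithm tuning replaced by a plain cross-validated grid search for brevity. The toy performance function and the roughness distribution are placeholders for the airfoil solver, so the sketch only shows the mechanics, not the paper's results.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(3)

def expensive_model(roughness):
    """Stand-in for the CFD evaluation of aerodynamic performance vs. surface roughness."""
    return 1.2 - 0.8 * roughness + 0.3 * np.sin(6.0 * roughness)

# A small design of experiments on the "expensive" model.
x_train = np.linspace(0.0, 1.0, 30).reshape(-1, 1)
y_train = expensive_model(x_train.ravel())

# Tune SVR hyperparameters by cross-validated grid search (the paper uses a GA instead).
search = GridSearchCV(SVR(kernel="rbf"),
                      {"C": [1.0, 10.0, 100.0], "epsilon": [0.001, 0.01], "gamma": [1.0, 10.0]},
                      cv=5)
search.fit(x_train, y_train)
surrogate = search.best_estimator_

# Monte Carlo uncertainty propagation through the cheap surrogate.
roughness_samples = rng.beta(2.0, 5.0, size=100_000).reshape(-1, 1)
performance = surrogate.predict(roughness_samples)
print("mean:", performance.mean(), "std:", performance.std())
```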

  4. Uncertainty of the sample size reduction step in pesticide residue analysis of large-sized crops.

    Science.gov (United States)

    Omeroglu, P Yolci; Ambrus, Á; Boyacioglu, D; Majzik, E Solymosne

    2013-01-01

    To estimate the uncertainty of the sample size reduction step, each unit in laboratory samples of papaya and cucumber was cut into four segments in longitudinal directions and two opposite segments were selected for further homogenisation while the other two were discarded. Jackfruit was cut into six segments in longitudinal directions, and all segments were kept for further analysis. To determine the pesticide residue concentrations in each segment, they were individually homogenised and analysed by chromatographic methods. One segment from each unit of the laboratory sample was drawn randomly to obtain 50 theoretical sub-samples with an MS Office Excel macro. The residue concentrations in a sub-sample were calculated from the weight of segments and the corresponding residue concentration. The coefficient of variation calculated from the residue concentrations of 50 sub-samples gave the relative uncertainty resulting from the sample size reduction step. The sample size reduction step, which is performed by selecting one longitudinal segment from each unit of the laboratory sample, resulted in relative uncertainties of 17% and 21% for field-treated jackfruits and cucumber, respectively, and 7% for post-harvest treated papaya. The results demonstrated that sample size reduction is an inevitable source of uncertainty in pesticide residue analysis of large-sized crops. The post-harvest treatment resulted in a lower variability because the dipping process leads to a more uniform residue concentration on the surface of the crops than does the foliar application of pesticides.

  5. Multivariate Copula Analysis Toolbox (MvCAT): Describing dependence and underlying uncertainty using a Bayesian framework

    Science.gov (United States)

    Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir

    2017-06-01

    We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves describing the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.
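
    A far lighter alternative to MvCAT's Bayesian machinery, shown here only to make the idea of a fitted copula concrete, is moment-style estimation of a Gaussian-copula parameter by inverting Kendall's tau (rho = sin(pi*tau/2)). The synthetic data below are placeholders for, e.g., paired hydrologic variables.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic dependent data standing in for, e.g., paired flood peak and volume.
z = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=2_000)
x, y = np.exp(z[:, 0]), np.exp(0.5 * z[:, 1])     # arbitrary marginals

# Kendall's tau is invariant to the marginals, so it can be computed on the raw data.
tau, _ = stats.kendalltau(x, y)

# For a Gaussian (elliptical) copula: rho = sin(pi * tau / 2).
rho_hat = np.sin(np.pi * tau / 2.0)
print(f"Kendall tau = {tau:.3f}, implied Gaussian-copula rho = {rho_hat:.3f}")

# Simulate from the fitted copula to obtain dependent uniform ranks.
z_sim = rng.multivariate_normal([0, 0], [[1.0, rho_hat], [rho_hat, 1.0]], size=2_000)
u_sim = stats.norm.cdf(z_sim)                     # copula samples on [0, 1]^2
print("simulated rank correlation:", stats.kendalltau(u_sim[:, 0], u_sim[:, 1])[0])
```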

  6. Uncertainty analysis of multi-rate kinetics of uranium desorption from sediments

    Science.gov (United States)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.; Zhang, Guannan

    2014-01-01

    Multi-rate surface complexation models have been proposed to describe the kinetics of uranyl (U(VI) surface complexation reactions (SCR) rate-limited by diffusive mass transfer to and from intragranular sorption sites in subsurface sediments. In this study, a Bayesian-based, Differential Evolution Markov Chain method was used to assess the uncertainty and to identify factors controlling the uncertainties of the multi-rate SCR model. The rate constants in the multi-rate SCR were estimated with and without assumption of a specified lognormal distribution to test the lognormal assumption typically used to minimize the number of the rate constants in the multi-rate model. U(VI) desorption under variable chemical conditions from a contaminated sediment at US Hanford 300 Area, Washington was used as an example. The results indicated that the estimated rate constants without a specified lognormal assumption approximately followed a lognormal distribution, indicating that the lognormal is an effective assumption for the rate constants in the multi-rate SCR model. However, those rate constants with their corresponding half-lives longer than the experimental durations for model characterization had larger uncertainties and could not be reliably estimated. The uncertainty analysis revealed that the time-scale of the experiments for calibrating the multi-rate SCR model, the assumption for the rate constant distribution, the geochemical conditions involved in predicting U(VI) desorption, and equilibrium U(VI) speciation reaction constants were the major factors contributing to the extrapolation uncertainties of the multi-rate SCR model. Overall, the results from this study demonstrated that the multi-rate SCR model with a lognormal distribution of its rate constants is an effective approach for describing rate-limited U(VI) desorption; however, the model contains uncertainties, especially for those smaller rate constants, that require careful consideration for predicting U

  7. Biophysical and Economic Uncertainty in the Analysis of Poverty Impacts of Climate Change

    Science.gov (United States)

    Hertel, T. W.; Lobell, D. B.; Verma, M.

    2011-12-01

    This paper seeks to understand the main sources of uncertainty in assessing the impacts of climate change on agricultural output, international trade, and poverty. We incorporate biophysical uncertainty by sampling from a distribution of global climate model predictions for temperature and precipitation for 2050. The implications of these realizations for crop yields around the globe are estimated using the recently published statistical crop yield functions provided by Lobell, Schlenker and Costa-Roberts (2011). By comparing these yields to those predicted under current climate, we obtain the likely change in crop yields owing to climate change. The economic uncertainty in our analysis relates to the response of the global economic system to these biophysical shocks. We use a modified version of the GTAP model to elicit the impact of the biophysical shocks on global patterns of production, consumption, trade and poverty. Uncertainty in these responses is reflected in the econometrically estimated parameters governing the responsiveness of international trade, consumption, production (and hence the intensive margin of supply response), and factor supplies (which govern the extensive margin of supply response). We sample from the distributions of these parameters as specified by Hertel et al. (2007) and Keeney and Hertel (2009). We find that, even though it is difficult to predict where in the world agricultural crops will be favorably affected by climate change, the responses of economic variables, including output and exports can be far more robust (Table 1). This is due to the fact that supply and demand decisions depend on relative prices, and relative prices depend on productivity changes relative to other crops in a given region, or relative to similar crops in other parts of the world. We also find that uncertainty in poverty impacts of climate change appears to be almost entirely driven by biophysical uncertainty.

  8. Sensitivity and Uncertainty Analysis of IAEA CRP HTGR Benchmark Using McCARD

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Sang Hoon; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of)

    2016-05-15

    The benchmark consists of 4 phases, starting from local standalone modeling (Phase I) and ending with the safety calculation of the coupled system under transient conditions (Phase IV). As a preliminary study of UAM on HTGR, this paper covers exercises 1 and 2 of Phase I, which define the unit cell and lattice geometry of the MHTGR-350 (General Atomics). The objective of these exercises is to quantify the uncertainty of the multiplication factor induced by perturbing nuclear data as well as to analyze the specific features of HTGR such as double heterogeneity and self-shielding treatment. The uncertainty quantification of the IAEA CRP HTGR UAM benchmarks was conducted using the first-order AWP method in McCARD. Uncertainty of the multiplication factor was estimated only for microscopic cross-section perturbations. To reduce the computation time and avoid memory shortage, the recently implemented uncertainty analysis module in the MC Wielandt calculation was adjusted. The covariance data of the cross sections were generated by the NJOY/ERRORR module with ENDF/B-VII.1. The numerical results were compared with the evaluation results of the DeCART/MUSAD code system developed by KAERI. IAEA CRP HTGR UAM benchmark problems were analyzed using McCARD. The numerical results were compared with Serpent for the eigenvalue calculation and DeCART/MUSAD for the S/U analysis. In the eigenvalue calculation, inconsistencies were found in the results with the ENDF/B-VII.1 cross-section library, which were traced to the thermal scattering data of graphite. As to the S/U analysis, McCARD results matched well with DeCART/MUSAD, but showed some discrepancy in 238U capture regarding implicit uncertainty.

  9. Demonstration of Uncertainty Quantification and Sensitivity Analysis for PWR Fuel Performance with BISON

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Hongbin; Ladd, Jacob; Zhao, Haihua; Zou, Ling; Burns, Douglas

    2015-11-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis.
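
    The correlation-based sensitivity measures named above are straightforward to compute on sampled input/output pairs; the sketch below evaluates Pearson, Spearman, and a regression-residual partial correlation coefficient for a toy two-parameter model that merely stands in for the BISON fuel rod simulation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n = 5_000

# Two uncertain inputs standing in for, e.g., fuel thermal conductivity and gap size.
x1 = rng.normal(1.0, 0.1, n)
x2 = rng.normal(1.0, 0.1, n)
y = 800.0 + 300.0 / x1 + 150.0 * x2 + rng.normal(0.0, 5.0, n)   # toy "max fuel temperature"

X = np.column_stack([x1, x2])
for i, name in enumerate(["x1", "x2"]):
    pearson = stats.pearsonr(X[:, i], y)[0]
    spearman = stats.spearmanr(X[:, i], y)[0]

    # Partial correlation: correlate the residuals of y and x_i after removing
    # the linear effect of the remaining inputs.
    others = np.delete(X, i, axis=1)
    A = np.column_stack([np.ones(n), others])
    res_y = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    res_x = X[:, i] - A @ np.linalg.lstsq(A, X[:, i], rcond=None)[0]
    partial = stats.pearsonr(res_x, res_y)[0]

    print(f"{name}: Pearson {pearson:+.2f}, Spearman {spearman:+.2f}, partial {partial:+.2f}")
```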

  10. Scenario analysis to account for photovoltaic generation uncertainty in distribution grid reconfiguration

    DEFF Research Database (Denmark)

    Chittur Ramaswamy, Parvathy; Deconinck, Geert; Pillai, Jayakrishnan Radhakrishna

    2013-01-01

    This paper considers hourly reconfiguration of a low voltage distribution network with the objectives of minimizing power loss and voltage deviation. The uncertainty in photovoltaic (PV) generation, which in turn will affect the optimum configuration, is tackled with the help of scenario analysis. In the proposed strategy of the scenario analysis, called the Internal method, the final non-dominated solutions (configurations) will have a relatively acceptable performance in all scenarios. The use of scenario analysis helps in reducing the number of switches that need to be installed in order to cater for the uncertainty in PV generation. This is possible because the PV generation values for a given hour during different days are taken into account by means of defining various scenarios when finding the optimum configuration. The non-dominated sorting genetic algorithm (NSGA-II) used in this paper generates non…

  11. Sensitivity Analysis and Insights into Hydrological Processes and Uncertainty at Different Scales

    Science.gov (United States)

    Haghnegahdar, A.; Razavi, S.; Wheater, H. S.; Gupta, H. V.

    2015-12-01

    Sensitivity analysis (SA) is an essential tool for providing insight into model behavior, and conducting model calibration and uncertainty assessment. Numerous techniques have been used in environmental modelling studies for sensitivity analysis. However, it is often overlooked that the scale of modelling study, and the metric choice can significantly change the assessment of model sensitivity and uncertainty. In order to identify important hydrological processes across various scales, we conducted a multi-criteria sensitivity analysis using a novel and efficient technique, Variogram Analysis of Response Surfaces (VARS). The analysis was conducted using three different hydrological models, HydroGeoSphere (HGS), Soil and Water Assessment Tool (SWAT), and Modélisation Environmentale-Surface et Hydrologie (MESH). Models were applied at various scales ranging from small (hillslope) to large (watershed) scales. In each case, the sensitivity of simulated streamflow to model processes (represented through parameters) were measured using different metrics selected based on various hydrograph characteristics such as high flows, low flows, and volume. We demonstrate how the scale of the case study and the choice of sensitivity metric(s) can change our assessment of sensitivity and uncertainty. We present some guidelines to better align the metric choice with the objective and scale of a modelling study.

  12. Through the Camera's Eye: A Phenomenological Analysis of Teacher Subjectivity

    Science.gov (United States)

    Greenwalt, Kyle A.

    2008-01-01

    The purpose of this study is to understand how preservice teachers experience a common university assignment: the videotaping and analysis of their own instruction. Using empirical data and the thought of the French philosophers Michel Foucault and Emmanuel Levinas, the study examines the difficulties in transitioning from student subjectivity to…

  13. Neutron cross section sensitivity and uncertainty analysis of candidate accident tolerant fuel concepts

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nicholas [Pennsylvania State University, University Park; Burns, Joseph R. [ORNL

    2017-12-01

    The aftermath of the Tōhoku earthquake and the Fukushima accident has led to a global push to improve the safety of existing light water reactors. A key component of this initiative is the development of nuclear fuel and cladding materials with potentially enhanced accident tolerance, also known as accident-tolerant fuels (ATF). These materials are intended to improve core fuel and cladding integrity under beyond design basis accident conditions while maintaining or enhancing reactor performance and safety characteristics during normal operation. To complement research that has already been carried out to characterize ATF neutronics, the present study provides an initial investigation of the sensitivity and uncertainty of ATF systems responses to nuclear cross section data. ATF concepts incorporate novel materials, including SiC and FeCrAl cladding and high density uranium silicide composite fuels, in turn introducing new cross section sensitivities and uncertainties which may behave differently from traditional fuel and cladding materials. In this paper, we conducted sensitivity and uncertainty analysis using the TSUNAMI-2D sequence of SCALE with infinite lattice models of ATF assemblies. Of all the ATF materials considered, it is found that radiative capture in 56Fe in FeCrAl cladding is the most significant contributor to eigenvalue uncertainty. 56Fe yields significant potential eigenvalue uncertainty associated with its radiative capture cross section; this is by far the largest ATF-specific uncertainty found in these cases, exceeding even those of uranium. We found that while significant new sensitivities indeed arise, the general sensitivity behavior of ATF assemblies does not markedly differ from traditional UO2/zirconium-based fuel/cladding systems, especially with regard to uncertainties associated with uranium. We assessed the similarity of the IPEN/MB-01 reactor benchmark model to application models with FeCrAl cladding. We used TSUNAMI-IP to calculate

  14. Bio-physical vs. Economic Uncertainty in the Analysis of Climate Change Impacts on World Agriculture

    Science.gov (United States)

    Hertel, T. W.; Lobell, D. B.

    2010-12-01

    Accumulating evidence suggests that agricultural production could be greatly affected by climate change, but there remains little quantitative understanding of how these agricultural impacts would affect economic livelihoods in poor countries. The recent paper by Hertel, Burke and Lobell (GEC, 2010) considers three scenarios of agricultural impacts of climate change, corresponding to the fifth, fiftieth, and ninety fifth percentiles of projected yield distributions for the world’s crops in 2030. They evaluate the resulting changes in global commodity prices, national economic welfare, and the incidence of poverty in a set of 15 developing countries. Although the small price changes under the medium scenario are consistent with previous findings, their low productivity scenario reveals the potential for much larger food price changes than reported in recent studies which have hitherto focused on the most likely outcomes. The poverty impacts of price changes under the extremely adverse scenario are quite heterogeneous and very significant in some population strata. They conclude that it is critical to look beyond central case climate shocks and beyond a simple focus on yields and highly aggregated poverty impacts. In this paper, we conduct a more formal, systematic sensitivity analysis (SSA) with respect to uncertainty in the biophysical impacts of climate change on agriculture, by explicitly specifying joint distributions for global yield changes - this time focusing on 2050. This permits us to place confidence intervals on the resulting price impacts and poverty results which reflect the uncertainty inherited from the biophysical side of the analysis. We contrast this with the economic uncertainty inherited from the global general equilibrium model (GTAP), by undertaking SSA with respect to the behavioral parameters in that model. This permits us to assess which type of uncertainty is more important for regional price and poverty outcomes. Finally, we undertake a

  15. Uncertainty in the Bayesian meta-analysis of normally distributed surrogate endpoints

    Science.gov (United States)

    Thompson, John R; Spata, Enti; Abrams, Keith R

    2015-01-01

    We investigate the effect of the choice of parameterisation of meta-analytic models and related uncertainty on the validation of surrogate endpoints. Different meta-analytical approaches take into account different levels of uncertainty which may impact on the accuracy of the predictions of treatment effect on the target outcome from the treatment effect on a surrogate endpoint obtained from these models. A range of Bayesian as well as frequentist meta-analytical methods are implemented using illustrative examples in relapsing–remitting multiple sclerosis, where the treatment effect on disability worsening is the primary outcome of interest in healthcare evaluation, while the effect on relapse rate is considered as a potential surrogate to the effect on disability progression, and in gastric cancer, where the disease-free survival has been shown to be a good surrogate endpoint to the overall survival. Sensitivity analysis was carried out to assess the impact of distributional assumptions on the predictions. Also, sensitivity to modelling assumptions and performance of the models were investigated by simulation. Although different methods can predict mean true outcome almost equally well, inclusion of uncertainty around all relevant parameters of the model may lead to less certain and hence more conservative predictions. When investigating endpoints as candidate surrogate outcomes, a careful choice of the meta-analytical approach has to be made. Models underestimating the uncertainty of available evidence may lead to overoptimistic predictions which can then have an effect on decisions made based on such predictions. PMID:26271918

  16. Neutron activation analysis of the 30Si content of highly enriched 28Si: proof of concept and estimation of the achievable uncertainty

    Science.gov (United States)

    D'Agostino, G.; Mana, G.; Oddone, M.; Prata, M.; Bergamaschi, L.; Giordani, L.

    2014-06-01

    We investigated the use of neutron activation to estimate the 30Si mole fraction of the ultra-pure silicon material highly enriched in 28Si for the measurement of the Avogadro constant. Specifically, we developed a relative method based on instrumental neutron activation analysis and using a natural-Si sample as a standard. To evaluate the achievable uncertainty, we irradiated a 6 g sample of a natural-Si material and modelled experimentally the signal that would be produced by a sample of the 28Si-enriched material of similar mass and subjected to the same measurement conditions. The extrapolation of the expected uncertainty from the experimental data indicates that a measurement of the 30Si mole fraction of the 28Si-enriched material might reach a 4% relative combined standard uncertainty.

  17. Uncertainties in shoreline position analysis: the role of run-up and tide in a gentle slope beach

    Science.gov (United States)

    Manno, Giorgio; Lo Re, Carlo; Ciraolo, Giuseppe

    2017-09-01

    In recent decades in the Mediterranean Sea, high anthropic pressure from increasing economic and touristic development has affected several coastal areas. Today the erosion phenomena threaten human activities and existing structures, and interdisciplinary studies are needed to better understand actual coastal dynamics. Beach evolution analysis can be conducted using GIS methodologies, such as the well-known Digital Shoreline Analysis System (DSAS), in which error assessment based on shoreline positioning plays a significant role. In this study, a new approach is proposed to estimate the positioning errors due to tide and wave run-up influence. To improve the assessment of the wave run-up uncertainty, a spectral numerical model was used to propagate waves from deep to intermediate water and a Boussinesq-type model for intermediate water up to the swash zone. Tide effects on the uncertainty of shoreline position were evaluated using data collected by a nearby tide gauge. The proposed methodology was applied to an unprotected, dissipative Sicilian beach far from harbors and subjected to intense human activities over the last 20 years. The results show wave run-up and tide errors ranging from 0.12 to 4.5 m and from 1.20 to 1.39 m, respectively.

  18. Robust Stability Analysis of PMSM with Parametric Uncertainty using Kharitonov Theorem

    Directory of Open Access Journals (Sweden)

    Anil Kumar Yadav

    2016-06-01

    Full Text Available The permanent magnet synchronous motors (PMSM) are used as servo motors for precise motion control and as generators, driven by wind energy, to produce electrical energy. There is large variation in inertia due to varying load and parametric uncertainty in PMSM. The design objective of this paper is to analytically determine the relative robust stability of PMSM with parametric uncertainty using the Kharitonov theorem and the Routh stability criterion. The conventional integral controller (IC) and two robust internal model controllers (IMCs) are used for relative robust stability analysis of speed control of PMSM. The frequency domain performance specifications, like gain margin (GM) and phase margin (PM), are taken for relative robust stability analysis, and the effect of controllers on time domain performance specifications such as settling time (ST), rise time (RT) and overshoot (OS) is also studied.
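
    To illustrate the Kharitonov test mentioned above, the sketch below forms the four Kharitonov polynomials of an interval polynomial (coefficients given as lower/upper bounds) and checks that each is Hurwitz from its roots. The coefficient bounds are made-up numbers, not the PMSM closed-loop characteristic polynomial.

```python
import numpy as np

# Interval polynomial p(s) = a0 + a1 s + a2 s^2 + a3 s^3, coefficients as (lower, upper).
bounds = [(2.0, 3.0),    # a0
          (4.0, 5.0),    # a1
          (3.0, 4.0),    # a2
          (1.0, 1.5)]    # a3

lo = [b[0] for b in bounds]
hi = [b[1] for b in bounds]

# Coefficient selection patterns for the four Kharitonov polynomials, cycling
# (low, low, high, high), etc., starting from the s^0 coefficient.
patterns = {
    "K1": (lo, lo, hi, hi),
    "K2": (hi, hi, lo, lo),
    "K3": (lo, hi, hi, lo),
    "K4": (hi, lo, lo, hi),
}

def is_hurwitz(coeffs_ascending):
    roots = np.roots(coeffs_ascending[::-1])     # np.roots expects descending powers
    return np.all(roots.real < 0.0)

# The interval family is robustly stable iff all four Kharitonov polynomials are Hurwitz.
for name, pat in patterns.items():
    coeffs = [pat[i % 4][i] for i in range(len(bounds))]
    print(name, coeffs, "Hurwitz" if is_hurwitz(coeffs) else "unstable")
```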

  19. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  20. Micropollutants throughout an integrated urban drainage model: Sensitivity and uncertainty analysis

    Science.gov (United States)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    2017-11-01

    The paper presents the sensitivity and uncertainty analysis of an integrated urban drainage model which includes micropollutants. Specifically, a bespoke integrated model developed in previous studies has been modified in order to include the micropollutant assessment (namely, sulfamethoxazole - SMX). The model also takes into account the interactions between the three components of the system: sewer system (SS), wastewater treatment plant (WWTP) and receiving water body (RWB). The analysis has been applied to an experimental catchment nearby Palermo (Italy): the Nocella catchment. Overall, five scenarios, each characterized by different uncertainty combinations of sub-systems (i.e., SS, WWTP and RWB), have been considered applying, for the sensitivity analysis, the Extended-FAST method in order to select the key factors affecting the RWB quality and to design a reliable/useful experimental campaign. Results have demonstrated that sensitivity analysis is a powerful tool for increasing operator confidence in the modelling results. The approach adopted here can be used for blocking some non-identifiable factors, thus wisely modifying the structure of the model and reducing the related uncertainty. The model factors related to the SS have been found to be the most relevant factors affecting the SMX modeling in the RWB when all model factors (scenario 1) or model factors of SS (scenarios 2 and 3) are varied. If only the factors related to the WWTP are changed (scenarios 4 and 5), the SMX concentration in the RWB is mainly influenced by the aerobic sorption coefficient (up to 95% of the total variance for SSMX,max). A progressive uncertainty reduction from the upstream to downstream was found for the soluble fraction of SMX in the RWB.

  1. Uncertainty analysis on reactivity and discharged inventory for a pressurized water reactor fuel assembly due to {sup 235,238}U nuclear data uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Da Cruz, D. F.; Rochman, D.; Koning, A. J. [Nuclear Research and Consultancy Group NRG, Westerduinweg 3, 1755 ZG Petten (Netherlands)

    2012-07-01

    This paper discusses the uncertainty analysis on reactivity and inventory for a typical PWR fuel element as a result of uncertainties in {sup 235,238}U nuclear data. A typical Westinghouse 3-loop fuel assembly fuelled with UO{sub 2} fuel with 4.8% enrichment has been selected. The Total Monte-Carlo method has been applied using the deterministic transport code DRAGON. This code allows the generation of the few-groups nuclear data libraries by directly using data contained in the nuclear data evaluation files. The nuclear data used in this study is from the JEFF3.1 evaluation, and the nuclear data files for {sup 238}U and {sup 235}U (randomized for the generation of the various DRAGON libraries) are taken from the nuclear data library TENDL. The total uncertainty (obtained by randomizing all {sup 238}U and {sup 235}U nuclear data in the ENDF files) on the reactor parameters has been split into different components (different nuclear reaction channels). Results show that the TMC method in combination with a deterministic transport code constitutes a powerful tool for performing uncertainty and sensitivity analysis of reactor physics parameters. (authors)

  2. Two non-probabilistic methods for uncertainty analysis in accident reconstruction.

    Science.gov (United States)

    Zou, Tiefang; Yu, Zhi; Cai, Ming; Liu, Jike

    2010-05-20

    There are many uncertain factors in traffic accidents, and it is necessary to study their influence in order to improve the accuracy and confidence of accident reconstruction results. It is difficult to evaluate the uncertainty of calculation results if the expression of the reconstruction model is implicit and/or the distributions of the independent variables are unknown. Based on interval mathematics, convex models and design of experiment, two non-probabilistic methods were proposed. These two methods are efficient under conditions where existing uncertainty analysis methods can hardly work, because the accident reconstruction model is implicit and/or the distributions of independent variables are unknown; parameter sensitivities can be obtained from them too. An accident case is investigated by the methods proposed in the paper. Results show that the convex models method is the most conservative method, and the solution of the interval analysis method is very close to those of the other methods. These two methods are a beneficial supplement to the existing uncertainty analysis methods.
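
    A tiny illustration of the interval-mathematics idea is given below for the textbook pre-braking-speed relation v = sqrt(2*mu*g*s); since the expression is monotone in both uncertain inputs, the output interval follows from the corner combinations. The friction and skid-length intervals are invented, and real reconstruction models such as those in the paper are implicit and far more involved.

```python
import math
from itertools import product

g = 9.81                       # gravitational acceleration [m/s^2]

mu_interval = (0.55, 0.75)     # tyre-road friction coefficient (interval)
s_interval = (28.0, 32.0)      # measured skid length [m] (interval)

def speed(mu, s):
    """Pre-braking speed for a vehicle skidding to a full stop over length s."""
    return math.sqrt(2.0 * mu * g * s)

# Evaluate at all interval corners; sufficient because the formula is monotone in mu and s.
corners = [speed(mu, s) for mu, s in product(mu_interval, s_interval)]
v_lo, v_hi = min(corners), max(corners)
print(f"initial speed in [{v_lo * 3.6:.1f}, {v_hi * 3.6:.1f}] km/h")
```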

  3. Report on INL Activities for UncertaintyReduction Analysis of FY10

    Energy Technology Data Exchange (ETDEWEB)

    G. Palmiotti; H. Hiruta; M. Salvatores

    2010-09-01

    The work scope of this project, related to the Work Packages of “Uncertainty Reduction Analyses” with the goal of reducing nuclear data uncertainties, is to produce a set of improved nuclear data to be used both for a wide range of validated advanced fast reactor design calculations, and for providing guidelines for further improvements of the ENDF/B files (i.e. ENDF/B-VII, and future releases). This report presents the status of activities performed at INL under the FC R&D Work Package previously mentioned. First an analysis of uncertainty evaluation is presented using the new covariance data (AFCI version 1.2) made available by BNL. Then, analyses of a number of experiments, among those selected in the previous fiscal year and available, are presented making use of ENDF/B-VII data. These experiments include: updating of the ZPR-6/7 assembly (improved model and spectral indices), ZPPR-9 assembly (only simplified model available), ZPPR-10 (full detailed model), and irradiation experiments. These last experiments include PROFIL-1, where a new methodology has been employed in the Monte Carlo calculations and a deterministic analysis has also been performed. This is the first time the Monte Carlo approach and ENDF/B-VII have been used for the PROFIL experiments. The PROFIL-2 and TRAPU experiments have so far only been modeled; a full analysis of the irradiation results will be finalized next fiscal year.

  4. Wavelet-Monte Carlo Hybrid System for HLW Nuclide Migration Modeling and Sensitivity and Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nasif, Hesham; Neyama, Atsushi

    2003-02-26

    This paper presents results of an uncertainty and sensitivity analysis for the performance of the different barriers of high level radioactive waste repositories. SUA is a tool to perform the uncertainty and sensitivity analysis on the output of the Wavelet Integrated Repository System model (WIRS), which is developed to solve a system of nonlinear partial differential equations arising from the model formulation of radionuclide transport through the repository. SUA performs sensitivity analysis (SA) and uncertainty analysis (UA) on a sample output from Monte Carlo simulation. The sample is generated by WIRS and contains the values of the output (the maximum release rate, in the form of time series) and the values of the input variables for a set of different simulations (runs), which are realized by varying the model input parameters. The Monte Carlo sample is generated with SUA as a pure random sample or using the Latin Hypercube sampling technique. Tchebycheff and Kolmogorov confidence bounds are computed on the maximum release rate for UA, and effective non-parametric statistics are used to rank the influence of the model input parameters for SA. Based on the results, we point out parameters that have primary influences on the performance of the engineered barrier system of a repository. The parameters found to be key contributors to the release rate are the selenium and cesium distribution coefficients in both the geosphere and the major water conducting fault (MWCF), the diffusion depth and the water flow rate in the excavation-disturbed zone (EDZ).
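
    The Latin Hypercube option mentioned above can be sketched in pure NumPy: each dimension's unit interval is split into N equal strata, one point is drawn per stratum, and the strata are independently permuted across dimensions before scaling to the parameter ranges. The parameter names and ranges below are illustrative, not the WIRS input set.

```python
import numpy as np

def latin_hypercube(n_samples: int, bounds, rng=None):
    """Latin Hypercube sample: exactly one point per stratum in every dimension."""
    rng = np.random.default_rng(rng)
    bounds = np.asarray(bounds, dtype=float)          # shape (d, 2): [low, high] per dimension
    d = bounds.shape[0]
    # Independently permuted stratum indices per dimension, plus a uniform jitter per stratum.
    u = (rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
         + rng.random((n_samples, d))) / n_samples    # stratified uniforms in [0, 1)
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

# Illustrative uncertain inputs: distribution coefficient, water flow rate, diffusion depth.
bounds = [(1e-3, 1e-1),   # Kd [m^3/kg]
          (0.1, 2.0),     # water flow rate [m^3/yr]
          (0.5, 3.0)]     # diffusion depth [m]
sample = latin_hypercube(100, bounds, rng=0)
print(sample.shape, sample.min(axis=0), sample.max(axis=0))
```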

  5. Estimation of the Fuel Depletion Code Bias and Uncertainty in Burnup-Credit Criticality Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Woon; Cho, Nam Zin [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of); Lee, Sang Jin; Bae, Chang Yeal [Nuclear Environment Technology Institute, Taejon (Korea, Republic of)

    2006-07-01

    In the past, criticality safety analyses for commercial light-water-reactor (LWR) spent nuclear fuel (SNF) storage and transportation canisters assumed the spent fuel to be fresh (unirradiated) fuel with uniform isotopic compositions. This fresh-fuel assumption provides a well-defined, bounding approach to the criticality safety analysis that eliminates concerns related to the fuel operating history, and thus considerably simplifies the safety analysis. However, because this assumption ignores the inherent decrease in reactivity as a result of irradiation, it is very conservative. The concept of taking credit for the reduction in reactivity due to fuel burnup is commonly referred to as burnup credit. Implementation of burnup credit requires the computational prediction of the nuclide inventories (compositions) for the dominant fissile and absorbing nuclide species in spent fuel. In addition, the bias and uncertainty in the predicted concentration of all nuclides used in the analysis must be established by comparisons of calculated and measured radiochemical assay data. In this paper, three methods for considering the bias and uncertainty are reviewed, and the estimated bias and uncertainty results of the third method are presented.

  6. Uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model at multiple flux tower sites

    Science.gov (United States)

    Chen, Mingshi; Senay, Gabriel B.; Singh, Ramesh K.; Verdin, James P.

    2016-01-01

    Evapotranspiration (ET) is an important component of the water cycle – ET from the land surface returns approximately 60% of the global precipitation back to the atmosphere. ET also plays an important role in energy transport among the biosphere, atmosphere, and hydrosphere. Current regional to global and daily to annual ET estimation relies mainly on surface energy balance (SEB) ET models or statistical and empirical methods driven by remote sensing data and various climatological databases. These models have uncertainties due to inevitable input errors, poorly defined parameters, and inadequate model structures. The eddy covariance measurements on water, energy, and carbon fluxes at the AmeriFlux tower sites provide an opportunity to assess the ET modeling uncertainties. In this study, we focused on uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model for ET estimation at multiple AmeriFlux tower sites with diverse land cover characteristics and climatic conditions. The 8-day composite 1-km MODerate resolution Imaging Spectroradiometer (MODIS) land surface temperature (LST) was used as input land surface temperature for the SSEBop algorithms. The other input data were taken from the AmeriFlux database. Results of statistical analysis indicated that the SSEBop model performed well in estimating ET with an R2 of 0.86 between estimated ET and eddy covariance measurements at 42 AmeriFlux tower sites during 2001–2007. It was encouraging to see that the best performance was observed for croplands, where R2 was 0.92 with a root mean square error of 13 mm/month. The uncertainties or random errors from input variables and parameters of the SSEBop model led to monthly ET estimates with relative errors less than 20% across multiple flux tower sites distributed across different biomes. This uncertainty of the SSEBop model lies within the error range of other SEB models, suggesting systematic error or bias of the SSEBop model is within

  7. Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare.

    Science.gov (United States)

    Hillen, Marij A; Gutheil, Caitlin M; Strout, Tania D; Smets, Ellen M A; Han, Paul K J

    2017-05-01

    Uncertainty tolerance (UT) is an important, well-studied phenomenon in health care and many other important domains of life, yet its conceptualization and measurement by researchers in various disciplines have varied substantially and its essential nature remains unclear. The objectives of this study were to: 1) analyze the meaning and logical coherence of UT as conceptualized by developers of UT measures, and 2) develop an integrative conceptual model to guide future empirical research regarding the nature, causes, and effects of UT. A narrative review and conceptual analysis of 18 existing measures of Uncertainty and Ambiguity Tolerance was conducted, focusing on how measure developers in various fields have defined both the "uncertainty" and "tolerance" components of UT, both explicitly through their writings and implicitly through the items constituting their measures. Both explicit and implicit conceptual definitions of uncertainty and tolerance vary substantially and are often poorly and inconsistently specified. A logically coherent, unified understanding or theoretical model of UT is lacking. To address these gaps, we propose a new integrative definition and multidimensional conceptual model that construes UT as the set of negative and positive psychological responses (cognitive, emotional, and behavioral) provoked by the conscious awareness of ignorance about particular aspects of the world. This model synthesizes insights from various disciplines and provides an organizing framework for future research. We discuss how this model can facilitate further empirical and theoretical research to better measure and understand the nature, determinants, and outcomes of UT in health care and other domains of life. Uncertainty tolerance is an important and complex phenomenon requiring more precise and consistent definition. An integrative definition and conceptual model, intended as a tentative and flexible point of departure for future research, adds needed breadth

  8. Inter-individual variability of oscillatory responses to subject's own name. A single-subject analysis.

    Science.gov (United States)

    Höller, Yvonne; Kronbichler, Martin; Bergmann, Jürgen; Crone, Julia Sophia; Schmid, Elisabeth Verena; Golaszewski, Stefan; Ladurner, Gunther

    2011-06-01

    In previous studies, event-related potentials and oscillations in response to subject's own name have been analyzed extensively at the group level in healthy subjects and in patients with a disorder of consciousness. Subject's own name as a deviant produces a P3. With equiprobable stimuli, non-phase-locked alpha oscillations are smaller in response to subject's own name compared to other names or subject's own name backwards. However, little is known about replicability on a single-subject level. Seventeen healthy subjects were assessed in an own-name paradigm with equiprobable stimuli of subject's own name, another name, and subject's own name backwards. Event-related potentials and non-phase-locked oscillations were analyzed with single-subject, non-parametric statistics. No consistent results were found either for ERPs or for the non-phase-locked changes of oscillatory activities. Only 4 subjects showed a robust effect as expected, that is, a lower activity in the alpha-beta range to subject's own name compared to other conditions. Four subjects showed higher activity for subject's own name. Thus, analyzing the EEG reactivity in the own-name paradigm with equiprobable stimuli on a single-subject level yields a high variance between subjects. In future research, single-subject statistics should be applied for examining the validity of physiologic measurements in other paradigms and for examining the pattern of reactivity in patients. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
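
    A minimal sketch of the Monte Carlo weight-perturbation idea described above, assuming a weighted linear combination of standardized criteria layers; the layers, nominal weights, and noise levels are hypothetical and do not reproduce the authors' AHP/OWA implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical standardized criteria layers (rows = map cells, cols = criteria)
criteria = rng.random((1000, 5))
# Nominal criteria weights (illustrative values, summing to 1)
w_nominal = np.array([0.35, 0.25, 0.20, 0.12, 0.08])
w_sd = 0.05 * w_nominal            # assumed uncertainty on each weight

n_runs = 2000
scores = np.empty((n_runs, criteria.shape[0]))
for i in range(n_runs):
    w = np.clip(rng.normal(w_nominal, w_sd), 0.0, None)
    w /= w.sum()                   # keep the perturbed weights normalized
    scores[i] = criteria @ w       # weighted linear combination per cell

susceptibility_mean = scores.mean(axis=0)
susceptibility_std = scores.std(axis=0)   # per-cell uncertainty of the MCDA output
print(f"largest per-cell standard deviation: {susceptibility_std.max():.3f}")
```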

  10. A Review and Classification of Approaches for Dealing with Uncertainty in Multi-Criteria Decision Analysis for Healthcare Decisions

    NARCIS (Netherlands)

    Broekhuizen, Hindrik; Groothuis-Oudshoorn, Catharina Gerarda Maria; van Til, Janine Astrid; Hummel, J. Marjan; IJzerman, Maarten Joost

    2015-01-01

    Multi-criteria decision analysis (MCDA) is increasingly used to support decisions in healthcare involving multiple and conflicting criteria. Although uncertainty is usually carefully addressed in health economic evaluations, whether and how the different sources of uncertainty are dealt with and

  11. Predictive uncertainty analysis of a saltwater intrusion model using null-space Monte Carlo

    Science.gov (United States)

    Herckenrath, Daan; Langevin, Christian D.; Doherty, John

    2011-01-01

    Because of the extensive computational burden and perhaps a lack of awareness of existing methods, rigorous uncertainty analyses are rarely conducted for variable-density flow and transport models. For this reason, a recently developed null-space Monte Carlo (NSMC) method for quantifying prediction uncertainty was tested for a synthetic saltwater intrusion model patterned after the Henry problem. Saltwater intrusion caused by a reduction in fresh groundwater discharge was simulated for 1000 randomly generated hydraulic conductivity distributions, representing a mildly heterogeneous aquifer. From these 1000 simulations, the hydraulic conductivity distribution giving rise to the most extreme case of saltwater intrusion was selected and was assumed to represent the "true" system. Head and salinity values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. The NSMC method was used to calculate 1000 calibration-constrained parameter fields. If the dimensionality of the solution space was set appropriately, the estimated uncertainty range from the NSMC analysis encompassed the truth. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. Reducing the dimensionality of the null-space for the processing of the random parameter sets did not result in any significant gains in efficiency and compromised the ability of the NSMC method to encompass the true prediction value. The addition of intrapilot point heterogeneity to the NSMC process was also tested. According to a variogram comparison, this provided the same scale of heterogeneity that was used to generate the truth. However, incorporation of intrapilot point variability did not make a noticeable difference to the uncertainty of the prediction. With this higher level of heterogeneity, however, the computational burden of

  12. Measuring and explaining eco-efficiencies of wastewater treatment plants in China: An uncertainty analysis perspective.

    Science.gov (United States)

    Dong, Xin; Zhang, Xinyi; Zeng, Siyu

    2017-04-01

    In the context of sustainable development, there has been an increasing requirement for an eco-efficiency assessment of wastewater treatment plants (WWTPs). Data envelopment analysis (DEA), a technique that is widely applied for relative efficiency assessment, is used in combination with the tolerances approach to handle WWTPs' multiple inputs and outputs as well as their uncertainty. The economic cost, energy consumption, contaminant removal, and global warming effect during the treatment processes are integrated to interpret the eco-efficiency of WWTPs. A total of 736 sample plants from across China are assessed, and large sensitivities to variations in inputs and outputs are observed for most samples, with only three WWTPs identified as being stably efficient. Size of plant, overcapacity, climate type, and influent characteristics are proven to have a significant influence on both the mean efficiency and performance sensitivity of WWTPs, while no clear relationships were found between eco-efficiency and technology under the framework of uncertainty analysis. The incorporation of uncertainty quantification and environmental impact consideration has improved the reliability and applicability of the assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Uncertainty and sensitivity analysis in quantitative pest risk assessments; practical rules for risk assessors

    Directory of Open Access Journals (Sweden)

    David Makowski

    2013-09-01

    Full Text Available Quantitative models have several advantages compared to qualitative methods for pest risk assessments (PRA). Quantitative models do not require the definition of categorical ratings and can be used to compute numerical probabilities of entry and establishment, and to quantify spread and impact. These models are powerful tools, but they include several sources of uncertainty that need to be taken into account by risk assessors and communicated to decision makers. Uncertainty analysis (UA) and sensitivity analysis (SA) are useful for analyzing uncertainty in models used in PRA, and are becoming more popular. However, these techniques should be applied with caution because several factors may influence their results. In this paper, a brief overview of methods of UA and SA is given, and a series of practical rules is defined that can be followed by risk assessors to improve the reliability of UA and SA results. These rules are illustrated in a case study based on the infection model of Magarey et al. (2005), where the results of UA and SA are shown to be highly dependent on the assumptions made on the probability distribution of the model inputs.

  14. ANALYSIS OF UNCERTAINTY QUANTIFICATION METHOD BY COMPARING MONTE-CARLO METHOD AND WILKS’ FORMULA

    Directory of Open Access Journals (Sweden)

    SEUNG WOOK LEE

    2014-08-01

    Full Text Available An analysis of the uncertainty quantification related to LBLOCA using the Monte-Carlo calculation has been performed and compared with the tolerance level determined by the Wilks’ formula. The uncertainty range and distribution of each input parameter associated with the LOCA phenomena were determined based on previous PIRT results and documentation during the BEMUSE project. Calculations were conducted on 3,500 cases within a 2-week CPU time on a 14-PC cluster system. The Monte-Carlo exercise shows that the 95% upper limit PCT value can be obtained well, with a 95% confidence level using the Wilks’ formula, although we have to endure a 5% risk of PCT under-prediction. The results also show that the statistical fluctuation of the limit value using Wilks’ first-order is as large as the uncertainty value itself. It is therefore desirable to increase the order of the Wilks’ formula to be higher than the second-order to estimate the reliable safety margin of the design features. It is also shown that, with its ever-increasing computational capability, the Monte-Carlo method is accessible for a nuclear power plant safety analysis within a realistic time frame.
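
    The one-sided Wilks' formula mentioned above can be evaluated directly. The sketch below computes the smallest number of code runs needed so that the highest (or second-highest) predicted PCT bounds the 95th percentile with 95% confidence; it reproduces the familiar 59-run (first-order) and 93-run (second-order) figures. This is a generic illustration, not the BEMUSE tooling.

```python
import math

def wilks_sample_size(coverage=0.95, confidence=0.95, order=1):
    """Smallest N such that the `order`-th largest of N runs bounds the
    `coverage` quantile with probability >= `confidence` (one-sided Wilks)."""
    n = order
    while True:
        # Probability that fewer than `order` of the N samples exceed the quantile
        miss = sum(math.comb(n, k) * (1 - coverage) ** k * coverage ** (n - k)
                   for k in range(order))
        if 1.0 - miss >= confidence:
            return n
        n += 1

# Classic 95%/95% results: 59 runs at first order, 93 runs at second order
print(wilks_sample_size(order=1), wilks_sample_size(order=2))
```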

  15. Designing, operating and maintaining artificial recharge pond under uncertainty: a probabilistic risk analysis

    Science.gov (United States)

    Pedretti, D.; Sanchez-Vila, X.; Fernandez-Garcia, D.; Bolster, D.; Tartakovsky, D. M.; Barahona-Palomo, M.

    2011-12-01

    Decision makers require long-term, effective hydraulic criteria to optimize the design of artificial recharge ponds. However, uncontrolled multiscale pore clogging effects in heterogeneous soils introduce uncertainties that must be quantified. One of the most remarkable effects is the reduction of infiltration capacity over time, which affects the quantity and quality of aquifer recharging water. We developed a probabilistic (engineering) risk analysis where pore clogging is modeled as an exponential decay with time and where clogging mechanisms are differently sensitive to some properties of the soils, which are heterogeneously organized in space. We studied both a real case and some synthetic infiltration ponds. The risk is defined as the probability that the infiltration capacity drops below a target value at a specific time after the facility starts operating. We can account for a variety of maintenance strategies that target different clogging mechanisms. In our analysis, physical clogging mechanisms induce the greatest uncertainty, and maintenance targeted at these mechanisms can yield optimal results. However, considering the fundamental role of the spatial variability in the initial properties, we conclude that an adequate initial characterization of the surface infiltration ponds is strategically critical to determine the degree of uncertainty of different maintenance solutions and thus to make cost-effective and reliable decisions.
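
    A minimal sketch of the kind of probabilistic risk calculation described above: infiltration capacity decays exponentially with an uncertain initial value and decay rate, and the risk is the probability of falling below a target capacity at a given time. All distributions and numbers are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(7)
n_mc = 50_000

# Assumed (hypothetical) parameter distributions for a heterogeneous pond
K0 = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n_mc)    # initial infiltration capacity [m/day]
lam = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n_mc)  # clogging decay rate [1/day]

t = 365.0        # time horizon [days]
K_target = 0.3   # minimum acceptable infiltration capacity [m/day]

# Exponential decay of infiltration capacity with time, as in the abstract
K_t = K0 * np.exp(-lam * t)

# Risk = probability that the capacity has dropped below the target at time t
risk = np.mean(K_t < K_target)
print(f"P(K(t = {t:.0f} d) < {K_target} m/day) = {risk:.3f}")
```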

  16. 3D Geostatistical Modeling and Uncertainty Analysis in a Carbonate Reservoir, SW Iran

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Kamali

    2013-01-01

    Full Text Available The aim of geostatistical reservoir characterization is to utilize a wide variety of data, at different scales and accuracies, to construct reservoir models that are able to represent geological heterogeneities and also to quantify uncertainties by producing a number of equiprobable models. Since all geostatistical methods used in the estimation of reservoir parameters are inaccurate, modeling of the “estimation error” in the form of uncertainty analysis is very important. In this paper, the definition of Sequential Gaussian Simulation is reviewed and the construction of stochastic models based on it is discussed. Subsequently, ranking and uncertainty quantification of those stochastically populated equiprobable models and a sensitivity study of the modeled properties are presented. Consequently, the application of sensitivity analysis to stochastic models of reservoir horizons, petrophysical properties, and stochastic oil-water contacts, as well as their effect on reserves, clearly shows that any alteration in the reservoir geometry has a significant effect on the oil in place. The studied reservoir is located in the carbonate sequences of the Sarvak Formation, Zagros, Iran; it comprises three layers. The first, located beneath the cap rock, contains the largest portion of the reserve, while the other layers hold little oil. Simulations show that the average porosity and water saturation of the reservoir are about 20% and 52%, respectively.

  17. Single-subject analysis reveals variation in knee mechanics during step landing.

    Science.gov (United States)

    Scholes, Corey J; McDonald, Michael D; Parker, Anthony W

    2012-08-09

    Evidence concerning the alteration of knee function during landing suffers from a lack of consensus. This uncertainty can be attributed to methodological flaws, particularly in relation to the statistical analysis of variable human movement data. The aim of this study was to compare single-subject and group analyses in detecting changes in knee stiffness and coordination during step landing that occur independent of an experimental intervention. A group of healthy men (N=12) stepped-down from a knee-high platform for 60 consecutive trials, each trial separated by a 1-minute rest. The magnitude and within-participant variability of sagittal stiffness and coordination of the landing knee were evaluated with both group and single-subject analyses. The group analysis detected significant changes in knee coordination. However, the single-subject analyses detected changes in all dependent variables, which included increases in variability with task repetition. Between-individual variation was also present in the timing, size and direction of alterations. The results have important implications for the interpretation of existing information regarding the adaptation of knee mechanics to interventions such as fatigue, footwear or landing height. It is proposed that a participant's natural variation in knee mechanics should be analysed prior to an intervention in future experiments. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Tract Orientation and Angular Dispersion Deviation Indicator (TOADDI): A framework for single-subject analysis in diffusion tensor imaging.

    Science.gov (United States)

    Koay, Cheng Guan; Yeh, Ping-Hong; Ollinger, John M; İrfanoğlu, M Okan; Pierpaoli, Carlo; Basser, Peter J; Oakes, Terrence R; Riedy, Gerard

    2016-02-01

    The purpose of this work is to develop a framework for single-subject analysis of diffusion tensor imaging (DTI) data. This framework is termed Tract Orientation and Angular Dispersion Deviation Indicator (TOADDI) because it is capable of testing whether an individual tract as represented by the major eigenvector of the diffusion tensor and its corresponding angular dispersion are significantly different from a group of tracts on a voxel-by-voxel basis. This work develops two complementary statistical tests based on the elliptical cone of uncertainty, which is a model of uncertainty or dispersion of the major eigenvector of the diffusion tensor. The orientation deviation test examines whether the major eigenvector from a single subject is within the average elliptical cone of uncertainty formed by a collection of elliptical cones of uncertainty. The shape deviation test is based on the two-tailed Wilcoxon-Mann-Whitney two-sample test between the normalized shape measures (area and circumference) of the elliptical cones of uncertainty of the single subject against a group of controls. The False Discovery Rate (FDR) and False Non-discovery Rate (FNR) were incorporated in the orientation deviation test. The shape deviation test uses FDR only. TOADDI was found to be numerically accurate and statistically effective. Clinical data from two Traumatic Brain Injury (TBI) patients and one non-TBI subject were tested against the data obtained from a group of 45 non-TBI controls to illustrate the application of the proposed framework in single-subject analysis. The frontal portion of the superior longitudinal fasciculus seemed to be implicated in both tests (orientation and shape) as significantly different from that of the control group. The TBI patients and the single non-TBI subject were well separated under the shape deviation test at the chosen FDR level of 0.0005. TOADDI is a simple but novel geometrically based statistical framework for analyzing DTI data. TOADDI may be
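
    The sketch below is not the TOADDI implementation; it only illustrates, on invented data, the two statistical ingredients named in the abstract: a two-tailed Wilcoxon-Mann-Whitney test applied per voxel and Benjamini-Hochberg control of the false discovery rate at the quoted level of 0.0005. The data layout (45 samples per group at every voxel) is purely for illustration.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
n_voxels, n_per_group = 500, 45            # hypothetical sizes

# Hypothetical normalized cone-shape measures (e.g., areas) per voxel
controls = rng.normal(1.00, 0.1, size=(n_voxels, n_per_group))
subject = rng.normal(1.05, 0.1, size=(n_voxels, n_per_group))

# Two-tailed Wilcoxon-Mann-Whitney test at every voxel
pvals = np.array([
    mannwhitneyu(subject[v], controls[v], alternative="two-sided").pvalue
    for v in range(n_voxels)
])

def fdr_bh(p, q=0.0005):
    """Benjamini-Hochberg step-up procedure; returns a boolean significance mask."""
    order = np.argsort(p)
    ranked = p[order]
    m = len(p)
    below = ranked <= q * np.arange(1, m + 1) / m
    k = below.nonzero()[0].max() + 1 if below.any() else 0
    keep = np.zeros(m, dtype=bool)
    keep[order[:k]] = True
    return keep

print(f"{fdr_bh(pvals).sum()} of {n_voxels} voxels flagged at FDR 0.0005")
```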

  19. Uncertainties in scientific measurements

    Energy Technology Data Exchange (ETDEWEB)

    Holden, N.E.

    1986-11-16

    Some examples of nuclear data in which the uncertainty has been underestimated, or at least appears to be underestimated, are reviewed. The subjective aspect of the problem of systematic uncertainties is discussed. Historical aspects of the data uncertainty problem are noted. 64 refs., 6 tabs.

  20. Quantifying uncertainty in LCA-modelling of waste management systems

    DEFF Research Database (Denmark)

    Clavreul, Julie; Guyonnet, D.; Christensen, Thomas Højlund

    2012-01-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present...... the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining...

  1. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    Science.gov (United States)

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
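
    As a hedged illustration of the standardized regression coefficient (SRC) part of the sensitivity analysis described above, the sketch below regresses standardized Monte Carlo outputs on standardized inputs; the parameter names, ranges, and the placeholder model standing in for the crystallization simulator are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Hypothetical uncertain kinetic parameters (names and ranges invented)
names = ["nucleation_order", "growth_order", "k_nucleation", "k_growth"]
X = rng.normal(loc=[2.0, 1.5, 1e6, 1e-7],
               scale=[0.2, 0.15, 1e5, 1e-8], size=(n, 4))

# Placeholder model output standing in for the crystallization simulator
# (e.g., a mean crystal size); purely illustrative
y = 3.0 * X[:, 0] - 1.0 * X[:, 1] + 2e-6 * X[:, 2] + rng.normal(0, 0.1, n)

# Standardized regression coefficients: regress standardized y on standardized X
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, coeff in sorted(zip(names, src), key=lambda t: -abs(t[1])):
    print(f"{name:>16s}  SRC = {coeff:+.2f}")
```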

  2. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertain assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Fast eutrophication assessment for stormwater wet detention ponds via fuzzy probit regression analysis under uncertainty.

    Science.gov (United States)

    Tahsin, Subrina; Chang, Ni-Bin

    2016-02-01

    Stormwater wet detention ponds have been a commonly employed best management practice for stormwater management throughout the world for many years. In the past, the trophic state index values have been used to evaluate seasonal changes in water quality and rank lakes within a region or between several regions; yet, to date, there is no similar index for stormwater wet detention ponds. This study aimed to develop a new multivariate trophic state index (MTSI) suitable for conducting a rapid eutrophication assessment of stormwater wet detention ponds under uncertainty with respect to three typical physical and chemical properties. Six stormwater wet detention ponds in Florida were selected for demonstration of the new MTSI with respect to total phosphorus (TP), total nitrogen (TN), and Secchi disk depth (SDD) as cognitive assessment metrics to sense eutrophication potential collectively and inform the environmental impact holistically. Due to the involvement of multiple endogenous variables (i.e., TN, TP, and SDD) for the eutrophication assessment simultaneously under uncertainty, fuzzy synthetic evaluation was applied to first standardize and synchronize the sources of uncertainty in the decision analysis. The ordered probit regression model was then formulated for assessment based on the concept of MTSI with the inputs from the fuzzy synthetic evaluation. The results indicate that severe eutrophication occurs during fall, which might be due to frequent heavy summer storm events contributing high nutrient inputs to these six ponds.

  4. Robustness with respect to disturbance model uncertainty: Theory and application to autopilot performance analysis

    Directory of Open Access Journals (Sweden)

    Davison Daniel E.

    2000-01-01

    Full Text Available This paper deals with the notion of disturbance model uncertainty. The disturbance is modeled as the output of a first-order filter which is driven by white noise and whose bandwidth and gain are uncertain. An analytical expression for the steady-state output variance as a function of the uncertain bandwidth and gain is derived, and several properties of this variance function are analyzed. Two notions, those of disturbance bandwidth margin and disturbance gain margin are also introduced. These tools are then applied to the analysis of a simple altitude-hold autopilot system in the presence of turbulence where the turbulence scale is treated as an uncertain parameter. It is shown that the autopilot, which is satisfactory for nominal turbulence scale, may be inadequate when the uncertainty is taken into account. Moreover, it is proven that, in order to obtain a design that provides robust performance in the face of turbulence scale uncertainty, it is necessary to substantially increase the controller bandwidth, even if one is willing to sacrifice the autopilot's holding ability and stability robustness.
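
    For a first-order filter H(s) = k/(s/omega + 1) driven by continuous white noise of two-sided intensity S0, the textbook steady-state output variance is k^2 * omega * S0 / 2, so the variance grows with both the uncertain gain and the uncertain bandwidth. The sketch below sweeps an assumed uncertainty range for both parameters and checks a hypothetical variance requirement; it illustrates the idea rather than reproducing the paper's own analytical expression.

```python
import numpy as np

def output_variance(gain, bandwidth, noise_intensity=1.0):
    """Steady-state output variance of H(s) = gain / (s/bandwidth + 1) driven by
    continuous white noise of two-sided intensity `noise_intensity`.
    Textbook result: var = gain**2 * bandwidth * noise_intensity / 2."""
    return gain ** 2 * bandwidth * noise_intensity / 2.0

# Sweep the uncertain disturbance-model parameters over assumed ranges
gains = np.linspace(0.5, 2.0, 50)
bandwidths = np.linspace(0.1, 5.0, 50)        # rad/s
G, B = np.meshgrid(gains, bandwidths)
var = output_variance(G, B)

var_max_allowed = 2.0                          # hypothetical performance requirement
ok = var <= var_max_allowed
print(f"{ok.mean():.0%} of the (gain, bandwidth) grid meets the variance requirement")
```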

  5. To be certain about the uncertainty: Bayesian statistics for 13C metabolic flux analysis.

    Science.gov (United States)

    Theorell, Axel; Leweke, Samuel; Wiechert, Wolfgang; Nöh, Katharina

    2017-11-01

    13C Metabolic Flux Analysis (13C MFA) remains the most powerful approach to determine intracellular metabolic reaction rates. Decisions on strain engineering and experimentation heavily rely upon the certainty with which these fluxes are estimated. For uncertainty quantification, the vast majority of 13C MFA studies relies on confidence intervals from the paradigm of Frequentist statistics. However, it is well known that the confidence intervals for a given experimental outcome are not uniquely defined. As a result, confidence intervals produced by different methods can be different, but nevertheless equally valid. This is of high relevance to 13C MFA, since practitioners regularly use three different approximate approaches for calculating confidence intervals. By means of a computational study with a realistic model of the central carbon metabolism of E. coli, we provide strong evidence that confidence intervals used in the field depend strongly on the technique with which they were calculated and, thus, their use leads to misinterpretation of the flux uncertainty. In order to provide a better alternative to confidence intervals in 13C MFA, we demonstrate that credible intervals from the paradigm of Bayesian statistics give more reliable flux uncertainty quantifications which can be readily computed with high accuracy using Markov chain Monte Carlo. In addition, the widely applied chi-square test, as a means of testing whether the model reproduces the data, is examined closer. © 2017 Wiley Periodicals, Inc.
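
    A minimal sketch of how a credible interval can be obtained with Markov chain Monte Carlo, as advocated above: a random-walk Metropolis sampler for a single flux-like parameter with a Gaussian likelihood and a flat positivity prior. The data, step size, and model are invented; real 13C MFA posteriors involve many fluxes and an isotope-labeling model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical measurements of a quantity that depends on a single flux v
data = rng.normal(2.5, 0.2, size=20)
sigma = 0.2

def log_posterior(v):
    if v <= 0:                              # flat prior restricted to v > 0
        return -np.inf
    return -0.5 * np.sum((data - v) ** 2) / sigma ** 2

# Random-walk Metropolis sampler
n_iter, step = 20_000, 0.05
samples = np.empty(n_iter)
v, logp = 1.0, log_posterior(1.0)
for i in range(n_iter):
    v_new = v + rng.normal(0.0, step)
    logp_new = log_posterior(v_new)
    if np.log(rng.random()) < logp_new - logp:   # accept/reject
        v, logp = v_new, logp_new
    samples[i] = v

post = samples[n_iter // 2:]                     # discard burn-in
lo, hi = np.percentile(post, [2.5, 97.5])        # 95% credible interval
print(f"flux v: 95% credible interval [{lo:.3f}, {hi:.3f}]")
```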

  6. A scenario-based modeling approach for emergency evacuation management and risk analysis under multiple uncertainties.

    Science.gov (United States)

    Lv, Y; Huang, G H; Guo, L; Li, Y P; Dai, C; Wang, X W; Sun, W

    2013-02-15

    Nuclear emergency evacuation is important to prevent radioactive harm from hazardous materials and to limit the accidents' consequences; however, uncertainties are involved in the components and processes of such a management system. In this study, an interval-parameter joint-probabilistic integer programming (IJIP) method is developed for emergency evacuation management under uncertainties. Optimization techniques of interval-parameter programming (IPP) and joint-probabilistic constrained (JPC) programming are incorporated into an integer linear programming framework, so that the approach can deal with uncertainties expressed as joint probability and interval values. The IJIP method can schedule the optimal routes to guarantee the maximum population evacuated away from the affected zone during a finite time. Furthermore, it can also facilitate post-optimization analysis to enhance robustness in controlling the system violation risk imposed on the joint-probabilistic constraints. The developed method has been applied to a case study of nuclear emergency management; meanwhile, a number of scenarios under different system conditions have been analyzed. It is indicated that the solutions are useful for evacuation management practices. The results of the IJIP method can not only help to raise the capability of disaster response in a systematic manner, but also provide insight into complex relationships among evacuation planning, resource utilization, policy requirements and system risks. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-03-04

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC operation

  8. Error modeling based on geostatistics for uncertainty analysis in crop mapping using Gaofen-1 multispectral imagery

    Science.gov (United States)

    You, Jiong; Pei, Zhiyuan

    2015-01-01

    With the development of remote sensing technology, its applications in agriculture monitoring systems, crop mapping accuracy, and spatial distribution are more and more being explored by administrators and users. Uncertainty in crop mapping is profoundly affected by the spatial pattern of spectral reflectance values obtained from the applied remote sensing data. Errors in remotely sensed crop cover information and their propagation into derivative products need to be quantified and handled correctly. Therefore, this study discusses the methods of error modeling for uncertainty characterization in crop mapping using GF-1 multispectral imagery. An error modeling framework based on geostatistics is proposed, which introduces the sequential Gaussian simulation algorithm to explore the relationship between classification errors and the spectral signature of the remote sensing data source. On this basis, a misclassification probability model to produce a spatially explicit classification error probability surface for the map of a crop is developed, which realizes the uncertainty characterization for crop mapping. In this process, trend surface analysis was carried out to generate a spatially varying mean response and the corresponding residual response with spatial variation for the spectral bands of GF-1 multispectral imagery. Variogram models were employed to measure the spatial dependence in the spectral bands and the derived misclassification probability surfaces. Simulated spectral data and classification results were quantitatively analyzed. Through experiments using data sets from a region in the low rolling country located at the Yangtze River valley, it was found that GF-1 multispectral imagery can be used for crop mapping with a good overall performance, the proposed error modeling framework can be used to quantify the uncertainty in crop mapping, and the misclassification probability model can summarize the spatial variation in map accuracy and is helpful for

  9. An optimization method based on scenario analysis for watershed management under uncertainty.

    Science.gov (United States)

    Liu, Yong; Guo, Huaicheng; Zhang, Zhenxing; Wang, Lijing; Dai, Yongli; Fan, Yingying

    2007-05-01

    In conjunction with socioeconomic development in watersheds, increasingly challenging problems, such as scarcity of water resources and environmental deterioration, have arisen. Watershed management is a useful tool for dealing with these issues and maintaining sustainable development at the watershed scale. The complex and uncertain characteristics of watershed systems have a great impact on decisions about countermeasures and other techniques that will be applied in the future. An optimization method based on scenario analysis is proposed in this paper as a means of handling watershed management under uncertainty. This method integrates system analysis, forecast methods, and scenario analysis, as well as the contributions of stakeholders and experts, into a comprehensive framework. The proposed method comprises four steps: system analyses, a listing of potential engineering techniques and countermeasures, scenario analyses, and the optimal selection of countermeasures and engineering techniques. The proposed method was applied to the case of the Lake Qionghai watershed in southwestern China, and the results are reported in this paper. This case study demonstrates that the proposed method can be used to deal efficiently with uncertainties at the watershed level. Moreover, this method takes into consideration the interests of different groups, which is crucial for successful watershed management. In particular, social, economic, environmental, and resource systems are all considered in order to improve the applicability of the method. In short, the optimization method based on scenario analysis proposed here is a valuable tool for watershed management.

  10. Uncertainty propagation analysis applied to volcanic ash dispersal at Mt. Etna by using a Lagrangian model

    Science.gov (United States)

    de'Michieli Vitturi, Mattia; Pardini, Federica; Spanu, Antonio; Neri, Augusto; Vittoria Salvetti, Maria

    2015-04-01

    Volcanic ash clouds represent a major hazard for populations living near volcanic centers, producing a risk for humans and a potential threat to crops, ground infrastructures, and aviation traffic. Lagrangian particle dispersal models are commonly used for tracking ash particles emitted from volcanic plumes and transported under the action of atmospheric wind fields. In this work, we present the results of an uncertainty propagation analysis applied to volcanic ash dispersal from weak plumes with specific focus on the uncertainties related to the grain-size distribution of the mixture. To this aim, the Eulerian fully compressible mesoscale non-hydrostatic model WRF was used to generate the driving wind, representative of the atmospheric conditions occurring during the event of November 24, 2006 at Mt. Etna. Then, the Lagrangian particle model LPAC (de' Michieli Vitturi et al., JGR 2010) was used to simulate the transport of mass particles under the action of atmospheric conditions. The particle motion equations were derived by expressing the Lagrangian particle acceleration as the sum of the forces acting along its trajectory, with drag forces calculated as a function of particle diameter, density, shape and Reynolds number. The simulations were representative of weak plume events of Mt. Etna and aimed to quantify the effect on the dispersal process of the uncertainty in the particle sphericity and in the mean and variance of a log-normal distribution function describing the grain-size of ash particles released from the eruptive column. In order to analyze the sensitivity of particle dispersal to these uncertain parameters with a reasonable number of simulations, and therefore with affordable computational costs, response surfaces in the parameter space were built by using the generalized polynomial chaos technique. The uncertainty analysis made it possible to quantify the most probable values, as well as their pdf, of the number of particles as well as of the mean and

  11. Quantifying uncertainty in LCA-modelling of waste management systems.

    Science.gov (United States)

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H

    2012-12-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4) as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties. Copyright © 2012 Elsevier Ltd. All rights reserved.
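
    A compact sketch of Steps 2 and 3 of the framework described above (uncertainty propagation and contribution analysis), using a toy waste-LCA impact model; the parameter names, distributions, and coefficients are assumptions for illustration, and the variance contributions are approximated by squared correlation coefficients, which is adequate only for near-linear models.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Hypothetical waste-LCA parameters with assumed uncertainty distributions
params = {
    "CH4_collection_eff": rng.uniform(0.4, 0.7, n),        # fraction
    "electricity_factor": rng.normal(0.45, 0.05, n),       # kg CO2-eq/kWh
    "transport_distance": rng.triangular(20, 50, 120, n),  # km
}

def impact(p):
    """Toy GWP model (kg CO2-eq per tonne of waste), standing in for the LCA model."""
    return ((1 - p["CH4_collection_eff"]) * 600    # fugitive landfill gas
            - p["electricity_factor"] * 150        # credit for recovered energy
            + p["transport_distance"] * 0.1)       # collection transport

# Step 2: uncertainty propagation by Monte Carlo
y = impact(params)
print(f"GWP: mean {y.mean():.0f} kg CO2-eq/t, 2.5-97.5% range "
      f"[{np.percentile(y, 2.5):.0f}, {np.percentile(y, 97.5):.0f}]")

# Step 3: contribution of each input to the output variance (linear approximation)
for name, x in params.items():
    r = np.corrcoef(x, y)[0, 1]
    print(f"{name:>20s}: ~{100 * r ** 2:.0f}% of output variance")
```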

  12. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    Science.gov (United States)

    Esposito, Gaetano

    Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty due to the limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach for reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37-38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and
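
    A hedged sketch of the Principal Component Analysis step described above: eigen-decomposition of S^T S for a local sensitivity matrix S (targets by reactions), retention of the leading components, and ranking of reactions by their weight in those components. The matrix is synthetic; a real application would assemble S from ignition, flame-propagation, and extinction targets over many compositions, temperatures, and pressures.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical local sensitivity matrix S: rows = target/condition combinations,
# columns = reactions of the detailed mechanism
n_targets, n_reactions = 60, 784
S = rng.normal(0.0, 1e-3, size=(n_targets, n_reactions))
S[:, :40] += rng.normal(0.0, 1.0, size=(n_targets, 40))    # 40 "important" reactions

# Principal component analysis of S^T S
eigval, eigvec = np.linalg.eigh(S.T @ S)
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]              # sort descending

# Keep the components explaining 99% of the variance, then rank reactions by
# their eigenvalue-weighted loading in those components
n_pc = int(np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.99)) + 1
importance = np.sqrt((eigvec[:, :n_pc] ** 2 * eigval[:n_pc]).sum(axis=1))
keep = np.argsort(importance)[::-1][:100]                   # candidate skeletal set
print(f"{n_pc} principal components retained; {keep.size} reactions flagged")
```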

  13. Uncertainty and Fiscal Cliffs

    OpenAIRE

    Andrew Foerster; Troy Davig

    2015-01-01

    Motivated by the US Fiscal Cliff in 2012, this paper considers the short- and longer- term impact of uncertainty generated by fiscal policy. Empirical evidence shows increases in economic policy uncertainty lower investment and employment. Investment that is longer-lived and subject to a longer planning horizon responds to policy uncertainty with a lag, while capital that depreciates more quickly and can be installed with few costs falls immediately. A DSGE model incorporating uncertainty ove...

  14. Final Report: Optimal Model Complexity in Geological Carbon Sequestration: A Response Surface Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Ye [Univ. of Wyoming, Laramie, WY (United States)

    2018-01-17

    The critical component of a risk assessment study in evaluating GCS is an analysis of uncertainty in CO2 modeling. In such analyses, direct numerical simulation of CO2 flow and leakage requires many time-consuming model runs. Alternatively, analytical methods have been developed which allow fast and efficient estimation of CO2 storage and leakage, although restrictive assumptions on formation rock and fluid properties are employed. In this study, an intermediate approach is proposed based on the Design of Experiment and Response Surface methodology, which consists of using a limited number of numerical simulations to estimate a prediction outcome as a combination of the most influential uncertain site properties. The methodology can be implemented within a Monte Carlo framework to efficiently assess parameter and prediction uncertainty while honoring the accuracy of numerical simulations. The choice of the uncertain properties is flexible and can include geologic parameters that influence reservoir heterogeneity, engineering parameters that influence gas trapping and migration, and reactive parameters that influence the extent of fluid/rock reactions. The method was tested and verified on modeling long-term CO2 flow, non-isothermal heat transport, and CO2 dissolution storage by coupling two-phase flow with explicit miscibility calculation using an accurate equation of state that gives rise to convective mixing of formation brine variably saturated with CO2. All simulations were performed using three-dimensional high-resolution models including a target deep saline aquifer, overlying caprock, and a shallow aquifer. To evaluate the uncertainty in representing reservoir permeability, the sediment hierarchy of a heterogeneous digital stratigraphy was mapped to create multiple irregularly shaped stratigraphic models of decreasing geologic resolution: heterogeneous (reference), lithofacies, depositional environment, and a (homogeneous) geologic formation. To ensure model
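
    A minimal sketch of the Design of Experiment / Response Surface idea described above: a handful of placeholder simulator runs on a small factorial design are used to fit a quadratic response surface, which is then sampled cheaply in a Monte Carlo loop. The simulator function, parameter ranges, and leakage metric are hypothetical.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(9)

def simulator(perm, poro):
    """Placeholder for an expensive CO2 leakage simulation (illustrative only)."""
    return 0.02 * perm + 5.0 * poro + 0.003 * perm * poro + rng.normal(0.0, 0.01)

# Step 1: small factorial design over two assumed uncertain properties
perm_levels = [10.0, 100.0, 500.0]      # permeability [mD], assumed range
poro_levels = [0.10, 0.20, 0.30]        # porosity [-]
X, y = [], []
for perm, poro in product(perm_levels, poro_levels):
    X.append([1.0, perm, poro, perm * poro, perm ** 2, poro ** 2])
    y.append(simulator(perm, poro))
beta, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)

# Step 2: cheap Monte Carlo on the fitted quadratic response surface
perm_mc = rng.uniform(10.0, 500.0, 100_000)
poro_mc = rng.uniform(0.10, 0.30, 100_000)
Xmc = np.column_stack([np.ones_like(perm_mc), perm_mc, poro_mc,
                       perm_mc * poro_mc, perm_mc ** 2, poro_mc ** 2])
leakage = Xmc @ beta
print(f"P95 of the predicted leakage metric: {np.percentile(leakage, 95):.2f}")
```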

  15. Subjective facial analysis and its correlation with dental relationships

    Directory of Open Access Journals (Sweden)

    Gustavo Silva Siécola

    Full Text Available ABSTRACT INTRODUCTION: Subjective facial analysis is a diagnostic method that provides morphological analysis of the face. Thus, the aim of the present study was to compare the facial and dental diagnoses and investigate their relationship. METHODS: This sample consisted of 151 children (7 to 13 years old), without previous orthodontic treatment, analyzed by an orthodontist. Standardized extraoral and intraoral photographs were taken for the subjective facial classification according to Facial Pattern classification and occlusal analyses. The occurrence of different Facial Patterns was investigated, as well as the relationship between Facial Pattern classification in frontal and profile views, the relationship between Facial Patterns and Angle classification, and the relationship between anterior open bite and Long Face Pattern. RESULTS: Facial Pattern I was verified in 64.24% of the children, Pattern II in 21.29%, Pattern III in 6.62%, Long Face Pattern in 5.96% and Short Face Pattern in 1.99%. A substantial strength of agreement of approximately 84% between frontal and profile classification of Facial Pattern was observed (Kappa = 0.69). Agreement between the Angle classification and the Facial Pattern was seen in approximately 63% of the cases (Kappa = 0.27). Long Face Pattern did not present more open bite prevalence. CONCLUSION: Facial Patterns I and II were the most prevalent in children and the least prevalent was the Short Face Pattern. A significant concordance was observed between profile and frontal subjective facial analysis. There was slight concordance between the Facial Pattern and the sagittal dental relationships. The anterior open bite (AOB) was not significantly prevalent in any Facial Pattern.

  16. Subjective facial analysis and its correlation with dental relationships

    Science.gov (United States)

    Siécola, Gustavo Silva; Capelozza, Leopoldino; Lorenzoni, Diego Coelho; Janson, Guilherme; Henriques, José Fernando Castanha

    2017-01-01

    ABSTRACT INTRODUCTION: Subjective facial analysis is a diagnostic method that provides morphological analysis of the face. Thus, the aim of the present study was to compare the facial and dental diagnoses and investigate their relationship. METHODS: This sample consisted of 151 children (7 to 13 years old), without previous orthodontic treatment, analyzed by an orthodontist. Standardized extraoral and intraoral photographs were taken for the subjective facial classification according to Facial Pattern classification and occlusal analyses. The occurrence of different Facial Patterns was investigated, as well as the relationship between Facial Pattern classification in frontal and profile views, the relationship between Facial Patterns and Angle classification, and the relationship between anterior open bite and Long Face Pattern. RESULTS: Facial Pattern I was verified in 64.24% of the children, Pattern II in 21.29%, Pattern III in 6.62%, Long Face Pattern in 5.96% and Short Face Pattern in 1.99%. A substantial strength of agreement of approximately 84% between frontal and profile classification of Facial Pattern was observed (Kappa = 0.69). Agreement between the Angle classification and the Facial Pattern was seen in approximately 63% of the cases (Kappa = 0.27). Long Face Pattern did not present more open bite prevalence. CONCLUSION: Facial Patterns I and II were the most prevalent in children and the least prevalent was the Short Face Pattern. A significant concordance was observed between profile and frontal subjective facial analysis. There was slight concordance between the Facial Pattern and the sagittal dental relationships. The anterior open bite (AOB) was not significantly prevalent in any Facial Pattern. PMID:28658360

  17. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Johnson, J.D.; Rollstin, J.A. [GRAM, Inc., Albuquerque, NM (United States); Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
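
    A small sketch of the sampling-based machinery named above, assuming SciPy >= 1.7: Latin hypercube sampling of a few imprecisely known inputs, a placeholder standing in for the MACCS consequence model, and Spearman rank correlations as a simple importance measure (the study itself uses partial correlation and stepwise regression). All names, bounds, and coefficients are invented.

```python
import numpy as np
from scipy.stats import qmc, spearmanr

# Latin hypercube sample of three (hypothetical) imprecisely known inputs
sampler = qmc.LatinHypercube(d=3, seed=2024)
u = sampler.random(n=200)
lower = np.array([0.01, 1e-3, 0.1])     # assumed lower bounds
upper = np.array([0.50, 1e-1, 5.0])     # assumed upper bounds
X = qmc.scale(u, lower, upper)

# Placeholder consequence model standing in for the MACCS food-pathway dose
rng = np.random.default_rng(2024)
y = 50.0 * X[:, 0] + 300.0 * X[:, 1] + rng.normal(0.0, 1.0, 200)

# Rank (Spearman) correlations as a simple variable-importance measure
names = ["retention_fraction", "soil_to_crop_transfer", "root_zone_depletion"]
for name, col in zip(names, X.T):
    rho, _ = spearmanr(col, y)
    print(f"{name:>22s}: Spearman rho = {rho:+.2f}")
```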

  18. SENSITIVITY AND UNCERTAINTY ANALYSIS OF COMMERCIAL REACTOR CRITICALS FOR BURNUP CREDIT

    Energy Technology Data Exchange (ETDEWEB)

    Radulescu, Georgeta [ORNL; Mueller, Don [ORNL; Wagner, John C [ORNL

    2009-01-01

    The purpose of this study is to provide insights into the neutronic similarities that may exist between a generic cask containing typical spent nuclear fuel assemblies and commercial reactor critical (CRC) state-points. Forty CRC state-points from five pressurized-water reactors were selected for the study, and the types of CRC state-points that may be applicable for validation of burnup credit criticality safety calculations for spent fuel transport/storage/disposal systems are identified. The study employed cross-section sensitivity and uncertainty analysis methods developed at Oak Ridge National Laboratory and the TSUNAMI set of tools in the SCALE code system as a means to investigate system similarity on an integral and nuclide-reaction specific level. The results indicate that, except for the fresh fuel core configuration, all analyzed CRC state-points are either highly similar, similar, or marginally similar to a generic cask containing spent nuclear fuel assemblies with burnups ranging from 10 to 60 GWd/MTU. Based on the integral system parameter, C{sub k}, approximately 30 of the 40 CRC state-points are applicable to validation of burnup credit in the generic cask containing typical spent fuel assemblies with burnups ranging from 10 to 60 GWd/MTU. The state-points providing the highest similarity (C{sub k} > 0.95) were attained at or near the end of a reactor cycle. The C{sub k} values are dominated by neutron reactions with major actinides and hydrogen, as the sensitivities of these reactions are much higher than those of the minor actinides and fission products. On a nuclide-reaction specific level, the CRC state-points provide significant similarity for most of the actinides and fission products relevant to burnup credit. A comparison of energy-dependent sensitivity profiles shows a slight shift of the CRC K{sub eff} sensitivity profiles toward higher energies in the thermal region as compared to the K{sub eff} sensitivity profile of the generic cask

  19. Analysis of sagittal condyl inclination in subjects with temporomandibular disorders

    Directory of Open Access Journals (Sweden)

    Dodić Slobodan

    2010-01-01

    Full Text Available Background/Aim. Disturbances of mandibular border movements are considered to be one of the major signs of temporomandibular disorders (TMD). The purpose of this study was to evaluate the possible association between disturbances of mandibular border movements and the presence of symptoms of TMD in the young. Methods. This study included two groups of volunteers between 18 and 26 years of age. The study group included 30 examinees with signs (symptoms) of TMD, and the control group also included 30 persons without any signs (symptoms) of TMD. The presence of TMD was confirmed according to the craniomandibular index (Helkimo). The functional analysis of mandibular movements was performed in each subject using the computer pantograph. Results. The results of this study did not confirm any significant differences between the values of the condylar variables (sagittal condylar inclination, length of the sagittal condylar guidance) in the control and in the study group. Conclusion. The study did not confirm significant differences in the length and inclination of the protrusive condylar guidance, as well as in the values of the sagittal condylar inclination between the subjects with the signs and symptoms of TMD and the normal asymptomatic subjects.

  20. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty

    Directory of Open Access Journals (Sweden)

    Lash Timothy L

    2007-11-01

    Full Text Available Abstract Background The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. Methods For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. Results The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Conclusion Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a
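
    A minimal sketch of the simulation approach described above -- assign distributions to bias parameters, draw values, adjust the estimate, and repeat -- is given below. The bias model (non-differential exposure misclassification), the 2x2 counts and the parameter ranges are illustrative assumptions, not the Agricultural Health Study data.

    import numpy as np

    rng = np.random.default_rng(1)

    # Observed 2x2 table (hypothetical counts): exposed/unexposed cases and non-cases
    a, b = 40, 160
    c, d = 200, 1600
    adjusted = []

    for _ in range(20000):
        # Draw bias parameters: classification sensitivity and specificity
        se = rng.uniform(0.75, 0.95)
        sp = rng.uniform(0.90, 0.99)
        denom = se + sp - 1.0
        # Back-calculate "true" exposed counts among cases and non-cases
        A = (a - (1.0 - sp) * (a + b)) / denom
        C = (c - (1.0 - sp) * (c + d)) / denom
        B, D = (a + b) - A, (c + d) - C
        if min(A, B, C, D) <= 0:            # discard impossible corrections
            continue
        adjusted.append((A / B) / (C / D))  # bias-adjusted odds ratio

    lo, med, hi = np.percentile(adjusted, [2.5, 50, 97.5])
    print(f"adjusted OR = {med:.2f} (95% simulation interval {lo:.2f}-{hi:.2f})")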

  1. Response surface methodology and improved interval analysis method--for analyzing uncertainty in accident reconstruction.

    Science.gov (United States)

    Zou, Tiefang; Cai, Ming; Shu, Xiong

    2012-10-10

    Methods used to calculate intervals of accident reconstruction results are a research hotspot worldwide. The response surface methodology-interval analysis method (RSM-IAM) is a useful method for analyzing uncertainty of simulation results in this field, but it suffers from two problems: the interval extension problem and inaccurate response surface models. In order to tackle these two problems, based on subinterval analysis and response surface methodology, an improved interval analysis method (RSM-IIAM) is proposed. In RSM-IIAM, the stepwise regression technique is used to obtain a reasonable response surface model of the simulation model; then, intervals of uncertain parameters are divided into several subintervals; after that, intervals of simulation results in accident reconstruction are calculated from these subintervals. Finally, four numerical cases were given. Results showed that RSM-IIAM is simple and highly accurate, which will be useful in analyzing uncertainty of simulation results in accident reconstruction. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
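
    The sketch below illustrates the two ingredients named above under simplifying assumptions: a quadratic response surface is fitted to a toy reconstruction model, and the intervals of the uncertain inputs are split into subintervals whose output ranges are combined (bounded here by dense sampling inside each subinterval rather than by formal interval arithmetic). The model, parameter names and ranges are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "simulation model": impact speed from friction coefficient and skid distance
    def simulation(mu, d):
        return np.sqrt(2.0 * 9.81 * mu * d)

    # 1) Fit a quadratic response surface from a small design of experiments
    X = rng.uniform([0.5, 20.0], [0.9, 40.0], size=(30, 2))
    y = simulation(X[:, 0], X[:, 1])
    def features(mu, d):
        return np.column_stack([np.ones_like(mu), mu, d, mu * d, mu**2, d**2])
    beta, *_ = np.linalg.lstsq(features(X[:, 0], X[:, 1]), y, rcond=None)
    surface = lambda mu, d: features(mu, d) @ beta

    # 2) Subinterval analysis: split each input interval and bound the surface output
    def subinterval_bounds(mu_int, d_int, n_sub=4, n_grid=11):
        lo, hi = np.inf, -np.inf
        mu_edges = np.linspace(mu_int[0], mu_int[1], n_sub + 1)
        d_edges = np.linspace(d_int[0], d_int[1], n_sub + 1)
        for i in range(n_sub):
            for j in range(n_sub):
                MU, D = np.meshgrid(np.linspace(mu_edges[i], mu_edges[i + 1], n_grid),
                                    np.linspace(d_edges[j], d_edges[j + 1], n_grid))
                vals = surface(MU.ravel(), D.ravel())
                lo, hi = min(lo, vals.min()), max(hi, vals.max())
        return lo, hi

    print("speed interval [m/s]:", subinterval_bounds((0.6, 0.8), (25.0, 35.0)))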

  2. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    Science.gov (United States)

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by fluid dynamicists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties, and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically using inputs and the Student-t distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-t distribution can encompass the exact solution.
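
    A hedged sketch of a Student-t interval for a CFD quantity of interest, in the spirit of the approach described above, is shown below; the sample values are invented, and the treatment of input uncertainty is simplified to a handful of repeated runs with perturbed inputs.

    import numpy as np
    from scipy import stats

    # Repeated CFD estimates of a quantity of interest (e.g. pressure drop in Pa),
    # obtained by perturbing the uncertain inputs -- values are illustrative only
    samples = np.array([112.4, 110.9, 113.1, 111.7, 112.8, 111.2, 112.0])

    n = samples.size
    mean = samples.mean()
    s = samples.std(ddof=1)                 # sample standard deviation
    t = stats.t.ppf(0.975, df=n - 1)        # two-sided 95% coverage factor
    half_width = t * s / np.sqrt(n)
    print(f"QoI = {mean:.1f} +/- {half_width:.1f} (95%, Student-t, n={n})")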

  3. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    Science.gov (United States)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and a quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty calculated from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation

  4. How to assess the Efficiency and "Uncertainty" of Global Sensitivity Analysis?

    Science.gov (United States)

    Haghnegahdar, Amin; Razavi, Saman

    2016-04-01

    Sensitivity analysis (SA) is an important paradigm for understanding model behavior, characterizing uncertainty, improving model calibration, etc. Conventional "global" SA (GSA) approaches are rooted in different philosophies, resulting in different and sometimes conflicting and/or counter-intuitive assessments of sensitivity. Moreover, most global sensitivity techniques are highly computationally demanding when generating robust and stable sensitivity metrics over the entire model response surface. Accordingly, a novel sensitivity analysis method called Variogram Analysis of Response Surfaces (VARS) is introduced to overcome the aforementioned issues. VARS uses the variogram concept to efficiently provide a comprehensive assessment of global sensitivity across a range of scales within the parameter space. Based on the VARS principles, in this study we present innovative ideas to assess (1) the efficiency of GSA algorithms and (2) the level of confidence we can assign to a sensitivity assessment. We use multiple hydrological models with different levels of complexity to explain the new ideas.

  5. Evaluating the use of uncertainty visualization for exploratory analysis of land cover change: A qualitative expert user study

    Science.gov (United States)

    Kinkeldey, Christoph; Schiewe, Jochen; Gerstmann, Henning; Götze, Christian; Kit, Oleksandr; Lüdeke, Matthias; Taubenböck, Hannes; Wurm, Michael

    2015-11-01

    Extensive research on geodata uncertainty has been conducted in the past decades, mostly related to modeling, quantifying, and communicating uncertainty. But findings on whether and how users can incorporate this information into spatial analyses are still rare. In this paper we address these questions with a focus on land cover change analysis. We conducted semi-structured interviews with three expert groups dealing with change analysis in the fields of climate research, urban development, and vegetation monitoring. During the interviews we used a software prototype to show change scenarios that the experts had analyzed before, extended by visual depiction of uncertainty related to land cover change. This paper describes the study, summarizes results, and discusses findings as well as the study method. Participants came up with several ideas for applications that could be supported by uncertainty, for example, identification of erroneous change, description of change detection algorithm characteristics, or optimization of change detection parameters. Regarding the aspect of reasoning with uncertainty in land cover change data, the interviewees saw potential in better-informed hypotheses and insights about change. Communication of uncertainty information to users was seen as critical, depending on the users' role and expertise. We judge semi-structured interviews to be suitable for the purpose of this study and emphasize the potential of qualitative methods (workshops, focus groups etc.) for future uncertainty visualization studies.

  6. Analysis of uncertainty and incertitude in the risk assessment when development and implementation of integrated management systems

    Directory of Open Access Journals (Sweden)

    Morteza Rajab Zadeh

    2015-03-01

    Full Text Available This paper investigates the problems associated with uncertainty and incertitude in risk assessment, which can arise when developing and implementing an integrated management system. An analysis of current research on the use of the concepts of "uncertainty" and "incertitude" in risk assessment is carried out. A classification of the types of uncertainty and incertitude in the risk assessment of an integrated management system is developed. Recommendations are given on methods to overcome the problems associated with the uncertainties and incertitudes that occur in the development and implementation of an integrated management system; this becomes possible through a more accurate risk assessment.

  7. Voxel-based statistical analysis of uncertainties associated with deformable image registration.

    Science.gov (United States)

    Li, Shunshan; Glide-Hurst, Carri; Lu, Mei; Kim, Jinkoo; Wen, Ning; Adams, Jeffrey N; Gordon, James; Chetty, Indrin J; Zhong, Hualiang

    2013-09-21

    Deformable image registration (DIR) algorithms have inherent uncertainties in their displacement vector fields (DVFs). The purpose of this study is to develop an optimal metric to estimate DIR uncertainties. Six computational phantoms have been developed from the CT images of lung cancer patients using a finite element method (FEM). The FEM-generated DVFs were used as a standard for registrations performed on each of these phantoms. A mechanics-based metric, unbalanced energy (UE), was developed to evaluate these registration DVFs. The potential correlation between UE and DIR errors was explored using multivariate analysis, and the results were validated by a landmark approach and compared with two other error metrics: DVF inverse consistency (IC) and image intensity difference (ID). Landmark-based validation was performed using the POPI model. The results show that the Pearson correlation coefficient between UE and DIR error is r_UE-error = 0.50. This is higher than r_IC-error = 0.29 for IC and DIR error and r_ID-error = 0.37 for ID and DIR error. The Pearson correlation coefficient between UE and the product of the DIR displacements and errors is r_UE-error×DVF = 0.62 for the six patients and r_UE-error×DVF = 0.73 for the POPI-model data. It has been demonstrated that UE has a strong correlation with DIR errors, and the UE metric outperforms the IC and ID metrics in estimating DIR uncertainties. The quantified UE metric can be a useful tool for adaptive treatment strategies, including probability-based adaptive treatment planning.

  8. Uncertainty analysis in 3D global models: Aerosol representation in MOZART-4

    Science.gov (United States)

    Gasore, J.; Prinn, R. G.

    2012-12-01

    The Probabilistic Collocation Method (PCM) has been proven to be an efficient general method of uncertainty analysis in atmospheric models (Tatang et al. 1997, Cohen & Prinn 2011). However, its application has been mainly limited to urban- and regional-scale models and chemical source-sink models, because of the drastic increase in computational cost as the number of uncertain parameters increases. Moreover, the high-dimensional output of global models has to be reduced to allow a computationally reasonable number of polynomials to be generated. This dimensional reduction has been mainly achieved by grouping the model grids into a few regions based on prior knowledge and expectations, urban versus rural for instance. As the model output is used to estimate the coefficients of the polynomial chaos expansion (PCE), the arbitrariness in the regional aggregation can generate problems in estimating uncertainties. To address these issues in a complex model, we apply the probabilistic collocation method of uncertainty analysis to the aerosol representation in MOZART-4, which is a 3D global chemical transport model (Emmons et al., 2010). Thereafter, we deterministically delineate the model output surface into regions of homogeneous response using the method of Principal Component Analysis. This allows the quantification of the uncertainty associated with the dimensional reduction. Because only a bulk mass is calculated online in MOZART-4, a lognormal number distribution is assumed with a priori fixed scale and location parameters, to calculate the surface area for heterogeneous reactions involving tropospheric oxidants. We have applied the PCM to the six parameters of the lognormal number distributions of Black Carbon, Organic Carbon and Sulfate. We have carried out a Monte-Carlo sampling from the probability density functions of the six uncertain parameters, using the reduced PCE model. The global mean concentration of major tropospheric oxidants did not show a

  9. Synthesis and Design of Biorefinery Processing Networks with Uncertainty and Sustainability analysis

    DEFF Research Database (Denmark)

    Cheali, Peam; Gernaey, Krist; Sin, Gürkan

    Chemical industries usually rely on fossil based feedstock, which is a limited resource. In view of increasing energy demands and the negative environmental and climate effects related to the use of fossil based fuels, this motivates the development of new and more sustainable technologies...... solution obtained after the MINLP by using an in-house software (SustainPRO) that employs ICHEME sustainability metrics. Secondly, the sustainability analysis was included proactively as part of the MINLP optimization problem that is performed to find the optimal processing path with respect to multi-criteria...... assessment including technical, economics and sustainability. The expanded database and superstructure with uncertainty and sustainability analysis form a powerful process synthesis toolbox to be used in design of future biorefineries with multi-criteria evaluation (technical and economic feasibility...

  10. Measurements and their uncertainties a practical guide to modern error analysis

    CERN Document Server

    Hughes, Ifan G

    2010-01-01

    This hands-on guide is primarily intended to be used in undergraduate laboratories in the physical sciences and engineering. It assumes no prior knowledge of statistics. It introduces the necessary concepts where needed, with key points illustrated with worked examples and graphic illustrations. In contrast to traditional mathematical treatments it uses a combination of spreadsheet and calculus-based approaches, suitable as a quick and easy on-the-spot reference. The emphasis throughout is on practical strategies to be adopted in the laboratory. Error analysis is introduced at a level accessible to school leavers, and carried through to research level. Error calculation and propagation is presented through a series of rules-of-thumb, look-up tables and approaches amenable to computer analysis. The general approach uses the chi-square statistic extensively. Particular attention is given to hypothesis testing and extraction of parameters and their uncertainties by fitting mathematical models to experimental data....

  11. Uncertainty analysis for parameters of CFAST in the main control room fire scenario

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Wanhong; Guo, Yun; Peng, Changhong [Univ. of Science and Technology of China No. 96, Anhui (China). School of Nuclear Science and Technology

    2017-07-15

    The fire accident is one of the important initiating events in a nuclear power plant. Moreover, the fire development process is extremely difficult and complex to predict accurately. As a result, internal fire accidents have become one of the most realistic threats to the safety of nuclear power plants. The main control room contains all the control and monitoring equipment that operators need. Once it is on fire, the hostile environment would greatly affect the safety of operator actions. Therefore, fire probabilistic safety analysis of the main control room has become a significant task. By using CFAST together with Monte Carlo sampling as a tool for fire modeling to simulate a main control room fire, we can perform uncertainty analysis for the important parameters of CFAST.

  12. Global sensitivity analysis and uncertainties in SEA models of vibroacoustic systems

    Science.gov (United States)

    Christen, Jean-Loup; Ichchou, Mohamed; Troclet, Bernard; Bareille, Olivier; Ouisse, Morvan

    2017-06-01

    The effect of parametric uncertainties on the dispersion of Statistical Energy Analysis (SEA) models of structural-acoustic coupled systems is studied with the Fourier amplitude sensitivity test (FAST) method. The method is first applied to an academic example representing a transmission suite, then to a more complex industrial structure from the space industry. Two sets of parameters are considered, namely errors on the SEA model's coefficients, or directly the engineering parameters. The first case is an intrusive approach, but it enables identification of the dominant phenomena taking place in a given configuration. The second is non-intrusive and appeals more to engineering considerations, studying the effect of input parameters such as geometry or material characteristics on the SEA outputs. A study of the distribution of results in each frequency band with the same sampling shows some interesting features, such as bimodal distributions in some ranges.

  13. Data Quality Assessment of the Uncertainty Analysis Applied to the Greenhouse Gas Emissions of a Dairy Cow System

    Directory of Open Access Journals (Sweden)

    Chun-Youl Baek

    2017-09-01

    Full Text Available The results of an uncertainty analysis are determined by the statistical information (standard error, type of probability distribution, and minimum-maximum range) of the selected input parameters. However, there are limitations in identifying sufficient data samples for the selected input parameters to derive this statistical information in the field of life cycle assessment (LCA). Therefore, there is a strong need for a consistent screening procedure to identify the input parameters for use in uncertainty analysis in the area of LCA. The conventional procedure for identifying input parameters for the uncertainty analysis method includes assessing the data quality using the pedigree method and the contribution analysis of the LCA results. This paper proposes a simplified procedure for improving the existing data quality assessment method, which can lead to an efficient uncertainty analysis of LCA results. The proposed method has two salient features: (i) a simplified procedure based on contribution analysis followed by a data quality assessment for selecting the input parameters for the uncertainty analysis; and (ii) a quantitative data quality assessment method, based on the pedigree method, that adopts the analytic hierarchy process (AHP) method and quality function deployment (QFD). The effects of the uncertainty of the selected input parameters on the LCA results were assessed using the Monte Carlo simulation method. A case study of greenhouse gas (GHG) emissions from a dairy cow system was used to demonstrate the applicability of the proposed procedure.

  14. The subjective meaning of cognitive architecture: a Marrian analysis.

    Science.gov (United States)

    Varma, Sashank

    2014-01-01

    Marr famously decomposed cognitive theories into three levels. Newell, Pylyshyn, and Anderson offered parallel decompositions of cognitive architectures, which are psychologically plausible computational formalisms for expressing computational models of cognition. These analyses focused on the objective meaning of each level - how it supports computational models that correspond to cognitive phenomena. This paper develops a complementary analysis of the subjective meaning of each level - how it helps cognitive scientists understand cognition. It then argues against calls to eliminatively reduce higher levels to lower levels, for example, in the name of parsimony. Finally, it argues that the failure to attend to the multiple meanings and levels of cognitive architecture contributes to the current, disunified state of theoretical cognitive science.

  15. Bayesian analysis of factors associated with fibromyalgia syndrome subjects

    Science.gov (United States)

    Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie

    2015-01-01

    Factors contributing to movement-related fear were assessed by Russek et al. (2014) for subjects with fibromyalgia (FM), based on data collected by a national internet survey of community-based individuals. The study focused on the variables Activities-Specific Balance Confidence scale (ABC), Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), pain, work status, and physical activity derived from the "Revised Fibromyalgia Impact Questionnaire" (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors were introduced for the variables selected in Russek's paper.

  16. Rigorous evaluation of chemical measurement uncertainty: liquid chromatographic analysis methods using detector response factor calibration

    Science.gov (United States)

    Toman, Blaza; Nelson, Michael A.; Bedner, Mary

    2017-06-01

    Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random effects meta analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).

  17. Rigorous evaluation of chemical measurement uncertainty: Liquid chromatographic analysis methods using detector response factor calibration.

    Science.gov (United States)

    Toman, Blaza; Nelson, Michael A; Bedner, Mary

    2017-06-01

    Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random effects meta analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).
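
    A minimal sketch of a GUM Supplement 1 style Monte Carlo propagation, applied to a single-point response-factor calibration equation, is shown below; the peak areas, calibrant concentration and their uncertainties are illustrative assumptions rather than the LC-UV or LC-IDMS data, and the full Bayesian hierarchical treatment is not attempted.

    import numpy as np

    rng = np.random.default_rng(42)
    M = 200_000

    # Assumed input quantities (mean, standard uncertainty) -- illustrative only
    A_sample = rng.normal(1.52e5, 1.2e3, M)   # peak area of the sample
    A_std    = rng.normal(1.48e5, 1.1e3, M)   # peak area of the calibrant
    c_std    = rng.normal(10.05, 0.04, M)     # calibrant concentration, ug/mL

    # Measurement equation: single-point response-factor calibration
    c_sample = (A_sample / A_std) * c_std

    u = c_sample.std(ddof=1)                       # standard uncertainty
    lo, hi = np.percentile(c_sample, [2.5, 97.5])  # 95% coverage interval
    print(f"c = {c_sample.mean():.3f} ug/mL, u = {u:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")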

  18. TRITIUM UNCERTAINTY ANALYSIS FOR SURFACE WATER SAMPLES AT THE SAVANNAH RIVER SITE

    Energy Technology Data Exchange (ETDEWEB)

    Atkinson, R.

    2012-07-31

    Radiochemical analyses of surface water samples, in the framework of Environmental Monitoring, have associated uncertainties for the radioisotopic results reported. These uncertainty analyses pertain to the tritium results from surface water samples collected at five locations on the Savannah River near the U.S. Department of Energy's Savannah River Site (SRS). Uncertainties can result from the field-sampling routine, can be incurred during transport due to the physical properties of the sample, from equipment limitations, and from the measurement instrumentation used. The uncertainty reported by the SRS in their Annual Site Environmental Report currently considers only the counting uncertainty in the measurements, which is the standard reporting protocol for radioanalytical chemistry results. The focus of this work is to provide an overview of all uncertainty components associated with SRS tritium measurements, estimate the total uncertainty according to ISO 17025, and to propose additional experiments to verify some of the estimated uncertainties. The main uncertainty components discovered and investigated in this paper are tritium absorption or desorption in the sample container, the HTO/H2O isotopic effect during distillation, pipette volume, and tritium standard uncertainty. The goal is to quantify these uncertainties and to establish a combined uncertainty in order to increase the scientific depth of the SRS Annual Site Environmental Report.
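
    A minimal sketch of combining such uncertainty components into a combined and expanded uncertainty (root-sum-of-squares of relative components, coverage factor k = 2), in the spirit of ISO 17025 reporting, is given below; the component values are illustrative assumptions, not SRS results.

    import numpy as np

    # Relative standard uncertainty components for a tritium result (illustrative values)
    components = {
        "counting statistics":           0.045,
        "tritium standard":              0.015,
        "pipette volume":                0.005,
        "distillation isotopic effect":  0.010,
        "container sorption/desorption": 0.008,
    }

    u_combined = np.sqrt(sum(u**2 for u in components.values()))
    U_expanded = 2.0 * u_combined            # coverage factor k = 2 (~95%)
    print(f"combined relative standard uncertainty u_c = {u_combined:.3f}")
    print(f"expanded uncertainty (k = 2)            U  = {U_expanded:.3f}")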

  19. Uncertainty analysis in raw material and utility cost of biorefinery synthesis and design

    DEFF Research Database (Denmark)

    Cheali, Peam; Quaglia, Alberto; Gernaey, Krist

    2014-01-01

    are characterized by considerable uncertainty. These uncertainties might have significant impact on the results of the design problem, and therefore need to be carefully evaluated and managed, in order to generate candidates for robust design. In this contribution, we study the effect of data uncertainty (raw...... material price and utility cost) on the design of a biorefinery process network....

  20. Spatial Uncertainty Analysis for LVIS and UAV-SAR Attribute Fusion

    Science.gov (United States)

    Chakravarty, S.; Franks, S.

    2011-12-01

    Due to the medium to low resolution of the above sensors, fusion analysis on the extracted attributes is mostly plagued with uncertainties. In this study the extracted information from the two modalities is treated using spatial uncertainty analysis. Statistical set-theoretic analysis as well as a simulation-based approach using the error propagation law are tried. The results of the uncertainty analysis can be used as a performance metric or as feedback for the respective attribute extraction algorithms. (1) http://lvis.gsfc.nasa.gov/index.php (2) http://uavsar.jpl.nasa.gov/

  1. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy.

    Science.gov (United States)

    Frey, K; Unholtz, D; Bauer, J; Debus, J; Min, C H; Bortfeld, T; Paganetti, H; Parodi, K

    2014-10-07

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β(+)-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles and under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of uncertainty associated to the identified shift, which is then used as weighting factor to 'red flag' problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in a

  2. A review and classification of approaches for dealing with uncertainty in multi-criteria decision analysis for healthcare decisions.

    Science.gov (United States)

    Broekhuizen, Henk; Groothuis-Oudshoorn, Catharina G M; van Til, Janine A; Hummel, J Marjan; IJzerman, Maarten J

    2015-05-01

    Multi-criteria decision analysis (MCDA) is increasingly used to support decisions in healthcare involving multiple and conflicting criteria. Although uncertainty is usually carefully addressed in health economic evaluations, whether and how the different sources of uncertainty are dealt with and with what methods in MCDA is less known. The objective of this study is to review how uncertainty can be explicitly taken into account in MCDA and to discuss which approach may be appropriate for healthcare decision makers. A literature review was conducted in the Scopus and PubMed databases. Two reviewers independently categorized studies according to research areas, the type of MCDA used, and the approach used to quantify uncertainty. Selected full text articles were read for methodological details. The search strategy identified 569 studies. The five approaches most identified were fuzzy set theory (45% of studies), probabilistic sensitivity analysis (15%), deterministic sensitivity analysis (31%), Bayesian framework (6%), and grey theory (3%). A large number of papers considered the analytic hierarchy process in combination with fuzzy set theory (31%). Only 3% of studies were published in healthcare-related journals. In conclusion, our review identified five different approaches to take uncertainty into account in MCDA. The deterministic approach is most likely sufficient for most healthcare policy decisions because of its low complexity and straightforward implementation. However, more complex approaches may be needed when multiple sources of uncertainty must be considered simultaneously.

  3. Accuracy and Uncertainty Analysis of PSBT Benchmark Exercises Using a Subchannel Code MATRA

    Directory of Open Access Journals (Sweden)

    Dae-Hyun Hwang

    2012-01-01

    Full Text Available In the framework of the OECD/NRC PSBT benchmark, the subchannel grade void distribution data and DNB data were assessed by a subchannel code, MATRA. The prediction accuracy and uncertainty of the zone-averaged void fraction at the central region of the 5 × 5 test bundle were evaluated for the steady-state and transient benchmark data. Optimum values of the turbulent mixing parameter were evaluated for the subchannel exit temperature distribution benchmark. The influence of the mixing vanes on the subchannel flow distribution was investigated through a CFD analysis. In addition, a regionwise turbulent mixing model was examined to account for the nonhomogeneous mixing characteristics caused by the vane effect. The steady-state DNB benchmark data with uniform and nonuniform axial power shapes were evaluated by employing various DNB prediction models: EPRI bundle CHF correlation, AECL-IPPE 1995 CHF lookup table, and representative mechanistic DNB models such as a sublayer dryout model and a bubble crowding model. The DNBR prediction uncertainties for various DNB models were evaluated from a Monte-Carlo simulation for a selected steady-state condition.

  4. Strain Gauge Balance Uncertainty Analysis at NASA Langley: A Technical Review

    Science.gov (United States)

    Tripp, John S.

    1999-01-01

    This paper describes a method to determine the uncertainties of measured forces and moments from multi-component force balances used in wind tunnel tests. A multivariate regression technique is first employed to estimate the uncertainties of the six balance sensitivities and 156 interaction coefficients derived from established balance calibration procedures. These uncertainties are then employed to calculate the uncertainties of force-moment values computed from observed balance output readings obtained during tests. Confidence and prediction intervals are obtained for each computed force and moment as functions of the actual measurands. Techniques are discussed for separate estimation of balance bias and precision uncertainties.
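
    As a hedged, one-dimensional stand-in for the multivariate balance regression described above, the sketch below fits a straight-line calibration by least squares and derives a coefficient uncertainty and a 95% prediction interval; the loads, outputs and the single-component simplification are assumptions, not the Langley calibration procedure.

    import numpy as np
    from scipy import stats

    # Calibration data: applied load (N) versus bridge output (mV/V) -- illustrative
    load   = np.array([0.0, 50.0, 100.0, 150.0, 200.0, 250.0, 300.0])
    output = np.array([0.002, 0.251, 0.499, 0.752, 1.001, 1.248, 1.502])

    X = np.column_stack([np.ones_like(load), load])   # design matrix [1, load]
    beta, *_ = np.linalg.lstsq(X, output, rcond=None)
    n, p = X.shape
    resid = output - X @ beta
    s2 = resid @ resid / (n - p)                      # residual variance
    xtx_inv = np.linalg.inv(X.T @ X)
    cov_beta = s2 * xtx_inv                           # coefficient covariance

    # 95% prediction interval for the output at a new applied load
    x_new = np.array([1.0, 175.0])
    y_hat = x_new @ beta
    se_pred = np.sqrt(s2 * (1.0 + x_new @ xtx_inv @ x_new))
    t = stats.t.ppf(0.975, df=n - p)
    print(f"sensitivity = {beta[1]:.5f} +/- {t * np.sqrt(cov_beta[1, 1]):.5f} (mV/V)/N")
    print(f"predicted output at 175 N: {y_hat:.4f} +/- {t * se_pred:.4f} mV/V")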

  5. Uncertainty analysis of the CPA and a quadrupolar CPA equation of state - With emphasis on CO2

    DEFF Research Database (Denmark)

    Bjørner, Martin G.; Sin, Gürkan; Kontogeorgis, Georgios M.

    2016-01-01

    The parameters of thermodynamic models, such as the cubic plus association (CPA) equation of state, are subject to uncertainties due to measurement errors in the experimental data that the models are correlated to. More importantly as the number of adjustable parameters increase, the parameter...... of correlation between the adjustable parameters. This results in significant propagated errors for certain output properties. To reduce the uncertainty in the adjustable model parameters the heat of vaporization was included as additional correlation data. This resulted in parameter distributions which followed...

  6. Descriptive Analysis of Single Subject Research Designs: 1983-2007

    Science.gov (United States)

    Hammond, Diana; Gast, David L.

    2010-01-01

    Single subject research methodology is commonly used and cited in special education courses and journals. This article reviews the types of single subject research designs published in eight refereed journals between 1983 and 2007 used to answer applied research questions. Single subject designs were categorized as withdrawal/reversal, time…

  7. Gait analysis in demented subjects: Interests and perspectives

    Directory of Open Access Journals (Sweden)

    Olivier Beauchet

    2008-03-01

    Full Text Available Olivier Beauchet1, Gilles Allali2, Gilles Berrut3, Caroline Hommet4, Véronique Dubost5, Frédéric Assal2; 1Department of Geriatrics, Angers University Hospital, France; 2Department of Neurology, Geneva University Hospital, France; 3Department of Geriatrics, Nantes University Hospital, France; 4Department of Internal Medicine and Geriatrics, Tours University Hospital, France; 5Department of Geriatrics, Dijon University Hospital, France. Abstract: Gait disorders are more prevalent in dementia than in normal aging and are related to the severity of cognitive decline. Dementia-related gait changes (DRGC) mainly include a decrease in walking speed provoked by a decrease in stride length and an increase in support phase. More recently, dual-task related changes in gait were found in Alzheimer's disease (AD) and non-Alzheimer dementia, even at an early stage. An increase in stride-to-stride variability during usual walking and dual-tasking has been shown to be more specific and sensitive than any change in mean values in subjects with dementia. These data show that DRGC are not only associated with motor disorders but also with problems in central processing of information, and highlight that dysfunction of the temporal and frontal lobes may in part explain gait impairment among demented subjects. Gait assessment, and more particularly dual-task analysis, is therefore crucial in early diagnosis of dementia and/or related syndromes in the elderly. Moreover, dual-task disturbances could be a specific marker of falling at a pre-dementia stage. Keywords: gait, prediction of dementia, risk of falling, older adult

  8. Powder stickiness in milk drying: uncertainty and sensitivity analysis for process understanding

    DEFF Research Database (Denmark)

    Ferrari, Adrián; Gutiérrez, Soledad; Sin, Gürkan

    2017-01-01

    A powder stickiness model based on the glass transition temperature (Gordon-Taylor equations) was built for a production scale milk drying process (including a spray chamber, and internal/external fluid beds). To help process understanding, the model was subjected to sensitivity analysis (SA) o

  9. Modeling a production scale milk drying process: parameter estimation, uncertainty and sensitivity analysis

    DEFF Research Database (Denmark)

    Ferrari, A.; Gutierrez, S.; Sin, Gürkan

    2016-01-01

    A steady state model for a production scale milk drying process was built to help process understanding and optimization studies. It involves a spray chamber and also internal/external fluid beds. The model was subjected to a comprehensive statistical analysis for quality assurance using sensitiv...

  10. Plant application uncertainty evaluation of LBLOCA analysis using RELAP5/MOD3/KAERI

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Yong; Chung, Bub Dong; Hwang, Tae Suk; Lee, Guy Hyung; Chang, Byung Hoon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-06-01

    A practical realistic evaluation methodology to evaluate the ECCS performance that satisfies the requirements of the revised ECCS rule has been developed, and this report describes the application of the new REM to a large break LOCA. The computer code RELAP5/MOD3/KAERI, which was improved from RELAP5/MOD3.1, was used as the best-estimate code for the analysis, and Kori units 3 and 4 were selected as the reference plant. Response surfaces for blowdown and reflood PCTs were generated from the results of the sensitivity analyses, and probability distribution functions were established by using a Monte-Carlo sampler for each response surface. This study shows that plant application uncertainty can be quantified and demonstrates the applicability of the new realistic evaluation methodology. (Author) 29 refs., 40 figs., 8 tabs.

  11. Stochastic Risk and Uncertainty Analysis for Shale Gas Extraction in the Karoo Basin of South Africa

    Directory of Open Access Journals (Sweden)

    Abdon Atangana

    2014-01-01

    Full Text Available We made use of groundwater flow and mass transport equations to investigate the crucial potential risk of water pollution from hydraulic fracturing, especially in the case of the Karoo system in South Africa. This paper shows that the upward migration of fluids will depend on the apertures of the cement cracks and fractures in the rock formation. The greater the apertures, the quicker the movement of the fluid. We presented a novel sampling method, which is a combination of Monte Carlo and Latin hypercube sampling. The method was used for uncertainty analysis of the apertures in the groundwater flow and mass transport equations. The study reveals that, in the case of the Karoo, fracking will be successful if and only if the upward methane and fracking-fluid migration can be controlled, for example, by plugging the entire fracked reservoir with cement.
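
    A minimal sketch of a Latin hypercube design combined with Monte Carlo style propagation is given below; the two inputs (fracture aperture and hydraulic gradient), their ranges, and the cubic-law discharge relation are illustrative assumptions, not the Karoo flow and transport model.

    import numpy as np

    rng = np.random.default_rng(7)

    def latin_hypercube(n_samples, n_dims):
        # One random point in each of n_samples equal-probability strata per dimension
        u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
        for j in range(n_dims):
            u[:, j] = rng.permutation(u[:, j])   # decouple strata orderings across dimensions
        return u                                  # uniform(0, 1) LHS design

    design = latin_hypercube(1000, 2)

    # Map the unit design to physical ranges (illustrative): aperture [m] and gradient [-]
    aperture = 1e-4 + design[:, 0] * (1e-3 - 1e-4)
    gradient = 0.01 + design[:, 1] * (0.10 - 0.01)

    # Cubic-law specific discharge per unit width: q = (rho*g / (12*mu)) * a**3 * i
    rho, g, mu = 1000.0, 9.81, 1.0e-3
    q = rho * g / (12.0 * mu) * aperture**3 * gradient
    print(f"median q = {np.median(q):.2e} m^2/s, 95th percentile = {np.percentile(q, 95):.2e} m^2/s")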

  12. Method validation and uncertainty evaluation of organically bound tritium analysis in environmental sample.

    Science.gov (United States)

    Huang, Yan-Jun; Zeng, Fan; Zhang, Bing; Chen, Chao-Feng; Qin, Hong-Juan; Wu, Lian-Sheng; Guo, Gui-Yin; Yang, Li-Tao; Shang-Guan, Zhi-Hong

    2014-08-01

    The analytical method for organically bound tritium (OBT) was developed in our laboratory. The optimized operating conditions and parameters were established for sample drying, special combustion, distillation, and measurement on a liquid scintillation spectrometer (LSC). Selected types of OBT samples such as rice, corn, rapeseed, fresh lettuce and pork were analyzed for method validation of recovery rate reproducibility and the minimum detection concentration, and the uncertainty for a typical low-level environmental sample was evaluated. The combustion water recovery rate of the different dried environmental samples was kept at about 80%, and the minimum detection concentration of OBT ranged from 0.61 to 0.89 Bq/kg (dry weight), depending on the hydrogen content. It showed that this method is suitable for OBT analysis of environmental samples with a stable recovery rate, and that the combustion water yield of a sample weighing about 40 g would provide a sufficient quantity for measurement on the LSC. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Greenhouse gases from membrane bioreactors: Mathematical modelling, sensitivity and uncertainty analysis.

    Science.gov (United States)

    Mannina, Giorgio; Cosenza, Alida; Ekama, George A

    2017-09-01

    In this study a new mathematical model to quantify greenhouse gas emissions (namely, carbon dioxide and nitrous oxide) from membrane bioreactors (MBRs) is presented. The model has been adopted to predict the key processes of a pilot plant with a pre-denitrification MBR scheme, fed with domestic and saline wastewater. The model was calibrated by adopting an advanced protocol based on an extensive dataset. In terms of nitrous oxide, the results show that an important role is played by the half-saturation coefficients related to nitrogen removal processes and by the model factors affecting the oxygen transfer rate in the aerobic and MBR tanks. Uncertainty analysis showed that for the gaseous model outputs 88-93% of the measured data lies inside the confidence bands, showing an accurate model prediction. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Global sensitivity analysis in wastewater treatment plant model applications: Prioritizing sources of uncertainty

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements...... > 0.9) for effluent concentrations, sludge production and energy demand. This high extent of linearity means that the plant performance criteria can be described as linear functions of the model inputs under the defined plant conditions. In effect, the system of coupled ordinary differential equations...... in predicting sludge production and effluent ammonium concentration. While these results were in agreement with process knowledge, the added value is that the global sensitivity methods can quantify the contribution of the variance of significant parameters, e.g., ash content explains 70% of the variance...
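
    A hedged sketch of one common linear, variance-based sensitivity measure used in such Monte Carlo studies, the standardized regression coefficient (SRC), is shown below; the inputs, the toy plant-performance model and its coefficients are invented and only illustrate how variance contributions and the degree of linearity (R^2) are quantified.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 2000

    # Monte Carlo samples of three uncertain inputs (illustrative stand-ins for
    # e.g. ash content, yield and decay-rate parameters)
    X = rng.normal(loc=[0.3, 0.6, 0.2], scale=[0.03, 0.05, 0.02], size=(n, 3))

    # Toy plant-performance output (e.g. sludge production, kg/d)
    y = 1500 * X[:, 0] + 400 * X[:, 1] - 250 * X[:, 2] + rng.normal(0.0, 5.0, n)

    # Standardized regression coefficients: regress standardized output on standardized inputs
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    ys = (y - y.mean()) / y.std(ddof=1)
    src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    r2 = 1.0 - np.sum((ys - Xs @ src)**2) / np.sum(ys**2)

    print("SRCs:", np.round(src, 3))
    print("variance shares (SRC^2):", np.round(src**2, 3), " R^2 =", round(r2, 3))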

  15. Rapid processing of PET list-mode data for efficient uncertainty estimation and data analysis.

    Science.gov (United States)

    Markiewicz, P J; Thielemans, K; Schott, J M; Atkinson, D; Arridge, S R; Hutton, B F; Ourselin, S

    2016-07-07

    In this technical note we propose a rapid and scalable software solution for the processing of PET list-mode data, which allows the efficient integration of list mode data processing into the workflow of image reconstruction and analysis. All processing is performed on the graphics processing unit (GPU), making use of streamed and concurrent kernel execution together with data transfers between disk and CPU memory as well as CPU and GPU memory. This approach leads to fast generation of multiple bootstrap realisations, and when combined with fast image reconstruction and analysis, it enables assessment of uncertainties of any image statistic and of any component of the image generation process (e.g. random correction, image processing) within reasonable time frames (e.g. within five minutes per realisation). This is of particular value when handling complex chains of image generation and processing. The software outputs the following: (1) estimate of expected random event data for noise reduction; (2) dynamic prompt and random sinograms of span-1 and span-11 and (3) variance estimates based on multiple bootstrap realisations of (1) and (2) assuming reasonable count levels for acceptable accuracy. In addition, the software produces statistics and visualisations for immediate quality control and crude motion detection, such as: (1) count rate curves; (2) centre of mass plots of the radiodistribution for motion detection; (3) video of dynamic projection views for fast visual list-mode skimming and inspection; (4) full normalisation factor sinograms. To demonstrate the software, we present an example of the above processing for fast uncertainty estimation of regional SUVR (standard uptake value ratio) calculation for a single PET scan of (18)F-florbetapir using the Siemens Biograph mMR scanner.
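
    The GPU and list-mode specifics are beyond a short example, but the underlying bootstrap principle -- resample events with replacement, recompute the statistic, and take the spread across realisations -- can be sketched as below; the event stream and the ratio statistic are toy stand-ins, not the proposed software.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "list-mode" event stream: each event tagged with a region of interest
    # (1 = target region, 0 = reference region); counts are illustrative only
    events = rng.choice([0, 1], size=50_000, p=[0.8, 0.2])

    def uptake_ratio(ev):
        # Toy statistic standing in for a regional SUVR-like ratio
        return ev.sum() / (ev.size - ev.sum())

    n_boot = 500
    boot_stats = np.empty(n_boot)
    for b in range(n_boot):
        resampled = rng.choice(events, size=events.size, replace=True)
        boot_stats[b] = uptake_ratio(resampled)

    lo, hi = np.percentile(boot_stats, [2.5, 97.5])
    print(f"statistic = {uptake_ratio(events):.4f}, bootstrap SD = {boot_stats.std(ddof=1):.4f}, "
          f"95% interval [{lo:.4f}, {hi:.4f}]")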

  16. A Bayesian-based multilevel factorial analysis method for analyzing parameter uncertainty of hydrological model

    Science.gov (United States)

    Liu, Y. R.; Li, Y. P.; Huang, G. H.; Zhang, J. L.; Fan, Y. R.

    2017-10-01

    In this study, a Bayesian-based multilevel factorial analysis (BMFA) method is developed to assess parameter uncertainties and their effects on hydrological model responses. In BMFA, the Differential Evolution Adaptive Metropolis (DREAM) algorithm is employed to approximate the posterior distributions of model parameters with Bayesian inference; the factorial analysis (FA) technique is used for measuring the specific variations of hydrological responses in terms of posterior distributions to investigate the individual and interactive effects of parameters on model outputs. BMFA is then applied to a case study of the Jinghe River watershed in the Loess Plateau of China to display its validity and applicability. The uncertainties of four sensitive parameters, including the soil conservation service runoff curve number for moisture condition II (CN2), soil hydraulic conductivity (SOL_K), plant available water capacity (SOL_AWC), and soil depth (SOL_Z), are investigated. Results reveal that (i) CN2 has a positive effect on peak flow, implying that the concentrated rainfall during the rainy season can cause infiltration-excess surface flow, which is a considerable contributor to peak flow in this watershed; (ii) SOL_K has a positive effect on average flow, implying that the widely distributed cambisols can lead to medium percolation capacity; (iii) the interaction between SOL_AWC and SOL_Z has a noticeable effect on the peak flow and their effects are dependent upon each other, which discloses that soil depth can significantly influence the processes of plant uptake of soil water in this watershed. Based on the above findings, the significant parameters and the relationships among uncertain parameters can be specified, such that the hydrological model's capability for simulating/predicting water resources of the Jinghe River watershed can be improved.

  17. Uncertainty in the analysis of the overall equipment effectiveness on the shop floor

    Science.gov (United States)

    Rößler, M. P.; Abele, E.

    2013-06-01

    In this article an approach is presented which supports transparency regarding the effectiveness of manufacturing equipment by combining fuzzy set theory with the method of overall equipment effectiveness analysis. One of the key principles of lean production, and also a fundamental task in production optimization projects, is the prior analysis of the current state of a production system by the use of key performance indicators to derive possible future states. The current state of the art in overall equipment effectiveness analysis is usually to cumulate different machine states by means of decentralized data collection without consideration of uncertainty. In manual data collection or semi-automated plant data collection systems the quality of the derived data often varies and leads optimization teams to distorted conclusions about the real optimization potential of manufacturing equipment. The method discussed in this paper is intended to help practitioners obtain more reliable results in the analysis phase and thus better outcomes of optimization projects. The results obtained are discussed in the context of a case study.
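
    A hedged sketch of combining fuzzy availability, performance and quality rates into a fuzzy OEE by interval arithmetic on alpha-cuts of triangular fuzzy numbers is given below; the membership functions and the rate values are illustrative assumptions, not the paper's case-study data.

    def alpha_cut(tri, alpha):
        # Interval of a triangular fuzzy number (low, mode, high) at membership level alpha
        lo, m, hi = tri
        return (lo + alpha * (m - lo), hi - alpha * (hi - m))

    def interval_mul(a, b):
        products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
        return (min(products), max(products))

    # Availability, performance and quality rates as triangular fuzzy numbers
    availability = (0.82, 0.88, 0.92)
    performance  = (0.75, 0.85, 0.90)
    quality      = (0.95, 0.98, 0.99)

    for alpha in (0.0, 0.5, 1.0):
        oee = interval_mul(interval_mul(alpha_cut(availability, alpha),
                                        alpha_cut(performance, alpha)),
                           alpha_cut(quality, alpha))
        print(f"alpha = {alpha:.1f}: OEE in [{oee[0]:.3f}, {oee[1]:.3f}]")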

  18. Managing Information Uncertainty in Wave Height Modeling for the Offshore Structural Analysis through Random Set

    Directory of Open Access Journals (Sweden)

    Keqin Yan

    2017-01-01

    Full Text Available This chapter presents a reliability study for an offshore jacket structure with emphasis on the features of nonconventional modeling. Firstly, a random set model is formulated for modeling the random waves in an ocean site. Then, a jacket structure is investigated in a pushover analysis to identify the critical wave direction and key structural elements. This is based on the ultimate base shear strength. The selected probabilistic models are adopted for the important structural members and the wave direction is specified in the weakest direction of the structure for a conservative safety analysis. The wave height model is processed in a P-box format when it is used in the numerical analysis. The models are applied to find the bounds of the failure probabilities for the jacket structure. The propagation of this wave model to the uncertainty in results is investigated in both an interval analysis and Monte Carlo simulation. The results are compared in context of information content and numerical accuracy. Further, the failure probability bounds are compared with the conventional probabilistic approach.
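
    A minimal sketch of bounding a failure probability with an interval-valued (P-box style) wave-height model is shown below; the Gumbel parameter intervals, the quadratic load-demand relation and the capacity are illustrative assumptions, not the jacket-structure model from the study.

    import numpy as np

    rng = np.random.default_rng(11)
    n = 200_000

    # P-box for wave height: Gumbel distribution with interval-valued location/scale (illustrative)
    loc_lo, loc_hi = 6.0, 6.8        # m
    scale_lo, scale_hi = 0.9, 1.2    # m

    u = rng.random(n)
    z = -np.log(-np.log(u))          # standard Gumbel quantile for each sample
    # Pointwise bounding realisations over the parameter box (x = loc + scale * z)
    H_lo = loc_lo + np.where(z >= 0, scale_lo, scale_hi) * z
    H_hi = loc_hi + np.where(z >= 0, scale_hi, scale_lo) * z

    # Toy limit state: capacity (ultimate base shear) minus a demand that grows with H^2
    capacity = 75.0e3                          # kN, illustrative
    demand = lambda H: 0.9e3 * H**2            # kN, monotone increasing in H

    # Because the demand is monotone in H, the bounding realisations give bounds on Pf
    pf_lower = np.mean(demand(H_lo) > capacity)
    pf_upper = np.mean(demand(H_hi) > capacity)
    print(f"failure probability bounds: [{pf_lower:.3e}, {pf_upper:.3e}]")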

  19. Sensitivity and Uncertainty Analysis of Coupled Reactor Physics Problems : Method Development for Multi-Physics in Reactors

    NARCIS (Netherlands)

    Perkó, Z.

    2015-01-01

    This thesis presents novel adjoint and spectral methods for the sensitivity and uncertainty (S&U) analysis of multi-physics problems encountered in the field of reactor physics. The first part focuses on the steady state of reactors and extends the adjoint sensitivity analysis methods well

  20. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    Science.gov (United States)

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  1. A biomechanical analysis of trunk and pelvis motion during gait in subjects with knee osteoarthritis compared to control subjects.

    Science.gov (United States)

    Linley, Heather S; Sled, Elizabeth A; Culham, Elsie G; Deluzio, Kevin J

    2010-12-01

    Trunk lean over the stance limb during gait has been linked to a reduction in the knee adduction moment, which is associated with joint loading. We examined differences in knee adduction moments and frontal plane trunk lean during gait between subjects with knee osteoarthritis and a control group of healthy adults. Gait analysis was performed on 80 subjects (40 osteoarthritis). To define lateral trunk lean two definitions were used. The line connecting the midpoint between two reference points on the pelvis and the midpoint between the acromion processes was projected onto the lab frontal plane and the pelvis frontal plane. Pelvic tilt was also measured in the frontal plane as the angle between the pelvic and lab coordinate systems. Angles were calculated across the stance phase of gait. We analyzed the data, (i) by extracting discrete parameters (mean and peak) waveform values, and (ii) using principal component analysis to extract shape and magnitude differences between the waveforms. Osteoarthritis subjects had a higher knee adduction moment than the control group (α=0.05). Although the discrete parameters for trunk lean did not show differences between groups, principal component analysis did detect characteristic waveform differences between the control and osteoarthritis groups. A thorough biomechanical analysis revealed small differences in the pattern of motion of the pelvis and the trunk between subjects with knee osteoarthritis and control subjects; however these differences were only detectable using principal component analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.

  2. Hydroelastic analysis of a rectangular plate subjected to slamming loads

    Science.gov (United States)

    Wang, Shan; Guedes Soares, C.

    2017-12-01

    A hydroelastic analysis of a rectangular plate subjected to slamming loads is presented. An analytical model based on Wagner theory is used for calculations of transient slamming load on the ship plate. A thin isotropic plate theory is considered for determining the vibration of a rectangular plate excited by an external slamming force. The forced vibration of the plate is calculated by the modal expansion method. Analytical results of the transient response of a rectangular plate induced by slamming loads are compared with numerical calculations from finite element method. The theoretical slamming pressure based on Wagner model is applied on the finite element model of a plate. Good agreement is obtained between the analytical and numerical results for the structural deflection of a rectangular plate due to slamming pressure. The effects of plate dimension and wave profile on the structural vibration are discussed as well. The results show that a low impact velocity and a small wetted radial length of wave yield negligible effects of hydroelasticity.

  4. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.
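
    A minimal sketch of the bootstrap idea underlying the report, assuming a small synthetic data set: resampling with replacement quantifies the uncertainty, due to limited sample size, in statistics that summarize variability.

        import numpy as np

        rng = np.random.default_rng(42)
        data = rng.lognormal(mean=1.0, sigma=0.5, size=30)   # hypothetical small sample

        n_boot = 2000
        boot_means = np.empty(n_boot)
        boot_p95 = np.empty(n_boot)
        for i in range(n_boot):
            resample = rng.choice(data, size=data.size, replace=True)
            boot_means[i] = resample.mean()                  # variability summarized by the mean
            boot_p95[i] = np.percentile(resample, 95)        # and by an upper percentile

        # 95% confidence intervals describing uncertainty about the two statistics
        print("mean:", np.percentile(boot_means, [2.5, 97.5]))
        print("95th percentile:", np.percentile(boot_p95, [2.5, 97.5]))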

  5. Subjective dimension in the analysis of human development

    Directory of Open Access Journals (Sweden)

    LÓPEZ NOVAL, Borja

    2012-06-01

    In recent years, subjective evaluations of one's own quality of life, summarized in levels of life satisfaction or happiness, have been gaining importance as indicators of development. Some authors state that subjective well-being is a necessary and sufficient condition for human development. In this work the arguments of these authors are explained, and the role that subjective evaluations should play in development studies is discussed. The main conclusion is that although it is necessary to integrate subjective well-being into human development studies, we cannot equate subjective well-being with development.

  6. Communicating Uncertainties on Climate Change

    Science.gov (United States)

    Planton, S.

    2009-09-01

    The term of uncertainty in common language is confusing since it is related in one of its most usual sense to what cannot be known in advance or what is subject to doubt. Its definition in mathematics is unambiguous but not widely shared. It is thus difficult to communicate on this notion through media to a wide public. From its scientific basis to the impact assessment, climate change issue is subject to a large number of sources of uncertainties. In this case, the definition of the term is close to its mathematical sense, but the diversity of disciplines involved in the analysis process implies a great diversity of approaches of the notion. Faced to this diversity of approaches, the issue of communicating uncertainties on climate change is thus a great challenge. It is also complicated by the diversity of the targets of the communication on climate change, from stakeholders and policy makers to a wide public. We will present the process chosen by the IPCC in order to communicate uncertainties in its assessment reports taking the example of the guidance note to lead authors of the fourth assessment report. Concerning the communication of uncertainties to a wide public, we will give some examples aiming at illustrating how to avoid the above-mentioned ambiguity when dealing with this kind of communication.

  7. Uncertainty Analysis of Certified Photovoltaic Measurements at the National Renewable Energy Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Emery, K.

    2009-08-01

    Discusses NREL Photovoltaic Cell and Module Performance Characterization Group's procedures to achieve lowest practical uncertainty in measuring PV performance with respect to reference conditions.

  8. Wildfire Decision Making Under Uncertainty

    Science.gov (United States)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  9. Laboratory transport experiments with antibiotic sulfadiazine: Experimental results and parameter uncertainty analysis

    Science.gov (United States)

    Sittig, S.; Vrugt, J. A.; Kasteel, R.; Groeneweg, J.; Vereecken, H.

    2011-12-01

    Persistent antibiotics in the soil potentially contaminate the groundwater and affect the quality of drinking water. To improve our understanding of antibiotic transport in soils, we performed laboratory transport experiments in soil columns under constant irrigation conditions with repeated applications of chloride and radio-labeled sulfadiazine (SDZ). The tracers were incorporated into the first centimeter, either with pig manure or with solution. Breakthrough curves and concentration profiles of the parent compound and the main transformation products were measured. The goal is to describe the observed nonlinear and kinetic transport behavior of SDZ. Our analysis starts with synthetic transport data for the given laboratory flow conditions for tracers which exhibit increasingly complex interactions with the solid phase. This first step is necessary to benchmark our inverse modeling approach for ideal situations. We then analyze the transport behavior using the column experiments in the laboratory. Our analysis uses a Markov chain Monte Carlo sampler (the Differential Evolution Adaptive Metropolis algorithm, DREAM) to efficiently search the parameter space of an advection-dispersion model. Sorption of the antibiotics to the soil was described using a model accounting for both reversible and irreversible sorption. This presentation will discuss our initial findings. We will present the data of our laboratory experiments along with an analysis of parameter uncertainty.
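
    DREAM itself is a multi-chain adaptive MCMC sampler; the sketch below illustrates only the underlying Bayesian parameter-estimation idea with a plain random-walk Metropolis chain fitted to a one-dimensional advection-dispersion breakthrough curve. The analytical solution, priors and data are assumptions, not the SDZ experiments.

        import numpy as np

        def ade_pulse(t, v, D, x=0.3, mass=1.0):
            """Concentration at depth x for an instantaneous pulse (1D advection-dispersion)."""
            return mass / np.sqrt(4 * np.pi * D * t) * np.exp(-(x - v * t) ** 2 / (4 * D * t))

        rng = np.random.default_rng(1)
        t_obs = np.linspace(0.5, 20, 40)                      # observation times
        true_v, true_D, sigma = 0.05, 0.002, 0.02             # "truth" used only to fake data
        c_obs = ade_pulse(t_obs, true_v, true_D) + rng.normal(0, sigma, t_obs.size)

        def log_post(theta):
            v, D = theta
            if v <= 0 or D <= 0:                              # flat priors on positive values
                return -np.inf
            resid = c_obs - ade_pulse(t_obs, v, D)
            return -0.5 * np.sum((resid / sigma) ** 2)        # Gaussian likelihood

        theta = np.array([0.03, 0.001])                       # starting guess
        step = np.array([0.005, 0.0005])                      # proposal standard deviations
        chain, lp = [], log_post(theta)
        for _ in range(20000):
            prop = theta + rng.normal(0, step)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:          # Metropolis acceptance rule
                theta, lp = prop, lp_prop
            chain.append(theta)
        chain = np.array(chain[5000:])                        # discard burn-in

        print("posterior mean (v, D):", chain.mean(axis=0))
        print("95% credible interval for v:", np.percentile(chain[:, 0], [2.5, 97.5]))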

  10. Gaming Change: A Many-objective Analysis of Water Supply Portfolios under Uncertainty

    Science.gov (United States)

    Reed, P. M.; Kasprzyk, J.; Characklis, G.; Kirsch, B.

    2008-12-01

    This study explores the uncertainty and tradeoffs associated with up to six conflicting water supply portfolio planning objectives. A ten-year Monte Carlo simulation model is used to evaluate water supply portfolios blending permanent rights, adaptive options contracts, and spot leases for a single city in the Lower Rio Grande Valley. Historical records of reservoir mass balance, lease pricing, and demand serve as the source data for the Monte Carlo simulation. Portfolio planning decisions include the initial volume and annual increases of permanent rights, thresholds for an adaptive options contract, and anticipatory decision rules for purchasing leases and exercising options. Our work distinguishes three cases: (1) permanent rights as the sole source of supply, (2) permanent rights and adaptive options, and (3) a combination of permanent rights, adaptive options, and leases. The problems have been formulated such that cases 1 and 2 are sub-spaces of the six objective formulation used for case 3. Our solution sets provide the tradeoff surfaces between portfolios' expected values for cost, cost variability, reliability, frequency of purchasing permanent rights increases, frequency of using leases, and dropped (or unused) transfers of water. The tradeoff surfaces for the three cases show that options and leases have a dramatic impact on the marginal costs associated with improving the efficiency and reliability of urban water supplies. Moreover, our many-objective analysis permits the discovery of a broad range of high quality portfolio strategies. We differentiate the value of adaptive options versus leases by testing a representative subset of optimal portfolios' abilities to effectively address regional increases in demand during drought periods. These results provide insights into the tradeoffs inherent to a more flexible, portfolio-style approach to urban water resources management, an approach that should become increasingly attractive in an environment of
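
    A much-simplified sketch of the Monte Carlo portfolio-evaluation idea follows; the prices, demand distribution and lease rule are hypothetical, and only expected cost, cost variability and reliability are computed here rather than the six objectives of the study.

        import numpy as np

        rng = np.random.default_rng(7)

        def evaluate_portfolio(rights_volume, lease_trigger, n_years=10, n_reps=1000):
            """Expected cost, cost variability and reliability of one portfolio design."""
            costs, failures = [], 0
            for _ in range(n_reps):
                total_cost = rights_volume * n_years * 20.0           # assumed price of rights
                for _ in range(n_years):
                    demand = rng.normal(100, 15)                      # assumed annual demand
                    supply = rights_volume
                    if supply < lease_trigger * demand:               # anticipatory lease rule
                        lease_price = rng.lognormal(np.log(40), 0.3)  # assumed spot lease price
                        bought = max(demand - supply, 0.0)
                        total_cost += bought * lease_price
                        supply += bought
                    if supply < demand:
                        failures += 1
                costs.append(total_cost)
            reliability = 1 - failures / (n_reps * n_years)
            return np.mean(costs), np.std(costs), reliability

        for rights in (80, 100, 120):
            print(rights, evaluate_portfolio(rights, lease_trigger=0.9))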

  11. Overview of the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    Science.gov (United States)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications were not known. The STO Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic problems of interest to NATO. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper presents an overview of the AVT-191 program content.

  12. Summary Findings from the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    Science.gov (United States)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications were not clear. The NATO Science and Technology Organization Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.

  13. Implementation of a methodology to perform the uncertainty and sensitivity analysis of the control rod drop in a BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reyes F, M. del C.

    2015-07-01

    A methodology to perform uncertainty and sensitivity analysis of the cross sections used in a Trace/PARCS coupled model for a control rod drop transient of a BWR-5 reactor was implemented with the neutronics code PARCS. A model of the nuclear reactor detailing all assemblies located in the core was developed. The thermohydraulic model built in Trace, however, was a simple one in which a single channel represents all assembly types in the core; this channel was placed inside a simple vessel model and boundary conditions were established. The thermohydraulic model was coupled with the neutronics model, first for the steady state, and then a Control Rod Drop (CRD) transient was simulated in order to carry out the uncertainty and sensitivity analysis. To analyze the cross sections used in the Trace/PARCS coupled model during the transient, Probability Density Functions (PDFs) were generated for the 22 cross-section parameters selected from the neutronics parameters that PARCS requires, yielding 100 different cases for the Trace/PARCS coupled model, each with a different cross-section database. All these cases were executed with the coupled model, producing 100 different outputs for the CRD transient with special emphasis on four responses per output: 1) the reactivity, 2) the percentage of rated power, 3) the average fuel temperature, and 4) the average coolant density. For each response during the transient, an uncertainty analysis was performed in which the corresponding uncertainty bands were generated. With this analysis it is possible to observe the ranges of the chosen responses obtained by varying the selected uncertainty parameters. This is very useful for maintaining safety in nuclear power plants and for verifying whether the uncertainty band lies within the safety margins. The sensitivity analysis complements the uncertainty analysis, identifying the parameter or parameters with the most influence on the
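
    The post-processing step that turns the ensemble of perturbed runs into uncertainty bands can be sketched as follows; the ensemble below is synthetic and stands in for the 100 coupled-code outputs.

        import numpy as np

        rng = np.random.default_rng(3)
        time = np.linspace(0.0, 10.0, 201)                   # transient time [s]
        n_runs = 100

        # Placeholder ensemble: a nominal curve plus a run-specific perturbation.
        nominal = -0.5 * (1 - np.exp(-time))                 # assumed reactivity shape
        runs = nominal + rng.normal(0, 0.05, size=(n_runs, 1)) * (1 - np.exp(-time))

        band_low = np.percentile(runs, 5, axis=0)            # lower uncertainty band
        band_high = np.percentile(runs, 95, axis=0)          # upper uncertainty band
        mean_curve = runs.mean(axis=0)

        print("band width at the end of the transient:", band_high[-1] - band_low[-1])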

  14. Manifestations and implications of uncertainty for improving healthcare systems: an analysis of observational and interventional studies grounded in complexity science.

    Science.gov (United States)

    Leykum, Luci K; Lanham, Holly J; Pugh, Jacqueline A; Parchman, Michael; Anderson, Ruth A; Crabtree, Benjamin F; Nutting, Paul A; Miller, William L; Stange, Kurt C; McDaniel, Reuben R

    2014-11-19

    The application of complexity science to understanding healthcare system improvement highlights the need to consider interdependencies within the system. One important aspect of the interdependencies in healthcare delivery systems is how individuals relate to each other. However, results from our observational and interventional studies focusing on relationships to understand and improve outcomes in a variety of healthcare settings have been inconsistent. We sought to better understand and explain these inconsistencies by analyzing our findings across studies and building new theory. We used eight observational and interventional studies in which our author team was involved as the basis of our analysis, applying a set-theoretical qualitative comparative analytic approach. Over 16 investigative meetings spanning 11 months, we iteratively analyzed our studies, identifying patterns of characteristics that could explain our set of results. Our initial focus on differences in setting did not explain our mixed results. We then turned to differences in patient care activities and tasks being studied and the attributes of the disease being treated. Finally, we examined the interdependence between task and disease. We identified system-level uncertainty as a defining characteristic of complex systems through which we interpreted our results. We identified several characteristics of healthcare tasks and diseases that impact the ways uncertainty is manifest across diverse care delivery activities. These include disease-related uncertainty (pace of evolution of disease and patient control over outcomes) and task-related uncertainty (standardized versus customized, routine versus non-routine, and interdependencies required for task completion). Uncertainty is an important aspect of clinical systems that must be considered in designing approaches to improve healthcare system function. The uncertainty inherent in tasks and diseases, and how they come together in specific

  15. Differences in subjective well-being within households: An analysis ...

    African Journals Online (AJOL)

    We investigate differences in subjective well-being (life satisfaction) within the household using matched data on co-resident couples drawn from the 2008 National Income Dynamics Study for South Africa. The majority of men and women in co-resident partnerships report different levels of subjective wellbeing. We use ...

  16. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    Science.gov (United States)

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

    Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY that considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct the parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporates a comprehensive R package, the Flexible Modeling Framework (FME), and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant production-related parameters (e.g., PPDF1 and PRDX) are the most sensitive with respect to the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of the target variables. Global sensitivity and uncertainty analysis indicates that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
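
    The study calibrates EDCM with the R package FME and the SCE algorithm; as a stand-in, the sketch below shows the same calibrate-against-observations idea using SciPy's differential evolution on a toy two-parameter seasonal flux model (not the actual EDCM equations).

        import numpy as np
        from scipy.optimize import differential_evolution

        rng = np.random.default_rng(5)
        doy = np.arange(1, 366)                                       # day of year

        def toy_flux(params, doy):
            amplitude, width = params
            return amplitude * np.exp(-((doy - 180) / width) ** 2)    # seasonal flux shape

        obs = toy_flux((8.0, 60.0), doy) + rng.normal(0, 0.5, doy.size)   # synthetic "tower" data

        def cost(params):
            return np.sum((obs - toy_flux(params, doy)) ** 2)         # sum of squared residuals

        result = differential_evolution(cost, bounds=[(1, 20), (10, 150)], seed=1)
        print("optimal parameters:", result.x, "cost:", result.fun)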

  18. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

  19. Analysis of embedded waste storage tanks subjected to seismic loading

    Energy Technology Data Exchange (ETDEWEB)

    Zaslawsky, M.; Sammaddar, S.; Kennedy, W.N.

    1991-12-31

    At the Savannah River Site, High Activity Wastes are stored in carbon steel tanks that are within reinforced concrete vaults. These soil-embedded tank/vault structures are approximately 80 ft. in diameter and 40 ft. deep. The tanks were studied to identify the governing variables and to reduce the problem to the smallest number of governing cases, optimizing the analysis effort without introducing excessive conservatism. The problem reduced to a limited number of cases of soil-structure interaction and fluid (tank contents)-structure interaction problems. It was theorized that substantially reduced input would be realized from soil-structure interaction (SSI), but that it was also possible that tank-to-tank proximity would result in (re)amplification of the input. To determine the governing seismic input motion, the three-dimensional SSI code SASSI was used. Significant among the issues relevant to waste tanks is the determination of fluid response and tank behavior as a function of the viscosity of the tank contents. Tank seismic analyses and studies have been based on low-viscosity fluids (water), and the behavior is quite well understood. Typical wastes (salts, sludge), which are highly viscous, have not been the subject of studies to understand the effect of viscosity on seismic response. The computer code DYNA3D was used to study how viscosity alters the tank wall pressure distribution and the tank base shear and overturning moments. A parallel hand calculation was performed using standard procedures. Conclusions based on the study provide insight into the quantification of the reduction of seismic inputs due to soil-structure interaction for a "soft" soil site.

  20. Using predictive uncertainty analysis to optimise tracer test design and data acquisition

    Science.gov (United States)

    Wallis, Ilka; Moore, Catherine; Post, Vincent; Wolf, Leif; Martens, Evelien; Prommer, Henning

    2014-07-01

    Tracer injection tests are regularly-used tools to identify and characterise flow and transport mechanisms in aquifers. Examples of practical applications are manifold and include, among others, managed aquifer recharge schemes, aquifer thermal energy storage systems and, increasingly important, the disposal of produced water from oil and shale gas wells. The hydrogeological and geochemical data collected during the injection tests are often employed to assess the potential impacts of injection on receptors such as drinking water wells and regularly serve as a basis for the development of conceptual and numerical models that underpin the prediction of potential impacts. As all field tracer injection tests entail substantial logistical and financial effort, it is crucial to develop a solid a priori understanding of the value of the various monitoring data in order to select monitoring strategies which provide the greatest return on investment. In this study, we demonstrate the ability of linear predictive uncertainty analysis (i.e. "data worth analysis") to quantify the usefulness of different tracer types (bromide, temperature, methane and chloride as examples) and head measurements in the context of a field-scale aquifer injection trial of coal seam gas (CSG) co-produced water. Data worth was evaluated in terms of tracer type, tracer test design (e.g., injection rate, duration of the test and the applied measurement frequency) and monitoring disposition to increase the reliability of injection impact assessments. This was followed by an uncertainty-targeted Pareto analysis, which allowed the interdependencies of cost and predictive reliability for alternative monitoring campaigns to be compared directly. For the evaluated injection test, the data worth analysis assessed bromide as superior to head data and all other tracers during early sampling times. However, with time, chloride became a more suitable tracer to constrain simulations of physical transport
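
    A minimal numerical sketch of linear predictive uncertainty ("data worth") analysis, assuming made-up prior covariance and Jacobian matrices: the worth of a candidate observation set is the reduction in prediction variance it produces.

        import numpy as np

        rng = np.random.default_rng(11)
        n_par = 6
        C_p = np.diag(rng.uniform(0.5, 2.0, n_par))          # prior parameter covariance (invented)

        y = rng.normal(size=(1, n_par))                      # sensitivity of the prediction to parameters
        var_prior = (y @ C_p @ y.T).item()                   # prior prediction variance

        def posterior_prediction_variance(J, obs_noise_std):
            """Prediction variance after assimilating observations with Jacobian J."""
            C_e = np.eye(J.shape[0]) * obs_noise_std**2
            gain = C_p @ J.T @ np.linalg.inv(J @ C_p @ J.T + C_e)
            C_post = C_p - gain @ J @ C_p
            return (y @ C_post @ y.T).item()

        # Two hypothetical monitoring designs, e.g. bromide samples versus head measurements.
        J_bromide = rng.normal(size=(10, n_par)) * 1.0       # assumed stronger sensitivities
        J_heads = rng.normal(size=(10, n_par)) * 0.3         # assumed weaker sensitivities

        for name, J in [("bromide", J_bromide), ("heads", J_heads)]:
            var_post = posterior_prediction_variance(J, obs_noise_std=0.5)
            print(f"{name}: data worth = {var_prior - var_post:.3f} (prior variance {var_prior:.3f})")

    In this linear framework, the observation set producing the larger reduction in predictive variance is the one worth acquiring first.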

  1. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    Science.gov (United States)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the
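
    The contrast between simple random and Latin hypercube sampling can be sketched as follows for a single lognormal conductivity parameter; the stratified-likelihood and minimum-energy variants discussed in the abstract are not reproduced.

        import numpy as np
        from scipy.stats import lognorm

        n = 50
        ln_mean, ln_sigma = np.log(1e-4), 1.0                # assumed log-conductivity statistics [m/s]
        dist = lognorm(s=ln_sigma, scale=np.exp(ln_mean))

        rng = np.random.default_rng(2)

        # Simple random sampling: independent uniform probabilities
        u_sr = rng.uniform(size=n)

        # Latin hypercube sampling: one value per equal-probability stratum, randomly ordered
        u_lh = (rng.permutation(n) + rng.uniform(size=n)) / n

        k_sr = dist.ppf(u_sr)
        k_lh = dist.ppf(u_lh)
        print("SR sample mean:", k_sr.mean(), " LH sample mean:", k_lh.mean())
        print("population mean:", dist.mean())

    With the same number of realizations, the stratified probabilities cover the conductivity distribution more evenly than independent draws, which is the property the abstract exploits.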

  2. Sensitivity Analysis of Uncertainty Parameter based on MARS-LMR Code on SHRT-45R of EBR II

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seok-Ju; Kang, Doo-Hyuk; Seo, Jae-Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Bae, Sung-Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeong, Hae-Yong [Sejong University, Seoul (Korea, Republic of)

    2016-10-15

    In order to assess the uncertainty quantification of the MARS-LMR code, the code has been improved by modifying the source code to accommodate the calculation process required for uncertainty quantification. In the present study, an Unprotected Loss of Flow (ULOF) transient is selected as a typical case of an Anticipated Transient without Scram (ATWS), which belongs to the DEC category. The MARS-LMR input generation for EBR-II SHRT-45R and the execution of the runs are performed using the PAPIRUS program. The sensitivity analysis is carried out for the uncertainty parameters of the MARS-LMR code for EBR-II SHRT-45R. Based on the results of the sensitivity analysis, dominant parameters with large sensitivity to the figure of merit (FoM) are identified. The selected dominant parameters are closely related to the progression of the ULOF event.

  3. Uncertainty and investment of Dutch firms : an empirical analysis using stock market data

    NARCIS (Netherlands)

    Bo, Hong; Lensink, Robert

    2000-01-01

    This paper examines the investment-uncertainty relationship for a panel of Dutch firms. The uncertainty proxy is derived from daily stock market prices of individual firms. We show that some macro indicators, in combination with firm fixed effects, are able to give a reasonable explanation of the

  4. Exploratory modeling and analysis : A promising method to deal with deep uncertainty

    NARCIS (Netherlands)

    Agusdinata, B.

    2008-01-01

    Faced with policy problems with high stakes, decisionmakers have increasingly recognized the importance of appropriately handling uncertainties. The nature of policy problems, however, is changing. Of particular concern are policy problems involving deep uncertainty when analysts do not know or the

  5. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1)

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan

    2009-01-01

    -NH) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S-NO) uncertainty increasing both their economical cost and variability as a trade-off. Finally, the maximum specific...

  6. Uncertainty analysis of steady state incident heat flux measurements in hydrocarbon fuel fires.

    Energy Technology Data Exchange (ETDEWEB)

    Nakos, James Thomas

    2005-12-01

    The objective of this report is to develop uncertainty estimates for three heat flux measurement techniques used for the measurement of incident heat flux in a combined radiative and convective environment. This is related to the measurement of heat flux to objects placed inside hydrocarbon fuel (diesel, JP-8 jet fuel) fires, which is very difficult to make accurately (e.g., less than 10%). Three methods will be discussed: a Schmidt-Boelter heat flux gage; a calorimeter and inverse heat conduction method; and a thin plate and energy balance method. Steady state uncertainties were estimated for two types of fires (i.e., calm wind and high winds) at three times (early in the fire, late in the fire, and at an intermediate time). Results showed a large uncertainty for all three methods. Typical uncertainties for a Schmidt-Boelter gage ranged from ±23% for high wind fires to ±39% for low wind fires. For the calorimeter/inverse method the uncertainties were ±25% to ±40%. For the thin plate/energy balance method the uncertainties ranged from ±21% to ±42%. The 23-39% uncertainties for the Schmidt-Boelter gage are much larger than the quoted uncertainty for a radiative-only environment (i.e., ±3%). This large difference is due to the convective contribution and because the gage sensitivities to radiative and convective environments are not equal. All these values are larger than desired, which suggests the need for improvements in heat flux measurements in fires.

  7. Uncertainty propagation analysis of an N2O emission model at the plot and landscape scale

    NARCIS (Netherlands)

    Nol, L.; Heuvelink, G.B.M.; Veldkamp, A.; Vries, de W.; Kros, J.

    2010-01-01

    Nitrous oxide (N2O) emission from agricultural land is an important component of the total annual greenhouse gas (GHG) budget. In addition, uncertainties associated with agricultural N2O emissions are large. The goals of this work were (i) to quantify the uncertainties of modelled N2O emissions

  8. Uncertainty analysis of an optical method for pressure estimation in fluid flows

    Science.gov (United States)

    Gomit, Guillaume; Acher, Gwenael; Chatellier, Ludovic; David, Laurent

    2018-02-01

    The error propagation from the velocity field to the pressure field is analyzed for the pressure estimation method proposed by Jeon et al (2015, 11th Int. Symp. on Particle Image Velocimetry, PIV15). The accuracy of the method is assessed based on numerical data. The flow around a rigid profile (NACA0015) with a free tip is considered. From the numerical simulation data, tomographic-PIV (TPIV)-like data are generated. Two types of error are used to distort the data: a Gaussian noise and a pixel-locking effect are modelled. The propagation of both types of error during the pressure estimation process and the effect of the TPIV resolution are evaluated. The results highlight the importance of the resolution for accurately estimating the pressure in the presence of small structures, but also for limiting the propagation of error from the velocity to the pressure. The study of the sensitivity of the method to the two error models, Gaussian noise and pixel locking, shows different trends, which also reveals the importance of the error model in the analysis of the uncertainties for PIV-based pressure.

  9. LCA of waste management systems: Development of tools for modeling and uncertainty analysis

    DEFF Research Database (Denmark)

    Clavreul, Julie

    Since the late 1990s, life cycle assessment (LCA) has been increasingly applied to waste management to quantify direct, indirect and avoided impacts from various treatment options. The construction of inventories for waste management systems differs from that of classical product LCAs. First, a review was carried out on all LCA studies of waste management systems published before mid-2012. This provided a global overview of the technologies and waste fractions which have attracted focus within LCA, while enabling an analysis of methodological tendencies, the use of tools and databases, and the application of uncertainty analysis methods. The major outcome of this thesis was the development of a new LCA model, called EASETECH, building on the experience with previous LCA tools, in particular the EASEWASTE model. Before the actual implementation phase, a design phase involved ...

  10. Managing uncertainty: a review of food system scenario analysis and modelling.

    Science.gov (United States)

    Reilly, Michael; Willenbockel, Dirk

    2010-09-27

    Complex socio-ecological systems like the food system are unpredictable, especially to long-term horizons such as 2050. In order to manage this uncertainty, scenario analysis has been used in conjunction with food system models to explore plausible future outcomes. Food system scenarios use a diversity of scenario types and modelling approaches determined by the purpose of the exercise and by technical, methodological and epistemological constraints. Our case studies do not suggest Malthusian futures for a projected global population of 9 billion in 2050; but international trade will be a crucial determinant of outcomes; and the concept of sustainability across the dimensions of the food system has been inadequately explored so far. The impact of scenario analysis at a global scale could be strengthened with participatory processes involving key actors at other geographical scales. Food system models are valuable in managing existing knowledge on system behaviour and ensuring the credibility of qualitative stories but they are limited by current datasets for global crop production and trade, land use and hydrology. Climate change is likely to challenge the adaptive capacity of agricultural production and there are important knowledge gaps for modelling research to address.

  11. Model parameter estimation and uncertainty analysis: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force Working Group-6.

    Science.gov (United States)

    Briggs, Andrew H; Weinstein, Milton C; Fenwick, Elisabeth A L; Karnon, Jonathan; Sculpher, Mark J; Paltiel, A David

    2012-01-01

    A model's purpose is to inform medical decisions and health care resource allocation. Modelers employ quantitative methods to structure the clinical, epidemiological, and economic evidence base and gain qualitative insight to assist decision makers in making better decisions. From a policy perspective, the value of a model-based analysis lies not simply in its ability to generate a precise point estimate for a specific outcome but also in the systematic examination and responsible reporting of uncertainty surrounding this outcome and the ultimate decision being addressed. Different concepts relating to uncertainty in decision modeling are explored. Stochastic (first-order) uncertainty is distinguished from both parameter (second-order) uncertainty and from heterogeneity, with structural uncertainty relating to the model itself forming another level of uncertainty to consider. The article argues that the estimation of point estimates and uncertainty in parameters is part of a single process and explores the link between parameter uncertainty through to decision uncertainty and the relationship to value-of-information analysis. The article also makes extensive recommendations around the reporting of uncertainty, both in terms of deterministic sensitivity analysis techniques and probabilistic methods. Expected value of perfect information is argued to be the most appropriate presentational technique, alongside cost-effectiveness acceptability curves, for representing decision uncertainty from probabilistic analysis.
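
    A small sketch, with synthetic numbers, of how probabilistic sensitivity analysis samples are turned into a cost-effectiveness acceptability probability and an expected value of perfect information at one willingness-to-pay threshold.

        import numpy as np

        rng = np.random.default_rng(9)
        n = 10000
        wtp = 30000                                          # assumed willingness to pay per QALY

        # PSA samples of incremental cost and effect for "new" versus "standard" care
        d_cost = rng.normal(5000, 2000, n)
        d_qaly = rng.normal(0.20, 0.10, n)

        nb = wtp * d_qaly - d_cost                           # incremental net monetary benefit
        prob_cost_effective = np.mean(nb > 0)                # one point on the acceptability curve

        # EVPI: expected gain from choosing the best strategy in every PSA draw
        ev_perfect = np.mean(np.maximum(nb, 0.0))
        ev_current = max(np.mean(nb), 0.0)
        evpi = ev_perfect - ev_current
        print(f"P(cost-effective at {wtp}): {prob_cost_effective:.2f}, per-person EVPI: {evpi:.0f}")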

  12. Respondent uncertainty in contingent valuation of preventing beach erosion: an analysis with a polychotomous choice question.

    Science.gov (United States)

    Logar, Ivana; van den Bergh, Jeroen C J M

    2012-12-30

    Respondent uncertainty is often considered as one of the main limitations of stated preference methods, which are nowadays being widely used for valuing environmental goods and services. This article examines the effect of respondent uncertainty on welfare estimates by applying the contingent valuation method. This is done in the context of beach protection against erosion. Respondent certainty levels are elicited using a five-category polychotomous choice question. Two different uncertainty calibration techniques are tested, namely one that treats uncertain responses as missing and another in which uncertain 'yes' responses are recoded as 'no' responses. We found no evidence that the former technique offers any gains over the conventional model assuming certainty. The latter calibration technique systematically reduces welfare estimates. The reduction is statistically significant only when the most certain 'yes' responses are recoded as 'no' responses. The article further identifies determinants of respondent uncertainty. Finally, it explores how real market experience affects respondent uncertainty. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Evaluation and analysis of uncertainty in tensile experiment results of modified PPR at elevated temperature

    Science.gov (United States)

    Xiang, Yu; Zhonghua, Su; Jinhua, Leng; Teng, Yun

    2017-08-01

    A high temperature tensile experiment on modified random copolymerized polypropylene was carried out according to ASTM D 638-2014. The factors influencing the accuracy of the high temperature mechanical properties of modified random copolymer polypropylene were analyzed, and the sources of measurement uncertainty were discussed, including the specimen dimension measurement, the force indication error of the testing machine and its calibration, the data acquisition of the experimental software, the temperature control, the numerical correction, and the material nonuniformity. According to JJF 1059.1-2012, class A and class B evaluations were conducted on the above-mentioned uncertainty components, and all the uncertainty components were combined. By analyzing the uncertainty of the measurement results, this paper provides a reference for evaluating the uncertainty of the same type of measurement results.
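
    The combination of class A (statistical) and class B (other) uncertainty components described above can be sketched as follows; the component values are illustrative, assume uncorrelated contributions with unit sensitivity coefficients, and are not the paper's budget.

        import numpy as np

        # Class A: repeatability from repeated specimens (hypothetical strengths, MPa)
        strengths = np.array([31.2, 30.8, 31.5, 30.9, 31.1])
        u_repeat = strengths.std(ddof=1) / np.sqrt(strengths.size)

        # Class B components expressed as standard uncertainties (assumed values, MPa)
        u_force = 31.1 * 0.005          # force indication error of the testing machine
        u_dimension = 31.1 * 0.008      # specimen cross-section measurement
        u_temperature = 0.10            # temperature control contribution

        components = np.array([u_repeat, u_force, u_dimension, u_temperature])
        u_combined = np.sqrt(np.sum(components ** 2))        # root-sum-of-squares combination
        U_expanded = 2 * u_combined                          # coverage factor k = 2
        print(f"combined u = {u_combined:.3f} MPa, expanded U (k=2) = {U_expanded:.3f} MPa")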

  14. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

  15. Regulating fisheries under uncertainty

    DEFF Research Database (Denmark)

    Hansen, Lars Gårn; Jensen, Frank

    2017-01-01

    Regulator uncertainty is decisive for whether price or quantity regulation maximizes welfare in fisheries. In this paper, we develop a model of fisheries regulation that includes ecological uncertainty, variable economic uncertainty as well as structural economic uncertainty. We aggregate ... qualification of the pro-price regulation message dominating the fisheries economics literature. We also believe that the model of a fishery developed in this paper could be applied to the regulation of other renewable resources where regulators are subject to uncertainty either directly or with some

  16. A novel method for importance measure analysis in the presence of epistemic and aleatory uncertainties

    Directory of Open Access Journals (Sweden)

    Ren Bo

    2014-06-01

    For structural systems with both epistemic and aleatory uncertainties, research on quantifying the contribution of the epistemic and aleatory uncertainties to the failure probability of the systems is conducted. Based on the method of separating epistemic and aleatory uncertainties in a variable, the core idea of the research is first to establish a novel deterministic transition model for auxiliary variables, distribution parameters, random variables, and failure probability, and then to propose an improved importance sampling (IS) method to solve the transition model. Furthermore, the distribution parameters and auxiliary variables are sampled simultaneously and independently; therefore, the inefficient sampling procedure with an "inner loop" for epistemic uncertainty and an "outer loop" for aleatory uncertainty in traditional methods is avoided. Because the proposed method combines the fast convergence of the proper estimates with an efficient search for failure samples in the regions of interest, it is more efficient than traditional methods for the variance-based failure probability sensitivity measures in the presence of epistemic and aleatory uncertainties. Two numerical examples and one engineering example are introduced to demonstrate the efficiency and precision of the proposed method for structural systems with both epistemic and aleatory uncertainties.
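
    The basic importance-sampling estimator underlying this kind of approach can be sketched for a toy limit state; the separation of epistemic and aleatory variables and the improved IS scheme of the paper are not reproduced here.

        import numpy as np
        from scipy.stats import norm

        def g(x1, x2):
            return 6.0 - x1 - x2                    # failure when x1 + x2 > 6 (toy limit state)

        rng = np.random.default_rng(4)
        n = 20000

        # Nominal densities: independent standard normals. Importance density: normals
        # shifted towards the failure region (design point roughly at (3, 3)).
        shift = 3.0
        x1 = rng.normal(shift, 1.0, n)
        x2 = rng.normal(shift, 1.0, n)

        weights = (norm.pdf(x1) * norm.pdf(x2)) / (norm.pdf(x1, shift) * norm.pdf(x2, shift))
        indicator = (g(x1, x2) < 0).astype(float)
        pf = np.mean(indicator * weights)
        cv = np.std(indicator * weights, ddof=1) / (np.sqrt(n) * pf)
        print(f"failure probability ≈ {pf:.2e}, coefficient of variation ≈ {cv:.2%}")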

  17. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Amy N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-07-26

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  18. Sources of Judgmental Uncertainty

    Science.gov (United States)

    1977-09-01

    sometimes at the end. To avoid primacy or recency effects, which were not part of this first study, for half of the subjects the orders of information items ... To summarize, 72 subjects were randomly assigned to two conditions of control and exposed to three conditions of orderliness. Order effects and primacy/recency ... Keywords: Judgmental Uncertainty; Primacy/Recency; Environmental Uncertainty.

  19. PEBBED Uncertainty and Sensitivity Analysis of the CRP-5 PBMR DLOFC Transient Benchmark with the SUSA Code

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard Strydom

    2011-01-01

    The need for a defendable and systematic uncertainty and sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This report summarizes the results of the initial investigations performed with SUSA, utilizing a typical High Temperature Reactor benchmark (the IAEA CRP-5 PBMR 400MW Exercise 2) and the PEBBED-THERMIX suite of codes. The following steps were performed as part of the uncertainty and sensitivity analysis: 1. Eight PEBBED-THERMIX model input parameters were selected for inclusion in the uncertainty study: the total reactor power, inlet gas temperature, decay heat, and the specific heat capacity and thermal conductivity of the fuel, pebble bed and reflector graphite. 2. The input parameter variations and probability density functions were specified, and a total of 800 PEBBED-THERMIX model calculations were performed, divided into 4 sets of 100 and 2 sets of 200 steady state and Depressurized Loss of Forced Cooling (DLOFC) transient calculations each. 3. The steady state and DLOFC maximum fuel temperature, as well as the daily pebble fuel load rate data, were supplied to SUSA as model output parameters of interest. The 6 data sets were statistically analyzed to determine the 5% and 95% percentile values for each of the 3 output parameters with a 95% confidence level, and typical statistical indicators were also generated (e.g. Kendall, Pearson and Spearman coefficients). 4. A SUSA sensitivity study was performed to obtain correlation data between the input and output parameters, and to identify the
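
    The statistical post-processing described in steps 3 and 4 can be sketched as follows, using synthetic run data in place of the PEBBED-THERMIX ensemble.

        import numpy as np
        from scipy.stats import pearsonr, spearmanr

        rng = np.random.default_rng(8)
        names = ["power", "inlet_T", "decay_heat", "fuel_k", "pebble_k", "reflector_k"]
        n_runs = 200
        inputs = rng.normal(1.0, 0.05, size=(n_runs, len(names)))   # sampled input multipliers

        # Placeholder response: maximum DLOFC fuel temperature, driven here mostly by
        # decay heat and pebble-bed conductivity (an invented toy relationship).
        t_max = (1600 + 400 * (inputs[:, 2] - 1) - 300 * (inputs[:, 4] - 1)
                 + rng.normal(0, 5, n_runs))

        print("5th / 95th percentiles:", np.percentile(t_max, [5, 95]))
        for j, name in enumerate(names):
            r_p = pearsonr(inputs[:, j], t_max)[0]
            r_s = spearmanr(inputs[:, j], t_max)[0]
            print(f"{name:12s} Pearson {r_p:+.2f}  Spearman {r_s:+.2f}")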

  20. A joint analysis of Planck and BICEP2 B modes including dust polarization uncertainty

    Science.gov (United States)

    Mortonson, Michael J.; Seljak, Uroš

    2014-10-01

    We analyze BICEP2 and Planck data using a model that includes CMB lensing, gravity waves, and polarized dust. Recently published Planck dust polarization maps have highlighted the difficulty of estimating the amount of dust polarization in low intensity regions, suggesting that the polarization fractions have considerable uncertainties and may be significantly higher than previous predictions. In this paper, we start by assuming nothing about the dust polarization except for the power spectrum shape, which we take to be C_l^{BB,dust} ∝ l^{-2.42}. The resulting joint BICEP2+Planck analysis favors solutions without gravity waves, and the upper limit on the tensor-to-scalar ratio is tightened (values of r > 0.14 are excluded with 99.5% confidence). We address the cross-correlation analysis of BICEP2 at 150 GHz with BICEP1 at 100 GHz as a test of foreground contamination. We find that the null hypothesis of dust and lensing with r = 0 gives Δχ² < 2 relative to the hypothesis of no dust, so the frequency analysis does not strongly favor either model over the other. We also discuss how more accurate dust polarization maps may improve our constraints. If the dust polarization is measured perfectly, the limit can reach r < 0.05 (or the corresponding detection significance if the observed dust signal plus the expected lensing signal is below the BICEP2 observations), but this degrades quickly to almost no improvement if the dust calibration error is 20% or larger or if the dust maps are not processed through the BICEP2 pipeline, inducing sampling variance noise.

  1. Significance of uncertainties derived from settling tank model structure and parameters on predicting WWTP performance - A global sensitivity analysis study

    DEFF Research Database (Denmark)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen

    2011-01-01

    uncertainty of settler models can therefore propagate, and add to the uncertainties in prediction of any plant performance criteria. Here we present an assessment of the relative significance of secondary settling model performance in WWTP simulations. We perform a global sensitivity analysis (GSA) based...... parameters for calibration is limited. The other SST model is a state-of-the-art, second-order, convection-dispersion tool (Plósz et al., 2007). The sensitivity results obtained from the four scenarios consistently indicate that the settler models and their parameters are among the most significant sources...

  2. Subjective Analysis and Objective Characterization of Adaptive Bitrate Videos

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Tavakoli, Samira; Brunnström, Kjell

    2016-01-01

    The HTTP Adaptive Streaming (HAS) technology allows video service providers to improve network utilization and thereby increase the end-users' Quality of Experience (QoE). This has made HAS a widely used approach for audiovisual delivery. Several previous studies have aimed to identify the factors influencing the subjective QoE of adaptation events. However, adapting the video quality typically lasts over a time scale much longer than what current standardized subjective testing methods are designed for, thus making a full-matrix experiment design at the event level hard to achieve. In this study, we investigated the overall subjective QoE of 6-minute-long video sequences containing different sequential adaptation events. This was compared to a data set from our previous work performed to evaluate the individual adaptation events. We could then derive a relationship between the overall

  3. A Unified Analysis for Subject Topics in Brazilian Portuguese

    Directory of Open Access Journals (Sweden)

    Aroldo de Andrade

    2014-06-01

    In this paper we discuss the phenomenon of subject topics, consisting of the movement of either a genitive or a locative constituent into subject position in Brazilian Portuguese. This construction occurs with different verb classes, shows subject-verb agreement and precludes a resumptive pronoun. The goal of the present text is to account for its distribution. To do so, we argue that the two subclasses of unaccusative verbs found with genitive and locative topics instantiate some sort of secondary predication, and that only specific configurations allow for the movement of a constituent out of the argument structure domain. Finally, we address the comparative issue involved in explaining why the derivation of such a construction is not possible in European Portuguese.

  4. Review of clinical brachytherapy uncertainties: Analysis guidelines of GEC-ESTRO and the AAPM

    Science.gov (United States)

    Kirisits, Christian; Rivard, Mark J.; Baltas, Dimos; Ballester, Facundo; De Brabandere, Marisol; van der Laarse, Rob; Niatsetski, Yury; Papagiannis, Panagiotis; Hellebust, Taran Paulsen; Perez-Calatayud, Jose; Tanderup, Kari; Venselaar, Jack L.M.; Siebert, Frank-André

    2014-01-01

    Background and purpose: A substantial reduction of uncertainties in clinical brachytherapy should result in improved outcome in terms of increased local control and reduced side effects. Types of uncertainties have to be identified, grouped, and quantified. Methods: A detailed literature review was performed to identify uncertainty components and their relative importance to the combined overall uncertainty. Results: Very few components (e.g., source strength and afterloader timer) are independent of clinical disease site and location of administered dose. While the influence of medium on dose calculation can be substantial for low energy sources or non-deeply seated implants, the influence of medium is of minor importance for high-energy sources in the pelvic region. The level of uncertainties due to target, organ, applicator, and/or source movement in relation to the geometry assumed for treatment planning is highly dependent on fractionation and the level of image guided adaptive treatment. Most studies to date report the results in a manner that allows no direct reproduction and further comparison with other studies. Often, no distinction is made between variations, uncertainties, and errors or mistakes. The literature review facilitated the drafting of recommendations for uniform uncertainty reporting in clinical BT, which are also provided. The recommended comprehensive uncertainty investigations are key to obtain a general impression of uncertainties, and may help to identify elements of the brachytherapy treatment process that need improvement in terms of diminishing their dosimetric uncertainties. It is recommended to present data on the analyzed parameters (distance shifts, volume changes, source or applicator position, etc.), and also their influence on absorbed dose for clinically-relevant dose parameters (e.g., target parameters such as D90 or OAR doses). Publications on brachytherapy should include a statement of total dose uncertainty for the entire

  5. Review of clinical brachytherapy uncertainties: analysis guidelines of GEC-ESTRO and the AAPM.

    Science.gov (United States)

    Kirisits, Christian; Rivard, Mark J; Baltas, Dimos; Ballester, Facundo; De Brabandere, Marisol; van der Laarse, Rob; Niatsetski, Yury; Papagiannis, Panagiotis; Hellebust, Taran Paulsen; Perez-Calatayud, Jose; Tanderup, Kari; Venselaar, Jack L M; Siebert, Frank-André

    2014-01-01

    A substantial reduction of uncertainties in clinical brachytherapy should result in improved outcome in terms of increased local control and reduced side effects. Types of uncertainties have to be identified, grouped, and quantified. A detailed literature review was performed to identify uncertainty components and their relative importance to the combined overall uncertainty. Very few components (e.g., source strength and afterloader timer) are independent of clinical disease site and location of administered dose. While the influence of medium on dose calculation can be substantial for low energy sources or non-deeply seated implants, the influence of medium is of minor importance for high-energy sources in the pelvic region. The level of uncertainties due to target, organ, applicator, and/or source movement in relation to the geometry assumed for treatment planning is highly dependent on fractionation and the level of image guided adaptive treatment. Most studies to date report the results in a manner that allows no direct reproduction and further comparison with other studies. Often, no distinction is made between variations, uncertainties, and errors or mistakes. The literature review facilitated the drafting of recommendations for uniform uncertainty reporting in clinical BT, which are also provided. The recommended comprehensive uncertainty investigations are key to obtain a general impression of uncertainties, and may help to identify elements of the brachytherapy treatment process that need improvement in terms of diminishing their dosimetric uncertainties. It is recommended to present data on the analyzed parameters (distance shifts, volume changes, source or applicator position, etc.), and also their influence on absorbed dose for clinically-relevant dose parameters (e.g., target parameters such as D90 or OAR doses). Publications on brachytherapy should include a statement of total dose uncertainty for the entire treatment course, taking into account the

  6. muView: A Visual Analysis System for Exploring Uncertainty in Myocardial Ischemia Simulations

    KAUST Repository

    Rosen, Paul

    2016-05-23

    In this paper we describe the Myocardial Uncertainty Viewer (muView or μView) system for exploring data stemming from the simulation of cardiac ischemia. The simulation uses a collection of conductivity values to understand how ischemic regions affect the undamaged anisotropic heart tissue. The data resulting from the simulation is multi-valued and volumetric, and thus, for every data point, we have a collection of samples describing cardiac electrical properties. μView combines a suite of visual analysis methods to explore the area surrounding the ischemic zone and identify how perturbations of variables change the propagation of their effects. In addition to presenting a collection of visualization techniques, which individually highlight different aspects of the data, the coordinated view system forms a cohesive environment for exploring the simulations. We also discuss the findings of our study, which are helping to steer further development of the simulation and strengthening our collaboration with the biomedical engineers attempting to understand the phenomenon.

  7. Parameter sensitivity and uncertainty analysis for a storm surge and wave model

    Directory of Open Access Journals (Sweden)

    L. A. Bastidas

    2016-09-01

    Full Text Available Development and simulation of synthetic hurricane tracks is a common methodology used to estimate hurricane hazards in the absence of empirical coastal surge and wave observations. Such methods typically rely on numerical models to translate stochastically generated hurricane wind and pressure forcing into coastal surge and wave estimates. The model output uncertainty associated with selection of appropriate model parameters must therefore be addressed. The computational overburden of probabilistic surge hazard estimates is exacerbated by the high dimensionality of numerical surge and wave models. We present a model parameter sensitivity analysis of the Delft3D model for the simulation of hazards posed by Hurricane Bob (1991) utilizing three theoretical wind distributions (NWS23, modified Rankine, and Holland). The sensitive model parameters (of 11 total considered) include wind drag, the depth-induced breaking γB, and the bottom roughness. Several parameters show no sensitivity (threshold depth, eddy viscosity, wave triad parameters, and depth-induced breaking αB) and can therefore be excluded to reduce the computational overburden of probabilistic surge hazard estimates. The sensitive model parameters also demonstrate a large number of interactions between parameters and a nonlinear model response. While model outputs showed sensitivity to several parameters, the ability of these parameters to act as tuning parameters for calibration is somewhat limited as proper model calibration is strongly reliant on accurate wind and pressure forcing data. A comparison of the model performance with forcings from the different wind models is also presented.

  8. Estimation and uncertainty analysis of dose response in an inter-laboratory experiment

    Science.gov (United States)

    Toman, Blaza; Rösslein, Matthias; Elliott, John T.; Petersen, Elijah J.

    2016-02-01

    An inter-laboratory experiment for the evaluation of toxic effects of NH2-polystyrene nanoparticles on living human cancer cells was performed with five participating laboratories. Previously published results from nanocytotoxicity assays are often contradictory, mostly due to challenges related to producing a reliable cytotoxicity assay protocol for use with nanomaterials. Specific challenges include reproducibility in preparing nanoparticle dispersions, biological variability from testing living cell lines, and the potential for nano-related interference effects. In this experiment, such challenges were addressed by developing a detailed experimental protocol and using a specially designed 96-well plate layout which incorporated a range of control measurements to assess multiple factors such as nanomaterial interference, pipetting accuracy, cell seeding density, and instrument performance. Detailed data analysis of these control measurements showed that good control of the experiments was attained by all participants in most cases. The main measurement objective of the study was the estimation of a dose response relationship between concentration of the nanoparticles and metabolic activity of the living cells, under several experimental conditions. The dose curve estimation was achieved by embedding a three-parameter logistic curve in a three-level Bayesian hierarchical model, accounting for uncertainty due to all known experimental conditions as well as between-laboratory variability in a top-down manner. Computation was performed using Markov Chain Monte Carlo methods. The fit of the model was evaluated using Bayesian posterior predictive probabilities and found to be satisfactory.
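
    As an illustrative aside (not from the record above): the study embeds a three-parameter logistic dose-response curve in a hierarchical Bayesian model fitted by MCMC. The Python sketch below fits a single-level three-parameter logistic curve with a hand-written random-walk Metropolis sampler; the dose-response data, flat positive priors, noise level and proposal scales are invented for illustration and are not the study's values.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical dose (concentration) and response (relative metabolic activity) data.
      dose = np.array([0.5, 1, 2, 4, 8, 16, 32, 64.0])
      resp = np.array([0.98, 0.97, 0.93, 0.85, 0.60, 0.32, 0.12, 0.05])

      def logistic3(d, top, ec50, slope):
          """Three-parameter logistic curve: activity falls from `top` towards 0 around ec50."""
          return top / (1.0 + (d / ec50) ** slope)

      def log_post(theta, sigma=0.05):
          top, ec50, slope = theta
          if top <= 0 or ec50 <= 0 or slope <= 0:
              return -np.inf                      # flat priors restricted to positive values
          resid = resp - logistic3(dose, top, ec50, slope)
          return -0.5 * np.sum((resid / sigma) ** 2)

      # Random-walk Metropolis: a minimal stand-in for the full MCMC machinery.
      theta = np.array([1.0, 10.0, 1.0])
      lp = log_post(theta)
      samples = []
      for _ in range(20000):
          prop = theta + rng.normal(scale=[0.02, 0.5, 0.05])
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          samples.append(theta.copy())

      samples = np.array(samples[5000:])          # discard burn-in
      print("posterior mean (top, EC50, slope):", samples.mean(axis=0))
      print("posterior std  (top, EC50, slope):", samples.std(axis=0))

    A faithful reproduction would add a laboratory level (hierarchical priors on the three parameters) and posterior predictive checks, but the accept/reject core is the same.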

  9. Construction strategies and lifetime uncertainties for nuclear projects: A real option analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jain, Shashi, E-mail: s.jain@cwi.nl [TU Delft, Delft Institute of Applied Mathematics, Delft (Netherlands); Nuclear Research Group, Petten (Netherlands); Roelofs, Ferry, E-mail: roelofs@nrg.eu [Nuclear Research Group, Petten (Netherlands); Oosterlee, Cornelis W., E-mail: c.w.oosterlee@cwi.nl [CWI – Centrum Wiskunde and Informatica, Amsterdam (Netherlands); TU Delft, Delft Institute of Applied Mathematics, Delft (Netherlands)

    2013-12-15

    Highlights: • Real options can be used to value flexibility of modular reactors. • Value of NPPs increases with implementation of long term cost reductions. • Levels of uncertainties affect the choice between projects. -- Abstract: Small and medium sized reactors, SMRs (according to IAEA, ‘small’ are reactors with power less than 300 MWe, and ‘medium’ with power less than 700 MWe) are considered as an attractive option for investment in nuclear power plants. SMRs may benefit from flexibility of investment, reduced upfront expenditure, and easy integration with small sized grids. Large reactors on the other hand have been an attractive option due to economy of scale. In this paper we focus on the advantages of flexibility due to modular construction of SMRs. Using real option analysis (ROA) we help a utility determine the value of sequential modular SMRs. Numerical results under different considerations, like possibility of rare events, learning, uncertain lifetimes are reported for a single large unit and modular SMRs.

  10. Key Process Uncertainties in Soil Carbon Dynamics: Comparing Multiple Model Structures and Observational Meta-analysis

    Science.gov (United States)

    Sulman, B. N.; Moore, J.; Averill, C.; Abramoff, R. Z.; Bradford, M.; Classen, A. T.; Hartman, M. D.; Kivlin, S. N.; Luo, Y.; Mayes, M. A.; Morrison, E. W.; Riley, W. J.; Salazar, A.; Schimel, J.; Sridhar, B.; Tang, J.; Wang, G.; Wieder, W. R.

    2016-12-01

    Soil carbon (C) dynamics are crucial to understanding and predicting C cycle responses to global change and soil C modeling is a key tool for understanding these dynamics. While first order model structures have historically dominated this area, a recent proliferation of alternative model structures representing different assumptions about microbial activity and mineral protection is providing new opportunities to explore process uncertainties related to soil C dynamics. We conducted idealized simulations of soil C responses to warming and litter addition using models from five research groups that incorporated different sets of assumptions about processes governing soil C decomposition and stabilization. We conducted a meta-analysis of published warming and C addition experiments for comparison with simulations. Assumptions related to mineral protection and microbial dynamics drove strong differences among models. In response to C additions, some models predicted long-term C accumulation while others predicted transient increases that were counteracted by accelerating decomposition. In experimental manipulations, doubling litter addition did not change soil C stocks in studies spanning as long as two decades. This result agreed with simulations from models with strong microbial growth responses and limited mineral sorption capacity. In observations, warming initially drove soil C loss via increased CO2 production, but in some studies soil C rebounded and increased over decadal time scales. In contrast, all models predicted sustained C losses under warming. The disagreement with experimental results could be explained by physiological or community-level acclimation, or by warming-related changes in plant growth. In addition to the role of microbial activity, assumptions related to mineral sorption and protected C played a key role in driving long-term model responses. In general, simulations were similar in their initial responses to perturbations but diverged over

  11. Flood Damage Analysis: First Floor Elevation Uncertainty Resulting from LiDAR-Derived Digital Surface Models

    Directory of Open Access Journals (Sweden)

    José María Bodoque

    2016-07-01

    Full Text Available The use of high resolution ground-based light detection and ranging (LiDAR) datasets provides spatial density and vertical precision for obtaining highly accurate Digital Surface Models (DSMs). As a result, the reliability of flood damage analysis has improved significantly, owing to the increased accuracy of hydrodynamic models. In addition, considerable error reduction has been achieved in the estimation of first floor elevation, which is a critical parameter for determining structural and content damages in buildings. However, as with any discrete measurement technique, LiDAR data contain object space ambiguities, especially in urban areas where the presence of buildings and the floodplain gives rise to a highly complex landscape that is largely corrected by using ancillary information based on the addition of breaklines to a triangulated irregular network (TIN). The present study provides a methodological approach for assessing uncertainty regarding first floor elevation. This is based on: (i) generation of an urban TIN from LiDAR data with a density of 0.5 points·m−2, complemented with the river bathymetry obtained from a field survey with a density of 0.3 points·m−2. The TIN was subsequently improved by adding breaklines and was finally transformed to a raster with a spatial resolution of 2 m; (ii) implementation of a two-dimensional (2D) hydrodynamic model based on the 500-year flood return period. The high resolution DSM obtained in the previous step facilitated addressing the modelling, since it represented suitable urban features influencing hydraulics (e.g., streets and buildings); and (iii) determination of first floor elevation uncertainty within the 500-year flood zone by performing Monte Carlo simulations based on geostatistics and 1997 control elevation points in order to assess error. Deviations in first floor elevation (average: 0.56 m and standard deviation: 0.33 m) show that this parameter has to be neatly characterized in order
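
    As an illustrative aside (not from the record above): the reported first floor elevation deviations (average 0.56 m, standard deviation 0.33 m) are the kind of error model that a Monte Carlo damage analysis can propagate. The Python sketch below does this for a single building; the depth-damage curve, water level and nominal floor height are invented, and treating the 0.56 m average as a systematic offset is an assumption.

      import numpy as np

      rng = np.random.default_rng(42)

      ERR_MEAN, ERR_STD = 0.56, 0.33      # first floor elevation error (m), from the abstract
      water_level = 1.2                   # hypothetical 500-year water surface above ground (m)
      nominal_floor = 0.9                 # hypothetical LiDAR-derived first floor height (m)

      def damage_fraction(depth_above_floor):
          """Toy depth-damage curve: no damage below the floor, saturating 2 m above it."""
          return np.clip(depth_above_floor / 2.0, 0.0, 1.0)

      n = 100_000
      floor = nominal_floor + rng.normal(ERR_MEAN, ERR_STD, size=n)   # offset + scatter (assumed)
      dmg = damage_fraction(water_level - floor)

      print(f"mean damage fraction : {dmg.mean():.3f}")
      print("5-95% damage fraction:", np.percentile(dmg, [5, 95]))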

  12. Uncertainty analysis of a coupled ecosystem response model simulating greenhouse gas fluxes from a temperate grassland

    Science.gov (United States)

    Liebermann, Ralf; Kraft, Philipp; Houska, Tobias; Breuer, Lutz; Müller, Christoph; Kraus, David; Haas, Edwin; Klatt, Steffen

    2015-04-01

    Among anthropogenic greenhouse gas emissions, CO2 is the dominant driver of global climate change. Next to its direct impact on the radiation budget, it also affects the climate system by triggering feedback mechanisms in terrestrial ecosystems. Such mechanisms - like stimulated photosynthesis, increased root exudations and reduced stomatal transpiration - influence both the input and the turnover of carbon and nitrogen compounds in the soil. The stabilization and decomposition of these compounds determine how increasing CO2 concentrations change the terrestrial trace gas emissions, especially CO2, N2O and CH4. To assess the potential reaction of terrestrial greenhouse gas emissions to rising tropospheric CO2 concentration, we make use of a comprehensive ecosystem model integrating known processes and fluxes of the carbon-nitrogen cycle in soil, vegetation and water. We apply a state-of-the-art ecosystem model with measurements from a long term field experiment of CO2 enrichment. The model - a grassland realization of LandscapeDNDC - simulates soil chemistry coupled with plant physiology, microclimate and hydrology. The data - comprising biomass, greenhouse gas emissions, management practices and soil properties - have been obtained from a FACE (Free Air Carbon dioxide Enrichment) experiment running since 1997 on a temperate grassland in Giessen, Germany. Management and soil data, together with weather records, are used to drive the model, while cut biomass as well as CO2 and N2O emissions are used for calibration and validation. Starting with control data from installations without CO2 enhancement, we begin with a GLUE (Generalized Likelihood Uncertainty Estimation) assessment using Latin Hypercube to reduce the range of the model parameters. This is followed by a detailed sensitivity analysis, the application of DREAM-ZS for model calibration, and an estimation of the effect of input uncertainty on the simulation results. Since first results indicate problems with
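
    As an illustrative aside (not from the record above): the first step described here, Latin Hypercube sampling of the prior parameter ranges followed by a GLUE-style behavioural filter, can be sketched generically. The toy model, parameter bounds, synthetic observations and likelihood threshold below are placeholders, not the LandscapeDNDC setup.

      import numpy as np

      rng = np.random.default_rng(1)

      def latin_hypercube(n, bounds):
          """Latin Hypercube sample of n points within (low, high) bounds per parameter."""
          d = len(bounds)
          strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
          u = (strata + rng.uniform(size=(n, d))) / n
          lo = np.array([b[0] for b in bounds])
          hi = np.array([b[1] for b in bounds])
          return lo + u * (hi - lo)

      # Toy "ecosystem model": flux = p0 * exp(p1 * t); observations from known parameters.
      t = np.linspace(0, 1, 20)
      obs = 2.0 * np.exp(1.5 * t) + rng.normal(0, 0.1, t.size)

      params = latin_hypercube(5000, bounds=[(0.5, 5.0), (0.0, 3.0)])
      sims = params[:, [0]] * np.exp(params[:, [1]] * t)

      # GLUE: Nash-Sutcliffe-type likelihood; keep "behavioural" runs above a threshold.
      nse = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
      behavioural = params[nse > 0.7]
      print("retained p0 range:", behavioural[:, 0].min(), "-", behavioural[:, 0].max())
      print("retained p1 range:", behavioural[:, 1].min(), "-", behavioural[:, 1].max())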

  13. Identification and uncertainty analysis of a hydrological water quality model with varying input data information content

    Science.gov (United States)

    Jiang, Sanyuan; Jomaa, Seifeddine; Rode, Michael

    2013-04-01

    The rivers in central Germany are moderately to heavily polluted by nutrient inputs from point and diffuse sources. The objectives of this study are (i) to assess the new HYPE model (HYdrological Predictions for the Environment) for simulating runoff and inorganic nitrogen (IN) emissions at nested and spatially heterogeneous mesoscale catchments; (ii) to investigate the temporal and spatial variations of IN leaching and (iii) to investigate effects of calibration data on hydrological parameter identification. A multi-site and multi-objective calibration approach with help of Markov chain Monte Carlo (MCMC) was employed for parameter optimisation and uncertainty analysis. Results showed that parameters related to evapotranspiration were most sensitive in runoff simulation, while the nitrogen processes were mainly controlled by plant uptake and denitrification. Runoff was reproduced quite well for both calibration (1994-1999) and validation (1999-2004) periods (including the extreme dry year of 2003) at all three gauge stations, with a lowest Nash-Sutcliffe (NSE) of 0.86. The dynamics of soil moisture during extreme climatological events were well captured. Corresponding to spatial variability of hydrological regimes and land use, IN concentrations showed an increase in magnitude and a decrease in dynamics from upstream to downstream, reflecting the combined effects of increasing nutrient inputs and decreasing IN in-stream retention. The IN load was simulated well at monthly time intervals, with a lowest NSE of 0.69. Results revealed high IN emissions in winter and low values in summer; the area-weighted IN emission load decreased along the stream channel. Therefore, it is concluded that the IN emission is mainly controlled by runoff in this study catchment. From the preliminary result, we found that the 95% parameter confidence intervals of hydrological parameters decreased when IN concentration observations were included in hydrological parameter calibration. In
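
    As an illustrative aside (not from the record above): model skill in this record is reported as the Nash-Sutcliffe efficiency (NSE). A minimal reference implementation, with made-up observed and simulated monthly loads, is:

      import numpy as np

      def nash_sutcliffe(obs, sim):
          """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 is a perfect fit."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      # Hypothetical monthly inorganic nitrogen loads (t/month): observed vs. simulated.
      observed  = [32, 45, 60, 80, 55, 40, 30, 25, 28, 35, 50, 65]
      simulated = [30, 48, 58, 75, 60, 42, 28, 27, 30, 33, 52, 61]
      print(f"NSE = {nash_sutcliffe(observed, simulated):.2f}")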

  14. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander

    2014-01-01

    the inter-subject and intra-subject automation in this situation are intended for subjects without gross pathology. In this work, we propose such an automated longitudinal intra-subject analysis (dubbed ALISA) approach, and assessed whether ALISA could preserve the same level of reliability as obtained...

  15. A Web Survey Analysis of Subjective Well-being

    NARCIS (Netherlands)

    Guzi, M.; de Pedraza García, P.

    2015-01-01

    Purpose - This paper explores the role of work conditions and job characteristics with respect to three subjective well-being indicators: life satisfaction, job satisfaction and satisfaction with work-life balance. From a methodological point of view, the paper shows how social sciences can benefit

  16. Performance, Pinned Down: A Lacanian Analysis of Subjectivity at Work

    NARCIS (Netherlands)

    C.M.W. Hoedemaekers (Casper)

    2008-01-01

    textabstractThis study seeks to create an account of how the performing subject comes into being within a specific organizational context. It looks at some of the ways in which managerial practices impact upon the selfhood of employees by means of the language in which they are couched. Drawing

  17. Analysis of Idiom Variation in the Framework of Linguistic Subjectivity

    Science.gov (United States)

    Liu, Zhengyuan

    2012-01-01

    Idiom variation is a ubiquitous linguistic phenomenon which has raised a lot of research questions. The past approach was either formal or functional. Both of them did not pay much attention to cognitive factors of language users. By putting idiom variation in the framework of linguistic subjectivity, we have offered a new perspective in the…

  18. Pressure transient analysis of a horizontal well subject to four ...

    African Journals Online (AJOL)

    Reservoir characterization is essential for effective reservoir and wellbore management. But when a horizontal well is subject to constant-pressure external boundaries, the extent of reservoir characterization that is possible depends on the flow regimes that are encountered in a given flow time. In this paper dimensionless ...

  19. Analysis of Subjective Conceptualizations Towards Collective Conceptual Modelling

    DEFF Research Database (Denmark)

    Glückstad, Fumiko Kano; Herlau, Tue; Schmidt, Mikkel Nørgaard

    2013-01-01

    This work is conducted as a preliminary study for a project where individuals' conceptualizations of domain knowledge will thoroughly be analyzed across 150 subjects from 6 countries. The project aims at investigating how humans' conceptualizations differ according to different types of mother la...

  20. Analysis of Subjective Conceptualizations Towards Collective Conceptual Modelling

    DEFF Research Database (Denmark)

    Kano Glückstad, Fumiko; Herlau, Tue; Schmidt, Mikkel N.

    2013-01-01

    This work is conducted as a preliminary study for a project where individuals' conceptualizations of domain knowledge will thoroughly be analyzed across 150 subjects from 6 countries. The project aims at investigating how humans' conceptualizations differ according to different types of mother langua...

  1. AN ANALYSIS OF SUBJECT AGREEMENT ERRORS IN ENGLISH ...

    African Journals Online (AJOL)

    Windows User

    however, continuing prevalence of a wide range of errors in students' writing. ... were written before. In English, as in many other languages, one of the grammar rules is that the subjects and the verbs must agree both in number and in person. .... The incorrect sentences which were picked were the ones which had types of.

  2. Assessing and reporting uncertainties in dietary exposure analysis - Part II: Application of the uncertainty template to a practical example of exposure assessment.

    Science.gov (United States)

    Tennant, David; Bánáti, Diána; Kennedy, Marc; König, Jürgen; O'Mahony, Cian; Kettler, Susanne

    2017-11-01

    A previous publication described methods for assessing and reporting uncertainty in dietary exposure assessments. This follow-up publication uses a case study to develop proposals for representing and communicating uncertainty to risk managers. The food ingredient aspartame is used as the case study in a simple deterministic model (the EFSA FAIM template) and with more sophisticated probabilistic exposure assessment software (FACET). Parameter and model uncertainties are identified for each modelling approach and tabulated. The relative importance of each source of uncertainty is then evaluated using a semi-quantitative scale and the results expressed using two different forms of graphical summary. The value of this approach in expressing uncertainties in a manner that is relevant to the exposure assessment and useful to risk managers is then discussed. It was observed that the majority of uncertainties are often associated with data sources rather than the model itself. However, differences in modelling methods can have the greatest impact on uncertainties overall, particularly when the underlying data are the same. It was concluded that improved methods for communicating uncertainties to risk managers are the research area where future effort would best be placed. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Uncertainties in Classification System Conversion and an Analysis of Inconsistencies in Global Land Cover Products

    National Research Council Canada - National Science Library

    Miao Zhang; Mingguo Ma; Philippe De Maeyer; Alishir Kurban

    2017-01-01

    ... (vegetation, wetlands and others only), we studied spatial and areal inconsistencies in the three most recent multi-resource land cover products in a complex mountain-oasis-desert system and quantitatively discussed the uncertainties...

  4. NUCLEAR DATA UNCERTAINTY AND SENSITIVITY ANALYSIS WITH XSUSA FOR FUEL ASSEMBLY DEPLETION CALCULATIONS

    National Research Council Canada - National Science Library

    Zwermann, W; Aures, A; Gallner, L; Hannstein, V; Krzykacz-Hausmann, B; Velkov, K; Martinez, J.S

    2014-01-01

    Uncertainty and sensitivity analyses with respect to nuclear data are performed with depletion calculations for BWR and PWR fuel assemblies specified in the framework of the UAM-LWR Benchmark Phase II...

  5. Sensitivity and uncertainty analysis of reactivities for UO2 and MOX fueled PWR cells

    Energy Technology Data Exchange (ETDEWEB)

    Foad, Basma [Research Institute of Nuclear Engineering, University of Fukui, Kanawa-cho 1-2-4, Tsuruga-shi, Fukui-ken, 914-0055 (Japan); Egypt Nuclear and Radiological Regulatory Authority, 3 Ahmad El Zomar St., Nasr City, Cairo, 11787 (Egypt); Takeda, Toshikazu [Research Institute of Nuclear Engineering, University of Fukui, Kanawa-cho 1-2-4, Tsuruga-shi, Fukui-ken, 914-0055 (Japan)

    2015-12-31

    The purpose of this paper is to apply our improved method for calculating sensitivities and uncertainties of reactivity responses for UO2 and MOX fueled pressurized water reactor cells. The improved method has been used to calculate sensitivity coefficients relative to infinite dilution cross-sections, where the self-shielding effect is taken into account. Two types of reactivities are considered: Doppler reactivity and coolant void reactivity; for each type of reactivity, the sensitivities are calculated for small and large perturbations. The results have demonstrated that the reactivity responses have larger relative uncertainty than eigenvalue responses. In addition, the uncertainty of coolant void reactivity is much greater than that of Doppler reactivity, especially for large perturbations. The sensitivity coefficients and uncertainties of both reactivities were verified by comparing with SCALE code results using the ENDF/B-VII library, and good agreement has been found.

  6. Aboveground Forest Biomass Estimation with Landsat and LiDAR Data and Uncertainty Analysis of the Estimates

    Directory of Open Access Journals (Sweden)

    Dengsheng Lu

    2012-01-01

    Full Text Available The Landsat Thematic Mapper (TM) image has long been the dominant data source, and recently LiDAR has offered an important new structural data stream for forest biomass estimations. On the other hand, forest biomass uncertainty analysis research has only recently received sufficient attention due to the difficulty in collecting reference data. This paper provides a brief overview of current forest biomass estimation methods using both TM and LiDAR data. A case study is then presented that demonstrates the forest biomass estimation methods and uncertainty analysis. Results indicate that Landsat TM data can provide adequate biomass estimates for secondary succession but are not suitable for mature forest biomass estimates due to data saturation problems. LiDAR can overcome TM’s shortcoming, providing better biomass estimation performance, but has not been extensively applied in practice due to data availability constraints. The uncertainty analysis indicates that various sources affect the performance of forest biomass/carbon estimation. With that said, the clearly dominant sources of uncertainty are the variation in input sample plot data and the data saturation problem related to optical sensors. A possible solution to increasing the confidence in forest biomass estimates is to integrate the strengths of multisensor data.
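
    As an illustrative aside (not from the record above): the conclusion that variation in the input sample plots dominates the uncertainty can be made concrete by bootstrapping the plot data under a biomass-versus-LiDAR-metric regression. The plot values, the linear model form and the canopy-height predictor below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(11)

      # Hypothetical field plots: mean LiDAR canopy height (m) and measured biomass (Mg/ha).
      height  = np.array([5.1, 7.3, 9.8, 12.4, 15.0, 17.2, 19.5, 22.1, 24.8, 27.3])
      biomass = np.array([40., 65., 95., 120., 160., 185., 210., 250., 270., 300.])

      def fit_predict(h, b, h_new):
          """Ordinary least squares: biomass = intercept + slope * canopy height."""
          return np.polyval(np.polyfit(h, b, deg=1), h_new)

      # Bootstrap the plots to see how sample-plot variation propagates into the estimate.
      h_new, boot = 18.0, []
      for _ in range(2000):
          idx = rng.integers(0, len(height), size=len(height))
          boot.append(fit_predict(height[idx], biomass[idx], h_new))
      boot = np.array(boot)

      print(f"biomass at {h_new} m canopy height: {boot.mean():.1f} Mg/ha")
      print("bootstrap 90% interval:", np.percentile(boot, [5, 95]))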

  7. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.

    2000-02-28

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases.
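
    As an illustrative aside (not from the record above): one element of the outlined methodology, updating a generic (national or regional) parameter distribution with a few site-specific measurements, can be illustrated with a conjugate normal Bayesian update. The prior moments, site data and measurement scatter below are hypothetical.

      import numpy as np

      # Generic prior for log10 hydraulic conductivity (hypothetical regional database).
      prior_mean, prior_sd = -5.0, 1.0            # log10(K [m/s])

      # Hypothetical site-specific measurements with known measurement scatter.
      site_obs = np.array([-4.2, -4.6, -4.4])
      obs_sd = 0.5

      # Conjugate normal-normal update of the mean (measurement variance assumed known).
      n = site_obs.size
      post_var = 1.0 / (1.0 / prior_sd**2 + n / obs_sd**2)
      post_mean = post_var * (prior_mean / prior_sd**2 + site_obs.sum() / obs_sd**2)

      print(f"posterior mean log10(K): {post_mean:.2f}")
      print(f"posterior sd   log10(K): {np.sqrt(post_var):.2f}")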

  8. Examination of Experimental Designs and Response Surface Methods for Uncertainty Analysis of Production Forecast: A Niger Delta Case Study

    Directory of Open Access Journals (Sweden)

    Akeem O. Arinkoola

    2015-01-01

    Full Text Available The purpose of this paper is to examine various DoE methods for uncertainty quantification of production forecasts during reservoir management. Considering all uncertainties for analysis can be time consuming and expensive. Uncertainty screening using experimental design methods helps reduce the number of parameters to manageable sizes. However, the adoption of various methods is more often based on experimenter discretion or company practices. This is mostly done with little or no attention being paid to the risks associated with decisions that emanated from that exercise. The consequence is the underperformance of the project when compared with the actual value of the project. This study presents the analysis of the three families of designs used for screening and four DoE methods used for response surface modeling during uncertainty analysis. The screening methods (sensitivity by one factor at-a-time, fractional experiment, and Plackett-Burman design) were critically examined and analyzed using numerical flow simulation. The modeling methods (Box-Behnken, central composite, D-optimal, and full factorial) were programmed and analyzed for their capability to reproduce actual forecast figures. The best method was selected for the case study and recommendations were made as to the best practice in selecting various DoE methods for similar applications.
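
    As an illustrative aside (not from the record above): once a design (Box-Behnken, central composite, or other) has been run, the response surface is ordinarily a quadratic polynomial fitted by least squares. The sketch below fits a two-factor quadratic surface to hypothetical simulator output; the coded design points and responses are invented.

      import numpy as np

      # Hypothetical coded design points for two uncertain factors and simulated recovery (%).
      x1 = np.array([-1, 1, -1, 1, 0, 0, -1, 1, 0])
      x2 = np.array([-1, -1, 1, 1, -1, 1, 0, 0, 0])
      y  = np.array([52.0, 60.0, 55.0, 68.0, 54.0, 61.0, 53.0, 63.0, 58.0])

      # Quadratic response surface: y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
      X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)

      def surface(a, b):
          return coef @ np.array([1.0, a, b, a * b, a**2, b**2])

      print("fitted coefficients:", np.round(coef, 2))
      print("predicted response at (0.5, -0.5):", round(surface(0.5, -0.5), 2))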

  9. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state-of-the-art and improve on the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project has been carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the Lawrence Livermore National Laboratory and the EPRI landmark PSHA studies of the 1980s, and examined ways to improve on the present state-of-the-art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation on state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and different levels of effort needed in a given study.

  10. Comparison between different uncertainty propagation methods in multivariate analysis: An application in the bivariate case

    Energy Technology Data Exchange (ETDEWEB)

    Mullor, R. [Dpto. Estadistica e Investigacion Operativa, Universidad Alicante (Spain); Sanchez, A., E-mail: aisanche@eio.upv.e [Dpto. Estadistica e Investigacion Operativa Aplicadas y Calidad, Universidad Politecnica Valencia, Camino de Vera s/n 46022 (Spain); Martorell, S. [Dpto. Ingenieria Quimica y Nuclear, Universidad Politecnica Valencia (Spain); Martinez-Alzamora, N. [Dpto. Estadistica e Investigacion Operativa Aplicadas y Calidad, Universidad Politecnica Valencia, Camino de Vera s/n 46022 (Spain)

    2011-06-15

    Safety related systems performance optimization is classically based on quantifying the effects that testing and maintenance activities have on reliability and cost (R+C). However, R+C quantification is often incomplete in the sense that important uncertainties may not be considered. An important number of studies have been published in the last decade in the field of R+C based optimization considering uncertainties. They have demonstrated that inclusion of uncertainties in the optimization brings the decision maker insights concerning how uncertain the R+C results are and how this uncertainty does matter as it can result in differences in the outcome of the decision making process. Several methods of uncertainty propagation based on the theory of tolerance regions have been proposed in the literature depending on the particular characteristics of the variables in the output and their relations. In this context, the objective of this paper focuses on the application of non-parametric and parametric methods to analyze uncertainty propagation, which will be implemented on a multi-objective optimization problem where reliability and cost act as decision criteria and maintenance intervals act as decision variables. Finally, a comparison of results of these applications and the conclusions obtained are presented.

  11. A novel approach to parameter uncertainty analysis of hydrological models using neural networks

    Directory of Open Access Journals (Sweden)

    D. P. Solomatine

    2009-07-01

    Full Text Available In this study, a methodology has been developed to emulate a time-consuming Monte Carlo (MC) simulation by using an Artificial Neural Network (ANN) for the assessment of model parametric uncertainty. First, MC simulation of a given process model is run. Then an ANN is trained to approximate the functional relationships between the input variables of the process model and the synthetic uncertainty descriptors estimated from the MC realizations. The trained ANN model encapsulates the underlying characteristics of the parameter uncertainty and can be used to predict uncertainty descriptors for new data vectors. This approach was validated by comparing the uncertainty descriptors in the verification data set with those obtained by the MC simulation. The method is applied to estimate the parameter uncertainty of a lumped conceptual hydrological model, HBV, for the Brue catchment in the United Kingdom. The results are quite promising as the prediction intervals estimated by the ANN are reasonably accurate. The proposed techniques could be useful in real-time applications when it is not practicable to run a large number of simulations for complex hydrological models and when the forecast lead time is very short.
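
    As an illustrative aside (not from the record above): the two-step idea of this record, run a Monte Carlo analysis once and then train a neural network to map inputs to the resulting uncertainty descriptors, can be sketched with scikit-learn. The toy process model, the choice of the 90% prediction-interval width as the descriptor, and the network size are assumptions, not the published setup.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(7)

      # Toy process model: output depends on an input x and an uncertain parameter theta.
      def process_model(x, theta):
          return theta * np.sqrt(x) + 0.5 * x

      # Step 1: Monte Carlo over the uncertain parameter for many input values.
      x_train = rng.uniform(0.1, 10.0, size=500)
      theta_mc = rng.normal(1.0, 0.2, size=(500, 1000))        # 1000 MC samples per input
      mc_out = process_model(x_train[:, None], theta_mc)

      # Uncertainty descriptor: width of the central 90% prediction interval.
      width = np.percentile(mc_out, 95, axis=1) - np.percentile(mc_out, 5, axis=1)

      # Step 2: emulate the MC-derived descriptor with a small neural network.
      ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
      ann.fit(x_train.reshape(-1, 1), width)

      x_new = np.array([[2.0], [8.0]])
      print("emulated 90% interval widths:", np.round(ann.predict(x_new), 3))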

  12. Uncertainty analysis of Lead cross sections on reactor safety for ELECTRA

    Science.gov (United States)

    Alhassan, Erwin; Sjöstrand, Henrik; Duan, Junfeng; Gustavsson, Cecilia; Pomp, Stephan; Österlund, Michael; Rochman, Dimitri; Koning, Arjan J.

    2014-06-01

    The Total Monte Carlo (TMC) method was used in this study to assess the impact of Pb-206, 207 and 208 nuclear data uncertainties on keff, βeff, the coolant temperature coefficient, and the coolant void worth for the ELECTRA reactor. Relatively large uncertainties were observed in keff and the coolant void worth for all the isotopes, with a significant contribution coming from Pb-208 nuclear data. The large Pb-208 nuclear data uncertainty observed was further investigated by studying the impact of partial channels on keff and βeff. Various sections of the ENDF file (elastic scattering (n, el), inelastic scattering (n, inl), neutron capture (n, γ), (n, 2n), resonance parameters and the angular distribution) were varied randomly, and distributions of keff and βeff were obtained. The dominant contributions to the uncertainty in keff from Pb-208 came from uncertainties in the resonance parameters; however, the elastic scattering cross section and the angular distribution also had a significant impact. The impact of nuclear data uncertainties on βeff was observed to be small.

  13. An Example Uncertainty and Sensitivity Analysis for Reactive Transport at the Horonobe Site for Performance Assessment Calculations.

    Energy Technology Data Exchange (ETDEWEB)

    James, Scott; Cohan, Alexander [Sandia National Laboratories, Albuquerque, NM

    2005-08-01

    Given pre-existing Groundwater Modeling System (GMS) models of the Horonobe Underground Research Laboratory (URL) at both the regional and site scales, this work performs an example uncertainty analysis for performance assessment (PA) applications. After a general overview of uncertainty and sensitivity analysis techniques, the existing GMS site-scale model is converted to a PA model of the steady-state conditions expected after URL closure. This is done to examine the impact of uncertainty in site-specific data in conjunction with conceptual model uncertainty regarding the location of the Oomagari Fault. A heterogeneous stochastic model is developed and corresponding flow fields and particle tracks are calculated. In addition, a quantitative analysis of the ratio of dispersive to advective forces, the F-ratio, is performed for stochastic realizations of each conceptual model. Finally, a one-dimensional transport abstraction is modeled based on the particle path lengths and the materials through which each particle passes to yield breakthrough curves at the model boundary. All analyses indicate that accurate characterization of the Oomagari Fault with respect to both location and hydraulic conductivity is critical to PA calculations. This work defines and outlines typical uncertainty and sensitivity analysis procedures and demonstrates them with example PA calculations relevant to the Horonobe URL. Acknowledgement: This project was funded by Japan Nuclear Cycle Development Institute (JNC). This work was conducted jointly between Sandia National Laboratories (SNL) and JNC under a joint JNC/U.S. Department of Energy (DOE) work agreement. Performance assessment calculations were conducted and analyzed at SNL based on a preliminary model by Kashima, Quintessa, and JNC and include significant input from JNC to make sure the results are relevant for the Japanese nuclear waste program.

  14. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data, and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  15. Performatising the knower: On semiotic analysis of subject and knowledge

    Directory of Open Access Journals (Sweden)

    Artuković Kristina

    2013-01-01

    Full Text Available This paper considers epistemological implications of the concept of performative, starting from the elaborate conception provided by Judith Butler’s theories. The primary postulate of this work is that various interpretations of the performative, with their semiotic shifting from the notions of truth-evaluability and the descriptive nature of meaning, form a line of abandoning traditional epistemological distinction between subject and object. Through other semiotic concepts which will be presented and analysed, this line reveals the key epistemological issues in the light of semiology, while Judith Butler’s concept of performativity is viewed as a possible outcome of this course of semiology of knowledge, resulting in final transcending of the category of subject.

  16. Dynamic Analysis of an Inflatable Dam Subjected to a Flood

    OpenAIRE

    Lowery, Kristen Mary

    1997-01-01

    A dynamic simulation of the response of an inflatable dam subjected to a flood was carried out to determine the survivability envelope of the dam where it can operate without rupture, or overflow. A fully nonlinear free-surface flow was applied in two dimensions using a mixed Eulerian-Lagrangian formulation. An ABAQUS finite element model was used to determine the dynamic structural response of the dam. The problem was solved in the time domain which allows the prediction of a number ...

  17. Validation and quantification of uncertainty in coupled climate models using network analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bracco, Annalisa [Georgia Inst. of Technology, Atlanta, GA (United States)

    2015-08-10

    We developed a fast, robust and scalable methodology to examine, quantify, and visualize climate patterns and their relationships. It is based on a set of notions, algorithms and metrics used in the study of graphs, referred to as complex network analysis. This approach can be applied to explain known climate phenomena in terms of an underlying network structure and to uncover regional and global linkages in the climate system, while comparing general circulation model outputs with observations. The proposed method is based on a two-layer network representation, and is substantially new within the available network methodologies developed for climate studies. At the first layer, gridded climate data are used to identify "areas", i.e., geographical regions that are highly homogeneous in terms of the given climate variable. At the second layer, the identified areas are interconnected with links of varying strength, forming a global climate network. The robustness of the method (i.e. the ability to separate topologically distinct fields while correctly identifying similarities) has been extensively tested. It has been proved that it provides a reliable, fast framework for comparing and ranking the ability of climate models to reproduce observed climate patterns and their connectivity. We further developed the methodology to account for lags in the connectivity between climate patterns and refined our area identification algorithm to account for autocorrelation in the data. The new methodology based on complex network analysis has been applied to state-of-the-art climate model simulations that participated in the latest IPCC (Intergovernmental Panel on Climate Change) assessment to verify their performance, quantify uncertainties, and uncover changes in global linkages between past and future projections. Network properties of modeled sea surface temperature and rainfall over 1956–2005 have been constrained towards observations or reanalysis data sets

  18. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model (including Lommel-Seelinger, ROLO, McEwen, Minnaert and Akimov) of Bennu with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present the analysis of the error for the photometric corrections. Based on our testing data sets, we find: (1) the model uncertainties are only correct when the covariance matrix is used in the calculation, because the parameters are highly correlated; (2) there is no evidence that any single parameter dominates in each model; (3) both the model error and the data error contribute comparably to the final correction error; (4) we tested the uncertainty module on synthetic and real data sets and find that model performance depends on data coverage and data quality; these tests gave us a better understanding of how the different models behave in different cases; (5) the L-S model is more reliable than the others, perhaps because the simulated data are based on the L-S model; however, the test on real data (SPDIF) also shows a slight advantage for L-S; ROLO is not reliable for calculating Bond albedo, the uncertainty of the McEwen model is large in most cases, and Akimov behaves unphysically on SOPIE 1 data; (6) L-S is therefore the better default choice, a conclusion based mainly on our tests on SOPIE data and IPDIF.
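
    As an illustrative aside (not from the record above): finding (1), that parameter uncertainties are only meaningful when the full covariance matrix is used, corresponds to first-order error propagation, var(f) = J C J^T, where dropping the off-diagonal terms of C mis-states the prediction uncertainty for correlated parameters. A generic numpy sketch with an invented two-parameter phase curve and covariance is:

      import numpy as np

      # Hypothetical photometric model: reflectance r(alpha) = A * exp(-b * alpha);
      # the Jacobian below is its derivative with respect to (A, b).
      A, b = 0.045, 0.02                        # fitted parameters (invented values)
      cov = np.array([[1.0e-6, -4.0e-7],        # invented parameter covariance from the fit;
                      [-4.0e-7, 3.0e-7]])       # strongly correlated, as finding (1) warns

      alpha = np.linspace(0.0, 60.0, 7)         # phase angles (degrees)

      J = np.column_stack([np.exp(-b * alpha),                  # dr/dA
                           -A * alpha * np.exp(-b * alpha)])    # dr/db

      var_full = np.einsum("ij,jk,ik->i", J, cov, J)            # J C J^T, with correlations
      var_diag = np.einsum("ij,jk,ik->i", J, np.diag(np.diag(cov)), J)  # correlations ignored

      print("1-sigma with covariance:", np.sqrt(var_full))
      print("1-sigma diagonal only  :", np.sqrt(var_diag))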

  19. Using CV-GLUE procedure in analysis of wetland model predictive uncertainty.

    Science.gov (United States)

    Huang, Chun-Wei; Lin, Yu-Pin; Chiang, Li-Chi; Wang, Yung-Chieh

    2014-07-01

    This study develops a procedure that is related to Generalized Likelihood Uncertainty Estimation (GLUE), called the CV-GLUE procedure, for assessing the predictive uncertainty that is associated with different model structures with varying degrees of complexity. The proposed procedure comprises model calibration, validation, and predictive uncertainty estimation in terms of a characteristic coefficient of variation (characteristic CV). The procedure first performed two-stage Monte-Carlo simulations to ensure predictive accuracy by obtaining behavior parameter sets, and then the estimation of CV-values of the model outcomes, which represent the predictive uncertainties for a model structure of interest with its associated behavior parameter sets. Three commonly used wetland models (the first-order K-C model, the plug flow with dispersion model, and the Wetland Water Quality Model; WWQM) were compared based on data that were collected from a free water surface constructed wetland with paddy cultivation in Taipei, Taiwan. The results show that the first-order K-C model, which is simpler than the other two models, has greater predictive uncertainty. This finding shows that predictive uncertainty does not necessarily increase with the complexity of the model structure because in this case, the more simplistic representation (first-order K-C model) of reality results in a higher uncertainty in the prediction made by the model. The CV-GLUE procedure is suggested to be a useful tool not only for designing constructed wetlands but also for other aspects of environmental management. Copyright © 2014 Elsevier Ltd. All rights reserved.
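
    As an illustrative aside (not from the record above): the characteristic coefficient of variation at the core of CV-GLUE can be computed directly from the ensemble of predictions made by the behavioural parameter sets. The sketch below uses a fabricated prediction matrix, and aggregating the per-date CVs by a simple mean is an assumption rather than the authors' exact definition.

      import numpy as np

      rng = np.random.default_rng(3)

      # Fabricated example: 200 behavioural parameter sets, each predicting effluent
      # concentrations (mg/L) at 12 monitoring dates.
      predictions = rng.lognormal(mean=1.0, sigma=0.3, size=(200, 12))

      # CV of the ensemble at each date, then one characteristic value per model structure.
      cv_per_date = predictions.std(axis=0, ddof=1) / predictions.mean(axis=0)
      characteristic_cv = cv_per_date.mean()     # aggregation choice is an assumption

      print("CV per date      :", np.round(cv_per_date, 3))
      print("characteristic CV:", round(characteristic_cv, 3))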

  20. Report of the Racism and Sexism in Subject Analysis Subcommittee to the RTSD/CCS Subject Analysis Committee, Midwinter 1980.

    Science.gov (United States)

    Dickinson, Elizabeth M.; And Others

    Directed toward the eradication of sexual and racial bias in bibliographic systems, the subcommittee reports its progress in the identification of areas of classification systems and subject headings requiring change. A policy statement and six guidelines establish a framework for three categories of projects: (1) the need for changes in Library…

  1. Maximum Entropy-Copula Method for Hydrological Risk Analysis under Uncertainty: A Case Study on the Loess Plateau, China

    Directory of Open Access Journals (Sweden)

    Aijun Guo

    2017-11-01

    Full Text Available Copula functions have been extensively used to describe the joint behaviors of extreme hydrological events and to analyze hydrological risk. Advanced marginal distribution inference, for example, the maximum entropy theory, is particularly beneficial for improving the performance of the copulas. The goal of this paper, therefore, is twofold: first, to develop a coupled maximum entropy-copula method for hydrological risk analysis through deriving the bivariate return periods, risk, reliability and bivariate design events; and second, to reveal the impact of marginal distribution selection uncertainty and sampling uncertainty on bivariate design event identification. In particular, the uncertainties involved in the second goal have not yet received significant consideration. The designed framework for hydrological risk analysis related to flood and extreme precipitation events is exemplarily applied in two catchments of the Loess Plateau, China. Results show that (1) the distribution derived by the maximum entropy principle outperforms the conventional distributions for the probabilistic modeling of flood and extreme precipitation events; (2) the bivariate return periods, risk, reliability and bivariate design events can be derived using the coupled entropy-copula method; (3) uncertainty analysis highlights the fact that appropriate performance of the marginal distribution is closely related to bivariate design event identification. Most importantly, sampling uncertainty causes the confidence regions of bivariate design events with return periods of 30 years to be very large, overlapping with the values of flood and extreme precipitation, which have return periods of 10 and 50 years, respectively. The large confidence regions of bivariate design events greatly challenge their application in practical engineering design.
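
    As an illustrative aside (not from the record above): once the marginals and the copula are fixed, the joint "AND" return period follows directly from T_and = mu / (1 - u - v + C(u, v)), with mu the mean interarrival time and u, v the marginal non-exceedance probabilities. The sketch below uses a Gumbel-Hougaard copula; the dependence parameter and design quantiles are invented, and the paper's entropy-based marginals are not reproduced here.

      import numpy as np

      def gumbel_copula(u, v, theta):
          """Gumbel-Hougaard copula C(u, v), valid for theta >= 1."""
          return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

      def and_return_period(u, v, theta, mu=1.0):
          """Joint 'AND' return period: both variables exceed their design quantiles."""
          return mu / (1.0 - u - v + gumbel_copula(u, v, theta))

      # Invented example: flood peak and extreme precipitation, each at its 30-year level.
      u = v = 1.0 - 1.0 / 30.0
      theta = 2.0                                # dependence parameter, assumed fitted elsewhere
      print(f"AND return period: {and_return_period(u, v, theta):.1f} years")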

  2. Geostatistical upscaling of rain gauge data to support uncertainty analysis of lumped urban hydrological models

    Science.gov (United States)

    Muthusamy, Manoranjan; Schellart, Alma; Tait, Simon; Heuvelink, Gerard B. M.

    2017-04-01

    Geostatistical methods have been used to analyse the spatial correlation structure of rainfall at various spatial scales, but their application to estimating the level of uncertainty in rainfall upscaling has not been fully explored, mainly due to the inherent complexity and demanding data requirements. In this study we present a method to overcome these challenges and predict AARI together with its associated uncertainty using geostatistical upscaling. Rainfall data collected from a cluster of eight paired rain gauges in a 400 × 200 sq. m. urban catchment are used in combination with spatial stochastic simulation to obtain optimal predictions of the spatially averaged rainfall intensity at any point in time within the urban catchment. The uncertainty in the prediction of catchment average rainfall intensity is obtained for multiple combinations of intensity ranges and temporal averaging intervals. The two main challenges addressed in this study are the scarcity of rainfall measurement locations and the non-normality of rainfall data, both of which need to be considered when adopting a geostatistical approach. Scarcity of measurement points is dealt with by pooling sample variograms of repeated rainfall measurements with similar characteristics. Normality of rainfall data is achieved through the use of Normal Score Transformation. Geostatistical models in the form of variograms are derived for the transformed rainfall intensity. Next, spatial stochastic simulation, which is robust to nonlinear data transformation, is applied to produce realisations of rainfall fields. These realisations in transformed space are first back-transformed and then spatially aggregated to derive a random sample of the spatially averaged rainfall intensity. This study shows that for small time and space scales the use of a single geostatistical model based on a single variogram is not appropriate and a distinction between rainfall intensity classes and length of temporal averaging intervals should be made
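
    As an illustrative aside (not from the record above): the Normal Score Transformation mentioned here maps the empirical ranks of the rainfall data onto standard normal quantiles, so that variogram fitting and simulation can be done in Gaussian space and the results back-transformed afterwards. A compact Python sketch with invented intensities is:

      import numpy as np
      from scipy.stats import norm, rankdata

      def normal_score_transform(values):
          """Map data to standard normal quantiles via their (tie-averaged) ranks."""
          p = rankdata(values) / (len(values) + 1)     # plotting positions in (0, 1)
          return norm.ppf(p)

      def back_transform(scores, values):
          """Map normal scores back by interpolating the empirical quantile function."""
          srt = np.sort(values)
          p_ref = np.arange(1, len(srt) + 1) / (len(srt) + 1)
          return np.interp(norm.cdf(scores), p_ref, srt)

      # Invented 5-minute rainfall intensities (mm/h) from one gauge.
      intensity = np.array([0.2, 0.5, 0.8, 1.1, 2.3, 3.0, 4.8, 7.5, 12.0, 20.0])
      z = normal_score_transform(intensity)
      print("normal scores :", np.round(z, 2))
      print("back-transform:", np.round(back_transform(z, intensity), 2))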

  3. Climate change - An uncertainty factor in risk analysis of contaminated land.

    Science.gov (United States)

    Augustsson, Anna; Filipsson, Monika; Oberg, Tomas; Bergbäck, Bo

    2011-10-15

    Metals frequently occur at contaminated sites, where their potential toxicity and persistence require risk assessments that consider possible long-term changes. Changes in climate are likely to affect the speciation, mobility, and risks associated with metals. This paper provides an example of how climate effects can be incorporated into a commonly used exposure model, and how the exposure then changes compared to present conditions. The comparison was made for cadmium (Cd) exposure to 4-year-old children at a highly contaminated iron and steel works site in southeastern Sweden. Both deterministic and probabilistic approaches (through probability bounds analysis, PBA) were used in the exposure assessment. Potential climate-sensitive variables were determined by a literature review. Although only six of the total 39 model variables were assumed to be sensitive to a change in climate (groundwater infiltration, hydraulic conductivity, soil moisture, soil:water distribution, and two bioconcentration factors), the total exposure was clearly affected. For example, by altering the climate-sensitive variables on the order of 15% to 20%, the